
EmoTune AI - Emotionally Expressive Music Generator

Powered by Vipin

EmoTune AI is a deep learning project that generates emotionally expressive music from user-specified emotions. By combining Transformer architectures with Generative Adversarial Networks (GANs), the system produces realistic, human-like melodies tailored to moods such as happy, sad, angry, and relaxed.


🚀 Features

  • 🎶 Emotion-Based Music Generation: Input a target emotion and generate music reflecting that mood.
  • 🤖 Transformer + GAN Hybrid Architecture: Leverages the sequence modeling power of Transformers with the creative synthesis of GANs.
  • 🧠 Human-Like Melodies: Produces high-quality, expressive compositions in MIDI or audio format.
  • 🎛️ User Control Panel: Easily customize emotion intensity, tempo, genre influences, and more.
  • 📊 Training on Emotion-Tagged Datasets: Utilizes publicly available emotion-labeled music datasets for training and evaluation.

🧠 Model Architecture

  • Emotion Encoder: A Transformer-based module that encodes the emotion input into a latent representation.
  • GAN Generator: Takes emotion vectors and random noise to generate expressive musical sequences.
  • Discriminator: Evaluates whether the music is realistic and aligned with the given emotion.
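The conditioning flow above can be sketched with framework-agnostic NumPy stand-ins. The table lookup, linear layer, dimensions, and weights below are purely illustrative placeholders for the repo's actual Transformer encoder and GAN generator:

```python
import numpy as np

rng = np.random.default_rng(0)

EMOTIONS = ["happy", "sad", "angry", "relaxed"]
EMB_DIM, NOISE_DIM, SEQ_LEN = 16, 32, 64

# Emotion encoder (stand-in): map an emotion label to a latent vector.
emotion_table = rng.normal(size=(len(EMOTIONS), EMB_DIM))

def encode_emotion(name: str) -> np.ndarray:
    return emotion_table[EMOTIONS.index(name)]

# Generator (stand-in): emotion latent + random noise -> pitch sequence.
W_gen = rng.normal(size=(EMB_DIM + NOISE_DIM, SEQ_LEN))

def generate(name: str) -> np.ndarray:
    z = rng.normal(size=NOISE_DIM)               # random noise input
    cond = np.concatenate([encode_emotion(name), z])
    logits = cond @ W_gen
    return (127 / (1 + np.exp(-logits))).astype(int)  # squash into MIDI pitch range

melody = generate("happy")
print(melody.shape)
```

The key design point is that the generator never sees the raw emotion label, only its latent encoding concatenated with noise, so the same emotion can yield many different melodies.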

📦 Installation

git clone https://github.com/your-username/EmoTune-AI.git
cd EmoTune-AI
pip install -r requirements.txt

🛠️ Usage

from models import generate_melody, save_to_midi

emotion = "joyful"
melody = generate_melody(emotion=emotion, tempo=120)
save_to_midi(melody, filename="joyful_tune.mid")

Or via Web Interface (Streamlit):

streamlit run app/ui_interface.py

📊 Dataset

We use emotion-labeled MIDI datasets such as:


🧪 Evaluation

  • Emotion classification accuracy of generated samples
  • Music quality metrics (tonal stability, pitch range, rhythm variance)
  • Human evaluation via listening sessions
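The quality metrics above can be computed directly from a note sequence. A toy sketch, where the melody, the C-major proxy for tonal stability, and the metric definitions are illustrative rather than the project's actual evaluation code:

```python
import statistics

# Toy melody: (MIDI pitch, duration in beats) pairs.
notes = [(60, 0.5), (64, 0.5), (67, 1.0), (66, 0.25), (62, 0.75), (60, 1.0)]

pitches = [p for p, _ in notes]
durations = [d for _, d in notes]

pitch_range = max(pitches) - min(pitches)          # semitones spanned
rhythm_variance = statistics.pvariance(durations)  # spread of note lengths

# Tonal stability (rough proxy): fraction of pitches in the C-major scale.
c_major = {0, 2, 4, 5, 7, 9, 11}
tonal_stability = sum((p % 12) in c_major for p in pitches) / len(pitches)

print(pitch_range, rhythm_variance, tonal_stability)
```

Emotion classification accuracy would additionally require a separately trained emotion classifier to score the generated samples.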

🔍 Future Enhancements

  • Support for multi-instrument generation
  • Real-time generation with emotion sliders
  • Integration of lyrics for deeper emotional context
  • Cloud-based API and mobile app deployment

🎧 EmoTune AI – Where Deep Learning Meets Emotion in Music ✨ Powered by Vipin
