Emotion-conditioned Music Generation Based on Reinforcement Learning-guided Pre-trained Model
Generating music with specific emotions is an important subtask of controllable music generation. Previous supervised learning methods rely on emotion-annotated music datasets and suffer from a mismatch between the training objective and the model's optimization goal. In this paper, we propose a reinforcement learning-guided approach for emotion-conditioned music generation, in which a pre-trained symbolic music emotion classification model scores the generated music, and these scores serve as feedback for optimizing a GPT-2-based autoregressive music generation model. This method overcomes the reliance on annotated datasets and enables training emotion-conditioned generation models on unlabeled symbolic music datasets of similar musical genres and data types. Objective and subjective evaluation results demonstrate that the proposed method generates high-quality music matching the specified emotions.
music generation, pre-trained model, reinforcement learning, music emotion
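The abstract describes fine-tuning an autoregressive generator with a classifier-based reward. The following is a minimal, self-contained sketch of that idea, not the authors' implementation: a toy autoregressive model (standing in for the GPT-2-based generator) is updated with a REINFORCE-style policy gradient, where a frozen emotion classifier's probability for the target emotion is used as the reward. All module names, sizes, and the toy event vocabulary are illustrative assumptions.

```python
# Sketch of classifier-rewarded RL fine-tuning for emotion-conditioned generation.
# Hypothetical components; the paper's actual models and RL algorithm may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, EMOTIONS, SEQ_LEN = 128, 4, 64  # toy symbolic-music event vocabulary, 4 emotion classes

class ToyGenerator(nn.Module):
    """Stand-in for the GPT-2-based autoregressive music generation model."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB + EMOTIONS, 64)  # emotion condition tokens share the table
        self.rnn = nn.GRU(64, 128, batch_first=True)
        self.head = nn.Linear(128, VOCAB)

    def forward(self, tokens, hidden=None):
        out, hidden = self.rnn(self.embed(tokens), hidden)
        return self.head(out), hidden

class ToyEmotionClassifier(nn.Module):
    """Stand-in for the pre-trained symbolic music emotion classifier (kept frozen)."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, 64)
        self.rnn = nn.GRU(64, 128, batch_first=True)
        self.head = nn.Linear(128, EMOTIONS)

    def forward(self, tokens):
        _, h = self.rnn(self.embed(tokens))
        return self.head(h.squeeze(0))  # (batch, EMOTIONS) emotion logits

generator, classifier = ToyGenerator(), ToyEmotionClassifier()
classifier.eval()                       # the reward model is pre-trained and stays fixed
opt = torch.optim.Adam(generator.parameters(), lr=1e-4)

def rl_step(target_emotion: int, batch_size: int = 8) -> float:
    # Condition on an emotion token, then sample a sequence autoregressively.
    tokens = torch.full((batch_size, 1), VOCAB + target_emotion, dtype=torch.long)
    log_probs, hidden = [], None
    for _ in range(SEQ_LEN):
        logits, hidden = generator(tokens[:, -1:], hidden)
        dist = torch.distributions.Categorical(logits=logits[:, -1])
        action = dist.sample()
        log_probs.append(dist.log_prob(action))
        tokens = torch.cat([tokens, action.unsqueeze(1)], dim=1)
    # Reward: classifier's probability of the requested emotion for the sampled piece.
    with torch.no_grad():
        reward = F.softmax(classifier(tokens[:, 1:]), dim=-1)[:, target_emotion]
    baseline = reward.mean()            # simple variance-reduction baseline
    seq_log_prob = torch.stack(log_probs, dim=1).sum(dim=1)
    loss = -((reward - baseline).detach() * seq_log_prob).mean()
    opt.zero_grad(); loss.backward(); opt.step()
    return reward.mean().item()

for step in range(3):                   # a few illustrative updates
    print("mean reward:", rl_step(target_emotion=2))
```

In this sketch the classifier reward replaces emotion labels entirely, which is what allows training on unlabeled symbolic music data of a similar genre and format.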