An emotion recognition method based on multimodal data fusion
In the field of human-computer interaction, endowing machines with the ability to recognize and understand human emotional states has become a key issue. Physiological signals, as a direct reflection of human physiological activity, provide an effective way to objectively evaluate emotional states, and emotion recognition based on multimodal physiological signals is receiving widespread attention from researchers. This study proposes a deep learning architecture based on the Convolutional Neural Network (CNN) framework, which effectively extracts emotion-related spatiotemporal features from multiple physiological electrical signals. The model comprehensively exploits the emotional information carried by multimodal data to achieve more accurate and fine-grained recognition of emotional states. We conducted experiments on the multimodal emotion database DEAP. The results show that the model surpassed the baseline model on both emotion recognition tasks, which not only validates the effectiveness of the proposed model but also demonstrates its clear advantages over traditional models.
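The core idea described above, extracting features from each physiological modality with CNN-style operations and then fusing them, can be illustrated with a minimal NumPy sketch. This is a hypothetical toy example, not the paper's actual architecture: it uses a single hand-chosen 1D convolution kernel per modality, ReLU activation, global average pooling, and simple feature-level fusion by concatenation.

```python
import numpy as np

def conv1d(signal, kernel):
    """Valid-mode 1D convolution: slide the kernel over the signal."""
    n = len(signal) - len(kernel) + 1
    return np.array([np.dot(signal[i:i + len(kernel)], kernel) for i in range(n)])

def extract_feature(signal, kernel):
    """One CNN-style stage: convolution -> ReLU -> global average pooling."""
    activated = np.maximum(conv1d(signal, kernel), 0.0)  # ReLU
    return activated.mean()  # pooling reduces the map to one scalar feature

# Hypothetical data: two modalities sampled at 128 Hz (DEAP's preprocessed rate)
rng = np.random.default_rng(0)
eeg = rng.standard_normal(128)  # one second of a single EEG channel
gsr = rng.standard_normal(128)  # skin-conductance window of the same length

kernel = np.array([0.25, 0.5, 0.25])  # illustrative smoothing filter

# Feature-level fusion: concatenate the per-modality features into one vector,
# which a classifier head would then map to an emotion label.
fused = np.array([extract_feature(eeg, kernel),
                  extract_feature(gsr, kernel)])
print(fused.shape)  # -> (2,)
```

In a real model each modality would have many learned kernels and several stacked layers, but the structure is the same: modality-specific feature extraction followed by a shared fusion stage.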
deep learning; emotion recognition; multimodal; physiological signals