
Convolutional Transformer EEG Emotion Recognition Model Based on Multi-domain Information Fusion

Current emotion recognition methods for electroencephalogram (EEG) signals seldom fuse spatial, temporal, and frequency information, and most of them extract only local EEG features, which limits their ability to capture global correlations. This paper proposes an EEG emotion recognition method based on a three-dimensional-feature convolutional neural network Transformer (3D-CNN-Transformer mechanism, 3D-CTM) model with multi-domain information fusion. The method first designs a 3D feature structure, based on the characteristics of EEG signals, that simultaneously fuses their spatial, temporal, and frequency information. A convolutional neural network module then learns deep features from the fused multi-domain information, after which a connected Transformer self-attention module extracts global correlations within the feature information. Finally, global average pooling integrates the feature information for classification. Experimental results show that the 3D-CTM model achieves an average accuracy of 96.36% for three-class classification on the SEED dataset and 87.44% for four-class classification on the SEED-IV dataset, effectively improving emotion recognition accuracy.
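As a reading aid, the following is a minimal PyTorch sketch of the pipeline the abstract describes: 3D spatial-temporal-frequency feature maps passed through a convolutional front end, a Transformer encoder applying self-attention for global correlations across time steps, and global average pooling before the classifier. The 9x9 electrode grid, five frequency bands, segment length, 2D-convolution front end, and all layer sizes below are illustrative assumptions, not the configuration reported in the paper.

# Minimal, illustrative sketch of a CNN + Transformer EEG emotion classifier.
# All shapes and hyper-parameters (9x9 electrode grid, 5 frequency bands,
# segment length, channel widths, heads/layers) are assumptions for
# demonstration only, not the exact settings of the 3D-CTM paper.
import torch
import torch.nn as nn

class CTMSketch(nn.Module):
    def __init__(self, bands=5, grid=9, seg_len=6, d_model=64,
                 n_heads=4, n_layers=2, n_classes=3):
        super().__init__()
        # CNN front end: learns joint spatial-frequency features from each
        # (bands, grid, grid) map; time steps are folded into the batch.
        self.cnn = nn.Sequential(
            nn.Conv2d(bands, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, d_model, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),            # -> (N*T, d_model, 1, 1)
        )
        # Transformer encoder: models global dependencies across time steps.
        enc_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=128,
            batch_first=True)
        self.transformer = nn.TransformerEncoder(enc_layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):
        # x: (N, T, bands, grid, grid) -- T 3D feature maps per sample
        n, t = x.shape[:2]
        feats = self.cnn(x.flatten(0, 1)).flatten(1)   # (N*T, d_model)
        feats = feats.view(n, t, -1)                   # (N, T, d_model)
        feats = self.transformer(feats)                # global self-attention
        return self.head(feats.mean(dim=1))            # GAP over time, then classify

if __name__ == "__main__":
    model = CTMSketch()
    dummy = torch.randn(8, 6, 5, 9, 9)   # batch of 8 hypothetical EEG segments
    print(model(dummy).shape)            # torch.Size([8, 3])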

electroencephalogram (EEG); emotion recognition; convolutional neural network; Transformer; self-attention

Zhang Xuejun, Wang Tianchen, Wang Zetian


College of Electronic and Optical Engineering & College of Flexible Electronics (Future Technology), Nanjing University of Posts and Telecommunications, Nanjing 210023

National and Local Joint Engineering Laboratory of RF Integration and Micro-Assembly Technology, Nanjing University of Posts and Telecommunications, Nanjing 210023


2024

Journal of Data Acquisition and Processing
Chinese Institute of Electronics; Signal Processing Society of the China Instrument and Control Society; Weak Signal Detection Society of the China Instrument and Control Society and the Chinese Physical Society; Nanjing University of Aeronautics and Astronautics


Indexed in: CSTPCD; Peking University Core Journals
Impact factor: 0.679
ISSN: 1004-9037
Year, Volume (Issue): 2024, 39(6)