
Research on a Transformer-based multimodal fusion unobtrusive psychological classification model using physiological signals

College students currently face serious psychological stress. To address the resulting psychological classification problem, this study designs a multimodal fusion, unobtrusive psychological classification algorithm based on the Transformer architecture. The algorithm introduces a multi-scale attention module to process facial expression and pulse wave features, adopts an interaction-attention mechanism in the feature fusion stage, and incorporates synergistic higher-order feature expression into the Transformer architecture. The results show that in pulse wave denoising, the differences produced by the Transformer-based method have a mean of 0.03 and a 95% interval of -1.09 to 1.11, with the differences of all test groups falling within this range, clearly outperforming traditional denoising methods. The proposed algorithm achieves an average accuracy of 91.36%, exceeding the three comparison algorithms by 17.98%, 9.22%, and 6.79%, respectively, with an average recall of 88.51% and an average F1 score of 90.82%. These results indicate that the proposed algorithm is accurate and reliable for psychological classification and stress identification, offering new ideas and methods for future psychological research and applications.
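The abstract does not detail the fusion block, so the following is only a minimal sketch of one plausible reading of the interaction-attention mechanism: bidirectional cross-attention between facial-expression tokens and pulse-wave tokens, written in PyTorch. All names, dimensions, and the two-class head are illustrative assumptions, not the paper's implementation.

import torch
import torch.nn as nn

class InteractionAttentionFusion(nn.Module):
    """Hypothetical cross-modal fusion block: each modality attends to the other."""
    def __init__(self, dim=128, heads=4):
        super().__init__()
        self.face_to_pulse = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.pulse_to_face = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm_face = nn.LayerNorm(dim)
        self.norm_pulse = nn.LayerNorm(dim)
        self.classifier = nn.Linear(2 * dim, 2)  # assumed binary stress label

    def forward(self, face_tokens, pulse_tokens):
        # Queries come from one modality, keys/values from the other,
        # so each stream is re-weighted by cross-modal relevance.
        f, _ = self.face_to_pulse(face_tokens, pulse_tokens, pulse_tokens)
        p, _ = self.pulse_to_face(pulse_tokens, face_tokens, face_tokens)
        f = self.norm_face(face_tokens + f)   # residual + norm, Transformer-style
        p = self.norm_pulse(pulse_tokens + p)
        fused = torch.cat([f.mean(dim=1), p.mean(dim=1)], dim=-1)  # pool and join
        return self.classifier(fused)

# Example: batch of 8, with 16 facial tokens and 32 pulse-wave tokens of width 128.
model = InteractionAttentionFusion()
logits = model(torch.randn(8, 16, 128), torch.randn(8, 32, 128))
print(logits.shape)  # torch.Size([8, 2])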
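The abstract does not state how the 95% interval for the denoising differences was computed; a common choice in method-agreement studies is Bland-Altman limits of agreement, mean ± 1.96 × SD of the differences, which is consistent with the reported figures (implied SD ≈ (1.11 - (-1.09)) / (2 * 1.96) ≈ 0.56). A sketch on synthetic data, illustrating only the formula:

import numpy as np

rng = np.random.default_rng(0)
reference = rng.normal(size=200)                         # stand-in for the clean pulse wave
denoised = reference + rng.normal(0.03, 0.56, size=200)  # stand-in for denoised output

diff = denoised - reference              # per-sample difference
mean_diff = diff.mean()
half_width = 1.96 * diff.std(ddof=1)     # 95% limits of agreement
print(f"mean difference: {mean_diff:.2f}, "
      f"95% limits: [{mean_diff - half_width:.2f}, {mean_diff + half_width:.2f}]")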

facial expression; pulse wave; Transformer architecture; multi-modal fusion; psychological classification

白洁 (Bai Jie)


Xi'an Medical University, Xi'an 710021, China


2021 Province-wide Research Project on Communist Youth League and Youth Work

GQTSXSW20210312

2024

Automation & Instrumentation (自动化与仪器仪表)
Chongqing Industrial Automation Instrument Research Institute; Chongqing Association of Automation and Instrumentation


CSTPCD
Impact factor: 0.327
ISSN:1001-9227
Year, volume (issue): 2024, (8)