Research on Depression Detection Algorithm Based on Facial Motion Features Extracted by Vision Sensor
Although significant progress has been made in automatic diagnosis systems for depression, most existing work focuses on combining features from multiple modalities to improve classification accuracy, which incurs considerable space-time overhead and feature synchronization problems. A unimodal depression detection framework based on facial expressions and facial motion features is proposed. Firstly, a robust feature extraction method based on ratios of facial landmark distances is proposed, and it is theoretically proved that this feature is invariant to vertical and horizontal translation, depth translation, rotation, and flipping. The features extracted by the proposed method preserve the spatial topological relationships among facial landmarks and maintain the temporal correlation between adjacent frames. Then, a novel approach is provided for the classification of long depression videos: the classification task over a long video is decomposed into scoring tasks over multiple short-sequence units, and the final depression classification result is obtained through a defined score aggregation function. On the DAIC-WOZ dataset, the proposed detection framework improves classification performance, achieving an F1 score of 0.85 and outperforming other current unimodal depression detection models.
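The two components described above lend themselves to a brief illustration. Below is a minimal sketch, assuming 2D facial landmarks per frame from a generic 68-point detector; the specific landmark pairs, reference distance, unit length, and mean-based aggregation are illustrative assumptions, not the paper's exact formulation. It only shows why distance ratios are invariant to translation, rotation, flipping, and uniform scaling (a proxy for depth translation), and how a long video can be decomposed into short-sequence units whose scores are aggregated.

```python
# A sketch under stated assumptions: 68-point 2D landmarks, hypothetical landmark
# pairs and reference distance, and mean-score aggregation as a placeholder for
# the paper's defined aggregation function.
import numpy as np

def ratio_features(landmarks, ref_pair=(36, 45),
                   pairs=((48, 54), (51, 57), (37, 41), (62, 66))):
    """Distance ratios between landmark pairs and a reference pair.

    landmarks: (T, N, 2) array of 2D landmark coordinates over T frames.
    Inter-landmark distances are invariant to in-plane translation, rotation,
    and flipping; dividing by a reference distance removes uniform scaling,
    which approximates invariance to depth translation.
    """
    def dist(a, b):
        return np.linalg.norm(landmarks[:, a] - landmarks[:, b], axis=-1)

    ref = dist(*ref_pair) + 1e-8                      # reference length per frame
    return np.stack([dist(a, b) / ref for a, b in pairs], axis=-1)  # (T, len(pairs))

def split_units(feats, unit_len=64, stride=32):
    """Decompose a long feature sequence into overlapping short-sequence units."""
    units = [feats[s:s + unit_len]
             for s in range(0, len(feats) - unit_len + 1, stride)]
    return np.array(units)                            # (num_units, unit_len, dim)

def aggregate_scores(unit_scores, threshold=0.5):
    """Placeholder aggregation: mean unit score compared against a threshold."""
    video_score = float(np.mean(unit_scores))
    return video_score, int(video_score >= threshold)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    landmarks = rng.normal(size=(600, 68, 2))         # 600 frames of stand-in landmarks
    units = split_units(ratio_features(landmarks))
    unit_scores = rng.uniform(size=len(units))        # stand-in for a trained unit scorer
    score, label = aggregate_scores(unit_scores)
    print(f"video score={score:.3f}, predicted label={label}")
```

In practice the per-unit scores would come from the trained short-sequence classifier rather than random values, and the aggregation function would be the one defined in the paper.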