Gait Emotion Recognition Based on a Multi-scale Partitioning Directed Spatio-temporal Graph
To improve the accuracy of gait emotion recognition by capturing multi-scale, long-range, and spatio-temporal dependencies between skeleton nodes, a method comprising three parts is proposed in this paper. Firstly, a partitioned directed spatio-temporal graph construction method is proposed, which connects the nodes of all frames with directed edges according to the region to which each node belongs. Secondly, a multi-scale partition aggregation and fusion method is proposed, which updates the graph nodes via graph deep learning and fuses the features of similar nodes. Lastly, a Multi-scale Partition Directed Adaptive Spatio-Temporal Graph Convolutional Network (MPDAST-GCN) is proposed, which constructs a graph along the temporal dimension to capture features of distant frame nodes and adaptively learns the feature data of each frame. The MPDAST-GCN classifies input gait data into four emotion types: happy, sad, angry, and normal. Experimental results on the Emotion-Gait dataset demonstrate that the proposed method outperforms state-of-the-art methods by 6% in accuracy.
Keywords: Gait emotion recognition; Emotion recognition; Graph deep learning
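To make the abstract's core idea concrete, the following is a minimal sketch (not the authors' code) of one adaptive directed spatio-temporal graph convolution block in PyTorch. The directed adjacency matrix, the learnable adjacency offset, the layer sizes, and the class `AdaptiveSTGCNBlock` are all illustrative assumptions; the actual MPDAST-GCN architecture, partitioning scheme, and multi-scale fusion are specified in the paper itself.

```python
import torch
import torch.nn as nn

class AdaptiveSTGCNBlock(nn.Module):
    """Sketch of a spatio-temporal graph conv block with a fixed directed
    adjacency plus a learnable (adaptive) adjacency offset."""
    def __init__(self, in_ch, out_ch, A, t_kernel=9):
        super().__init__()
        # Fixed directed adjacency (num_nodes x num_nodes), row-normalized.
        self.register_buffer("A", A / A.sum(dim=1, keepdim=True).clamp(min=1e-6))
        # Learnable residual adjacency: lets the model adapt the graph per layer.
        self.B = nn.Parameter(torch.zeros_like(A))
        self.spatial = nn.Conv2d(in_ch, out_ch, kernel_size=1)
        pad = (t_kernel - 1) // 2
        self.temporal = nn.Conv2d(out_ch, out_ch, kernel_size=(t_kernel, 1),
                                  padding=(pad, 0))
        self.relu = nn.ReLU()

    def forward(self, x):
        # x: (batch, channels, frames, nodes)
        adj = self.A + self.B                       # fixed directed + adaptive part
        x = torch.einsum("nctv,vw->nctw", x, adj)   # propagate along directed edges
        x = self.relu(self.spatial(x))              # per-node feature transform
        return self.relu(self.temporal(x))          # aggregate across frames

# Toy usage: 16 skeleton joints, 48 frames, 3-channel joint coordinates.
A = torch.eye(16) + torch.rand(16, 16).gt(0.8).float()  # placeholder directed graph
block = AdaptiveSTGCNBlock(3, 64, A)
out = block(torch.randn(2, 3, 48, 16))
print(out.shape)  # torch.Size([2, 64, 48, 16])
```

In a full model, several such blocks would be stacked and followed by global pooling and a four-way classifier head (happy, sad, angry, normal); the adaptive term `B` is what allows each layer to learn connections beyond the fixed region-based directed edges.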