This paper proposes a multimodal emotion computation method for aviation security screening, emphasizing the significance of safety in the aviation industry. The research establishes a model for alerting on abnormal behavior among aviation passengers by integrating emotional cues from multiple passenger modalities. Using audiovisual data from the GEMEP corpus to simulate potential abnormal behaviors, the study focuses on facial expressions, body postures, and speech as the three modalities for recognition. After recognition, two decision-level fusion methods are employed: assigning weights to the emotion recognition model of each modality, and allocating different per-emotion weights within these models. This approach enables the detection of passenger emotions such as anger, disgust, anxiety, fear, and sadness, supporting a graded alert system for abnormal behavior. The findings show that allocating different per-emotion weights within the emotion recognition models yields the optimal emotion computation model. This model effectively identifies emotions indicating potential abnormal behavior among passengers, achieving an overall recognition accuracy of 82.76%. Specifically, the recognition accuracies for anger, disgust, anxiety, fear, and sadness were 81.9%, 78.5%, 81.3%, 83.2%, and 81.7%, respectively.
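The per-emotion weighted decision-level fusion described above can be illustrated with a minimal sketch. The modality probability vectors and the weight matrix below are hypothetical placeholders for illustration only, not the weights or outputs reported in the paper.

```python
import numpy as np

# Hypothetical class-probability outputs of the three per-modality
# emotion recognizers (order: anger, disgust, anxiety, fear, sadness).
face_probs   = np.array([0.60, 0.10, 0.10, 0.15, 0.05])
body_probs   = np.array([0.40, 0.20, 0.15, 0.15, 0.10])
speech_probs = np.array([0.55, 0.05, 0.20, 0.10, 0.10])

# Assumed per-emotion weights for each modality (rows: face, body, speech;
# columns: the five emotions). Each column sums to 1 so the fused score
# stays a convex combination of the modality probabilities.
W = np.array([
    [0.5, 0.4, 0.3, 0.4, 0.4],   # face
    [0.2, 0.3, 0.3, 0.2, 0.3],   # body
    [0.3, 0.3, 0.4, 0.4, 0.3],   # speech
])

P = np.vstack([face_probs, body_probs, speech_probs])

# Decision-level fusion: weight each modality's probability per emotion,
# then choose the emotion with the highest fused score.
fused = (W * P).sum(axis=0)
emotions = ["anger", "disgust", "anxiety", "fear", "sadness"]
print(dict(zip(emotions, fused.round(3))))
print("predicted emotion:", emotions[int(np.argmax(fused))])
```

The alternative fusion scheme mentioned in the abstract would instead use a single scalar weight per modality (identical across emotions); the per-emotion scheme generalizes it by letting each modality contribute differently to each emotion class.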