Image-Text Sentiment Analysis Based on a Dual Attention Mechanism
Traditional sentiment analysis methods cannot effectively handle the large volume of multimodal image-text data on social platforms, and their performance in multimodal feature fusion is poor. To address this, a multimodal sentiment analysis model based on dual-attention fusion is established by combining attention mechanisms with a feedforward neural network. The model uses pre-trained models to extract text and image features, strengthens the public features shared across modalities with a cross-modal feature fusion module, extracts effective information from the private features of each single modality with a unimodal self-attention module, and finally concatenates and fuses the multimodal features to obtain an efficient representation of the multimodal data. Validation experiments on a Twitter image-text dataset, including comparisons with a range of methods and ablation experiments on the individual modalities, confirm that the proposed model achieves good sentiment classification performance.
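To make the described pipeline concrete, the following is a minimal PyTorch sketch of a dual-attention fusion head of the kind outlined above. The module names, dimensions, pooling strategy, and the use of nn.MultiheadAttention are illustrative assumptions, not the paper's exact implementation; the pre-trained text and image encoders mentioned in the abstract are taken as given, with their projected features passed in as inputs.

```python
import torch
import torch.nn as nn

# Sketch only: layer sizes and attention configuration are assumptions.

class CrossModalFusion(nn.Module):
    """Strengthens shared (public) features by attending across modalities."""
    def __init__(self, dim: int = 256, heads: int = 4):
        super().__init__()
        self.text_to_image = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.image_to_text = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, text_feat, image_feat):
        # Text queries attend over image keys/values, and vice versa.
        t2i, _ = self.text_to_image(text_feat, image_feat, image_feat)
        i2t, _ = self.image_to_text(image_feat, text_feat, text_feat)
        return t2i, i2t


class UnimodalSelfAttention(nn.Module):
    """Extracts salient information from the private features of one modality."""
    def __init__(self, dim: int = 256, heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.ffn = nn.Sequential(
            nn.Linear(dim, dim * 2), nn.ReLU(), nn.Linear(dim * 2, dim)
        )

    def forward(self, feat):
        attended, _ = self.attn(feat, feat, feat)
        return self.ffn(attended)


class DualAttentionSentimentModel(nn.Module):
    def __init__(self, dim: int = 256, num_classes: int = 3):
        super().__init__()
        self.cross = CrossModalFusion(dim)
        self.text_self = UnimodalSelfAttention(dim)
        self.image_self = UnimodalSelfAttention(dim)
        # Concatenate the four feature streams, then classify.
        self.classifier = nn.Sequential(
            nn.Linear(dim * 4, dim), nn.ReLU(), nn.Linear(dim, num_classes)
        )

    def forward(self, text_feat, image_feat):
        # text_feat / image_feat: (batch, seq_len, dim) projections of
        # pre-trained text and image encoder outputs.
        t2i, i2t = self.cross(text_feat, image_feat)
        t_priv = self.text_self(text_feat)
        v_priv = self.image_self(image_feat)
        # Mean-pool each stream over its sequence dimension, then concatenate.
        fused = torch.cat(
            [x.mean(dim=1) for x in (t2i, i2t, t_priv, v_priv)], dim=-1
        )
        return self.classifier(fused)


if __name__ == "__main__":
    model = DualAttentionSentimentModel()
    text = torch.randn(2, 32, 256)   # dummy pre-extracted text features
    image = torch.randn(2, 49, 256)  # dummy pre-extracted image patch features
    print(model(text, image).shape)  # torch.Size([2, 3])
```

The sketch mirrors the abstract's structure: a cross-modal module for shared features, a per-modality self-attention-plus-feedforward module for private features, and a final concatenation-and-fusion step feeding the sentiment classifier.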