Optimization of Soft Prompt Vectors Based on LSTM and Position Enhancement
Soft prompt learning is an emerging method for adapting pretrained language models. However, the vectors produced by soft prompt learning may lack sequential structure, which impairs the model's ability to capture information at specific positions and degrades performance. To address this, we investigate the sequential structure of soft prompt vectors and its influence on model performance. We find that soft prompt vectors exhibit sequence-sensitivity issues across different types of language models, model sizes, downstream tasks, and prompt lengths. In response, we propose a soft prompt sorting network based on LSTM and position enhancement. First, an improved LSTM network is used to optimize the ordering of soft prompts: a prompt selection gate is added to each LSTM gate to capture sequence information and generate well-ordered soft prompt vectors. Second, a position enhancement module is introduced into the sorting process, optimizing the order by combining absolute and relative position information. Experiments on the GLUE benchmark show that the proposed method yields an average performance improvement of 3.1% over the baselines.
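To make the first component concrete, the following is a minimal PyTorch sketch of an LSTM cell in which each of the four standard gates is modulated by an extra sigmoid "prompt selection gate", under the assumption that the selection gate multiplicatively rescales each gate. All class and variable names here (e.g., PromptSelectLSTMCell, PromptSorter) are our own illustrative choices, not identifiers from the paper.

```python
# Sketch: LSTM cell with a per-gate prompt selection gate (assumption:
# the selection gate is a sigmoid that multiplies each LSTM gate).
import torch
import torch.nn as nn

class PromptSelectLSTMCell(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        # Standard LSTM affine maps: 4 gates over [x_t; h_{t-1}].
        self.gates = nn.Linear(2 * dim, 4 * dim)
        # One selection gate per LSTM gate, also over [x_t; h_{t-1}].
        self.select = nn.Linear(2 * dim, 4 * dim)

    def forward(self, x, state):
        h, c = state
        z = torch.cat([x, h], dim=-1)
        i, f, g, o = self.gates(z).chunk(4, dim=-1)
        # Prompt selection gates s_* in (0, 1) rescale each LSTM gate.
        s_i, s_f, s_g, s_o = torch.sigmoid(self.select(z)).chunk(4, dim=-1)
        i = torch.sigmoid(i) * s_i
        f = torch.sigmoid(f) * s_f
        g = torch.tanh(g) * s_g
        o = torch.sigmoid(o) * s_o
        c = f * c + i * g
        h = o * torch.tanh(c)
        return h, c

class PromptSorter(nn.Module):
    """Re-encodes a sequence of soft prompt vectors into order-aware vectors."""
    def __init__(self, dim: int):
        super().__init__()
        self.cell = PromptSelectLSTMCell(dim)

    def forward(self, prompts):  # prompts: (batch, length, dim)
        b, n, d = prompts.shape
        h = prompts.new_zeros(b, d)
        c = prompts.new_zeros(b, d)
        outs = []
        for t in range(n):
            h, c = self.cell(prompts[:, t], (h, c))
            outs.append(h)
        return torch.stack(outs, dim=1)
```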
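The second component can be sketched similarly. The paper states only that absolute and relative position information are combined during sorting; the sketch below assumes a learned absolute position embedding added to each prompt vector and a learned relative-offset bias injected into attention-style pairwise scores. The scoring scheme and all names are our assumptions.

```python
# Sketch: position enhancement combining absolute position embeddings
# with a relative-position bias (combination scheme is our assumption).
import torch
import torch.nn as nn

class PositionEnhancer(nn.Module):
    def __init__(self, dim: int, max_len: int = 64):
        super().__init__()
        self.abs_pos = nn.Embedding(max_len, dim)         # absolute positions
        self.rel_bias = nn.Embedding(2 * max_len - 1, 1)  # relative offsets
        self.max_len = max_len

    def forward(self, prompts):  # prompts: (batch, length, dim)
        b, n, d = prompts.shape
        pos = torch.arange(n, device=prompts.device)
        # Absolute enhancement: inject position identity into each vector.
        x = prompts + self.abs_pos(pos)
        # Relative enhancement: pairwise offset indices, shifted to be >= 0.
        rel = pos[None, :] - pos[:, None] + self.max_len - 1  # (n, n)
        bias = self.rel_bias(rel).squeeze(-1)                 # (n, n)
        # Position-aware pairwise scores over the prompt sequence.
        scores = x @ x.transpose(-1, -2) / d ** 0.5 + bias
        attn = scores.softmax(dim=-1)
        return attn @ x                                       # (batch, n, d)
```

In a full pipeline of this kind, the enhanced vectors would plausibly be fed into the sorting network before the resulting prompts are prepended to the model input; the exact wiring is not specified in the abstract.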