Contrastive Learning-based Prompt Generation Method for Large-scale Language Model Reverse Dictionary Task
The reverse dictionary task is an emerging task that aims to find the word corresponding to a given definition. Large-scale language models offer new possibilities for this task, but their performance depends on the quality of the prompt sentences. To this end, this paper proposes a contrastive learning-based prompt generation method. The method extracts definition semantics at multiple semantic levels and enhances the model's generalization ability by incorporating negative examples through contrastive learning. With this method, the target word can be narrowed to a small candidate range, from which a large model selects the most semantically consistent word. Experimental results show that the proposed method effectively improves the performance of large-scale language models on the reverse dictionary task: the prompt generation model produces a candidate range containing the target word with 94.7% probability, and the large-scale language model selects the target word directly with 58.03% probability and includes it among five candidate words with 74.55% probability.
Keywords: Reverse dictionary; Large-scale language model; Contrastive learning; Multiple semantic scales; Contrastive loss
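The two-stage pipeline summarized above (train a definition encoder with a contrastive loss against negative examples, then retrieve a small candidate range for the large model to choose from) can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the InfoNCE-style loss with in-batch negatives, the embedding dimensions, and the function names are all assumptions made for clarity.

```python
import numpy as np

def info_nce_loss(def_embs, word_embs, temperature=0.07):
    """Contrastive loss: for each definition, the matching word embedding
    (same row index) is the positive; all other words in the batch act as
    negative examples. This is an assumed InfoNCE-style formulation."""
    d = def_embs / np.linalg.norm(def_embs, axis=1, keepdims=True)
    w = word_embs / np.linalg.norm(word_embs, axis=1, keepdims=True)
    logits = (d @ w.T) / temperature
    # numerically stable log-softmax over each row
    logits = logits - logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # negative log-likelihood of the diagonal (positive) pairs
    return -np.mean(np.diag(log_probs))

def top_k_candidates(def_emb, vocab_embs, vocab, k=5):
    """Narrow the target word to a small range by cosine similarity;
    the candidate list would then be placed into the prompt for the
    large language model to pick from."""
    d = def_emb / np.linalg.norm(def_emb)
    v = vocab_embs / np.linalg.norm(vocab_embs, axis=1, keepdims=True)
    sims = v @ d
    idx = np.argsort(-sims)[:k]
    return [vocab[i] for i in idx]
```

In training, minimizing `info_nce_loss` pulls each definition embedding toward its target word and pushes it away from the batch negatives; at inference, `top_k_candidates` yields the small range that the reported 94.7% containment figure refers to.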