Chinese Abstract Meaning Representation (CAMR) parsing aims to convert natural language sentences into abstract semantic representations, a complex structured prediction task. Traditional approaches often exploit graph features of the semantic representations to design specialized models or rely on multi-stage parsing, but these methods typically require complex, purpose-built neural network architectures. Large language models have recently demonstrated remarkable performance on a wide range of natural language processing tasks. In this evaluation, we apply a large language model directly to CAMR parsing via zero-shot learning, few-shot learning, and fine-tuning with both LoRA and full-parameter approaches. We obtain promising evaluation results and discuss each approach in detail.
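As a rough illustration of the fine-tuning setting mentioned in the abstract (the abstract does not report concrete model names, hyperparameters, or prompt formats, so the base model, LoRA rank, and linearized CAMR example below are assumptions, not the configuration used in the paper), a LoRA adapter can be attached to a causal language model with the Hugging Face peft library and trained on sentence-to-CAMR pairs:

```python
# Minimal sketch of LoRA fine-tuning for sentence -> CAMR generation.
# The base model, LoRA hyperparameters, and prompt/graph format are
# illustrative assumptions, not the setup reported in the paper.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base_model = "bigscience/bloom-560m"  # hypothetical base model
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(base_model)

# Wrap the model with low-rank adapters; only the adapter weights are
# updated, in contrast to the full-parameter fine-tuning variant.
lora_config = LoraConfig(
    r=8,                                   # rank of the low-rank updates
    lora_alpha=16,                         # scaling factor
    lora_dropout=0.05,
    target_modules=["query_key_value"],    # attention projections in BLOOM
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # small fraction of total parameters

# One training example: the sentence as the prompt, a linearized CAMR
# graph as the target completion (illustrative format).
prompt = "句子: 他 喜欢 读书\nCAMR:"
target = " (x2 / 喜欢-01 :arg0 (x1 / 他) :arg1 (x3 / 读书-01))"
inputs = tokenizer(prompt + target, return_tensors="pt")
# From here, a standard causal-LM loss over `inputs` drives the adapter updates.
```

Zero-shot and few-shot learning follow the same prompt format, with zero or a handful of sentence-CAMR demonstrations prepended instead of any parameter updates.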
Yifei Yang, Ziming Cheng, Hai Zhao
Department of Computer Science and Engineering, Shanghai Jiao Tong University
Chinese Abstract Meaning Representation, AMR parsing, large language models
Chinese National Conference on Computational Linguistics
Harbin (CN)
22nd Chinese National Conference on Computational Linguistics (CCL 2023): Evaluations