Context-aware generative prompt tuning for relation extraction

Abstract Relation extraction is designed to extract semantic relations between predefined entities from text. Recently, prompt tuning has achieved promising results in relation extraction; its core idea is to insert a template into the input and model relation extraction as a masked language modeling (MLM) problem. However, existing prompt tuning approaches ignore the rich semantic information between entities and relations, resulting in suboptimal performance. In addition, because an MLM task can only identify one relation at a time, it cannot handle the widespread problem of entity overlap in relation extraction. To this end, we propose a novel Context-Aware Generative Prompt Tuning (CAGPT) method, which ensures the comprehensiveness of triplet extraction by modeling relation extraction as a generative task and outputs all triplets related to the same entity at one time to overcome the entity overlap problem. Moreover, we connect entities and relations with natural language and inject entity and relation information into the designed template, which makes full use of the rich semantic information between entities and relations. Extensive experimental results on four benchmark datasets demonstrate the effectiveness of the proposed method.
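The following is a minimal sketch of the generative-prompt idea summarized in the abstract: a seq2seq model is prompted to generate every triplet involving a given entity in one pass, instead of filling a single mask. The template wording, the choice of t5-base, and the output format are illustrative assumptions, not the paper's exact design, and an off-the-shelf model would need fine-tuning before its generations are reliable; the snippet only shows the shape of the input and output.

from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

def build_prompt(sentence: str, entity: str) -> str:
    # Inject the entity into a natural-language template so the model
    # can generate all triplets that share this entity at one time,
    # which is how a generative formulation sidesteps entity overlap.
    # (Template text is a hypothetical example, not the paper's template.)
    return (
        f"context: {sentence} "
        f"task: list all relation triplets whose head entity is {entity}, "
        f"formatted as head ; relation ; tail and separated by '|'."
    )

sentence = ("Barack Obama was born in Honolulu and served as president "
            "of the United States.")
prompt = build_prompt(sentence, "Barack Obama")

inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))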

Xiaoyong Liu, Handong Wen, Chunlin Xu, Zhiguo Du, Huihui Li, Miao Hu


Guangdong Polytechnic Normal University

South China Agricultural University

2024

International Journal of Machine Learning and Cybernetics

EI, SCI
ISSN:1868-8071
Year, Volume (Issue): 2024, 15(12)