High Technology Letters (English edition), 2024, Vol. 30, Issue 1: 13-22. DOI: 10.3772/j.issn.1006-6748.2024.01.002

SHEL: a semantically enhanced hardware-friendly entity linking method

QI Donglin (亓东林), CHEN Shudong, DU Rong, TONG Da, YU Yong

Author Information

  • 1. Institute of Microelectronics of Chinese Academy of Sciences, Beijing 100029, P.R. China; University of Chinese Academy of Sciences, Beijing 100190, P.R. China

Abstract

With the help of pre-trained language models, the accuracy of the entity linking task has made great strides in recent years. However, most models with excellent performance require fine-tuning on a large amount of training data using large pre-trained language models, which poses a hardware threshold for accomplishing this task. Some researchers have achieved competitive results with less training data through ingenious methods, such as utilizing information provided by the named entity recognition model. This paper presents a novel semantic-enhancement-based entity linking approach, named semantically enhanced hardware-friendly entity linking (SHEL), which is designed to be hardware friendly and efficient while maintaining good performance. Specifically, SHEL's semantic enhancement approach consists of three aspects: (1) semantic compression of entity descriptions using a text summarization model; (2) maximizing the capture of mention contexts using asymmetric heuristics; (3) calculating a fixed-size mention representation through pooling operations. This series of semantic enhancement methods effectively improves the model's ability to capture semantic information while taking hardware constraints into account, and improves the model's convergence speed by more than 50% compared with the strong baseline model proposed in this paper. In terms of performance, SHEL is comparable to previous methods, with superior performance on six well-established datasets, even though SHEL is trained using a smaller pre-trained language model as the encoder.
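Two of the three enhancement steps named in the abstract, the asymmetric context heuristic (2) and the pooled fixed-size mention representation (3), can be sketched in plain Python. The function names, the budget-redistribution rule, and the choice of mean pooling below are illustrative assumptions for exposition, not the paper's exact implementation.

```python
def asymmetric_window(tokens, m_start, m_end, budget):
    """Clip the context around a mention to `budget` tokens.

    Each side of the mention provisionally gets half of the leftover
    budget; if one side has fewer tokens than its share, the other side
    absorbs the unused budget (the 'asymmetric' heuristic, assumed here).
    """
    mention = tokens[m_start:m_end]
    remaining = max(budget - len(mention), 0)
    left, right = tokens[:m_start], tokens[m_end:]
    l_take = min(len(left), remaining // 2)       # provisional left share
    r_take = min(len(right), remaining - l_take)  # right absorbs leftover
    l_take = min(len(left), remaining - r_take)   # left absorbs leftover
    return left[len(left) - l_take:] + mention + right[:r_take]

def mention_representation(hidden, m_start, m_end):
    """Mean-pool encoder hidden states over the mention span, yielding a
    fixed-size vector regardless of how many tokens the mention has."""
    span = hidden[m_start:m_end]
    dim = len(span[0])
    return [sum(vec[d] for vec in span) / len(span) for d in range(dim)]
```

For example, with a 10-token budget and a mention near the start of the sentence, almost all of the budget flows to the right context instead of being wasted on the short left side.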

Key words

entity linking (EL) / pre-trained models / knowledge graph / text summarization / semantic enhancement

Funding

Beijing Municipal Science and Technology Program(Z231100001323004)

Publication Year

2024

Journal Information

High Technology Letters (English edition)
Institute of Scientific and Technical Information of China (ISTIC)
Impact factor: 0.058
ISSN: 1006-6748
References: 31