
Data law Q&A system based on BERT and fine-grained feature extraction

A data law question-and-answer (Q&A) system based on the bidirectional encoder representations from transformers (BERT) model and fine-grained feature extraction was proposed to provide accurate and professional legal consulting services. First, the strong contextual understanding ability of the BERT model was leveraged to extract deep semantic features from data law texts. Then, a fine-grained feature extraction layer was introduced, which used an attention mechanism to focus on the parts of the text most relevant to data law Q&A. Finally, the model was trained and evaluated on a collected legal Q&A dataset. The results showed that, compared with several traditional single models, the proposed model improved key performance indicators such as accuracy, precision, recall, and F1 score, suggesting that the system can comprehend and address complex data law questions more effectively and provide higher-quality Q&A services for both data law professionals and the general public.
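To make the described architecture concrete, the following is a minimal sketch in PyTorch with Hugging Face Transformers: a BERT encoder supplies token-level semantic features, and an attention-based fine-grained layer scores and pools the tokens most relevant to the legal question before a classification head. The checkpoint bert-base-chinese, the class names, the label count, and the framing of the task as answer classification are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch (not the authors' code): BERT encoder + attention-based
# fine-grained feature layer + classifier head, under assumed hyperparameters.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class FineGrainedAttention(nn.Module):
    """Scores each token and pools the sequence into one weighted vector."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.scorer = nn.Linear(hidden_size, 1)

    def forward(self, hidden_states, attention_mask):
        # hidden_states: (batch, seq_len, hidden); attention_mask: (batch, seq_len)
        scores = self.scorer(hidden_states).squeeze(-1)         # (batch, seq_len)
        scores = scores.masked_fill(attention_mask == 0, -1e9)  # ignore padding
        weights = torch.softmax(scores, dim=-1).unsqueeze(-1)   # (batch, seq_len, 1)
        return (weights * hidden_states).sum(dim=1)             # (batch, hidden)


class LegalQAModel(nn.Module):
    """BERT semantic features, fine-grained attention pooling, classifier."""

    def __init__(self, num_labels: int, pretrained: str = "bert-base-chinese"):
        super().__init__()
        self.bert = BertModel.from_pretrained(pretrained)
        self.fine_grained = FineGrainedAttention(self.bert.config.hidden_size)
        self.classifier = nn.Linear(self.bert.config.hidden_size, num_labels)

    def forward(self, input_ids, attention_mask):
        outputs = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        pooled = self.fine_grained(outputs.last_hidden_state, attention_mask)
        return self.classifier(pooled)


if __name__ == "__main__":
    tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
    model = LegalQAModel(num_labels=10)  # placeholder number of answer classes
    batch = tokenizer(["数据出境需要满足哪些条件?"], return_tensors="pt",
                      padding=True, truncation=True, max_length=128)
    logits = model(batch["input_ids"], batch["attention_mask"])
    print(logits.shape)  # torch.Size([1, 10])
```

In this sketch the fine-grained layer is a single learned token-scoring head; the paper's layer may use a richer attention formulation, and the classification head stands in for whatever answer-selection objective was actually trained.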

bidirectional encoder representations from transformers (BERT) model; fine-grained feature extraction; attention mechanism; natural language processing (NLP)

Song Wenhao, Wang Yang, Zhu Sulei, Zhang Qian, Wu Xiaoyan


College of Information, Mechanical and Electrical Engineering, Shanghai Normal University, Shanghai 201418, China

School of Electronic Information and Electrical Engineering, Shanghai Jiao Tong University, Shanghai 200240, China


Shanghai Scientific Instrument Field Project (22142201900); Major Project of the Ministry of Education (20JZD020); National Natural Science Foundation of China (62301320)

2024

Journal of Shanghai Normal University (Natural Sciences)
Shanghai Normal University


Impact factor: 0.255
ISSN: 1000-5137
Year, volume (issue): 2024, 53(2)