Research Reports on Computational Intelligence from Kyushu Institute of Technology Provide New Insights (Layer Configurations of BERT for Multitask Learning and Data Augmentation)

Investigators discuss new findings in computational intelligence. According to news reporting from Fukuoka, Japan, by NewsRx journalists, the research stated, “Multitask learning (MTL) and data augmentation are becoming increasingly popular in natural language processing (NLP). These techniques are particularly useful when data are scarce.” The news editors obtained a quote from the research from Kyushu Institute of Technology: “In MTL, knowledge learned from one task is applied to another. To address data scarcity, data augmentation provides additional synthetic data during model training. In NLP, the bidirectional encoder representations from transformers (BERT) model is the default candidate for various tasks. MTL and data augmentation using BERT have yielded promising results. However, a detailed study of the effect of applying MTL in different layers of BERT, and of the benefit of data augmentation under these configurations, has not been conducted. In this study, we investigate the use of MTL and data augmentation from generative models, specifically for category classification, sentiment classification, and aspect-opinion sequence labeling using BERT. The layers of BERT are categorized into top, middle, and bottom layers, which are frozen, shared, or unshared.”
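The layer grouping described in the quote can be illustrated in code. Below is a minimal sketch, not the authors' implementation, of one way to realize it with PyTorch and Hugging Face Transformers: bottom BERT layers frozen, middle layers shared across tasks, and top layers plus classification heads kept task-specific. The layer boundaries (4/4/4), the task names, and the label counts are illustrative assumptions, and only two of the paper's three tasks are shown.

```python
# Hedged sketch of a frozen/shared/unshared BERT layer configuration for
# multitask learning. Layer split points and label counts are assumptions.
import copy

import torch
import torch.nn as nn
from transformers import BertModel


class MultiTaskBert(nn.Module):
    def __init__(self, num_category_labels=10, num_sentiment_labels=3):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        layers = self.bert.encoder.layer  # 12 transformer blocks

        # Bottom layers (0-3): frozen, receive no gradient updates.
        for layer in layers[:4]:
            for p in layer.parameters():
                p.requires_grad = False

        # Top layers (8-11): duplicated so each task trains its own copy
        # ("unshared"); middle layers (4-7) remain shared across tasks.
        self.top_category = copy.deepcopy(layers[8:])
        self.top_sentiment = copy.deepcopy(layers[8:])
        self.bert.encoder.layer = layers[:8]  # keep only bottom + middle

        hidden_size = self.bert.config.hidden_size
        self.category_head = nn.Linear(hidden_size, num_category_labels)
        self.sentiment_head = nn.Linear(hidden_size, num_sentiment_labels)

    def forward(self, input_ids, attention_mask, task):
        ext_mask = self.bert.get_extended_attention_mask(
            attention_mask, input_ids.shape)
        hidden = self.bert.embeddings(input_ids=input_ids)

        # Frozen bottom and shared middle layers run once per batch.
        for layer in self.bert.encoder.layer:
            hidden = layer(hidden, attention_mask=ext_mask)[0]

        # Route through the task-specific top layers and head.
        if task == "category":
            top, head = self.top_category, self.category_head
        else:
            top, head = self.top_sentiment, self.sentiment_head
        for layer in top:
            hidden = layer(hidden, attention_mask=ext_mask)[0]
        return head(hidden[:, 0])  # logits from the [CLS] position


# Example usage: alternate batches between tasks during training.
# model = MultiTaskBert()
# logits = model(input_ids, attention_mask, task="sentiment")
```

In this sketch, switching which layer groups are frozen, shared, or duplicated reproduces the kind of configuration space the study compares; synthetic examples from a generative model would simply be appended to each task's training set before such training.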

Kyushu Institute of Technology, Fukuoka, Japan, Asia, Computational Intelligence, Machine Learning

2024

Robotics & Machine Learning Daily News

Robotics & Machine Learning Daily News

ISSN:
Year, Volume (Issue): 2024 (Feb. 7)