Researchers from Shandong Women’s University Discuss Findings in Computational Intelligence (Feamix: Feature Mix With Memory Batch Based On Self-consistency Learning for Code Generation and Code Translation)
By a News Reporter-Staff News Editor at Robotics & Machine Learning Daily News – A new study on Machine Learning - Computational Intelligence is now available. According to news originating from Jinan, People’s Republic of China, by NewsRx correspondents, the research stated, “Data augmentation algorithms, such as back translation, have been shown to be effective in various deep learning tasks. Despite their remarkable success, there has been a hurdle to applying data augmentation algorithms to code-related tasks, since code consists of discrete tokens with uniqueness and certainty.”

Financial support for this research came from the National Natural Science Foundation of China (NSFC).

Our news journalists obtained a quote from the research from Shandong Women’s University: “In this work, we propose FeaMix, a novel yet simple data augmentation approach designed for feature mix with a memory batch based on self-consistency learning. FeaMix has a couple of unique features. First, it specially selects the samples to be mixed via a memory batch to guarantee that the generated features are in the same spatial distribution as the mixed features. Second, it extends the self-consistency learning technique to optimize the language model for code-related tasks. With extensive experiments, we empirically validate that our method outperforms several baseline models and traditional data augmentation methods on code generation and code translation.”
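The article gives only this high-level description of the method, so the following is a minimal, hypothetical PyTorch sketch of what a “feature mix with memory batch” combined with a self-consistency objective could look like. All names and design choices here (a MemoryBatch buffer, nearest-neighbour partner selection, a KL-based consistency term) are assumptions made for illustration, not the authors’ published implementation.

```python
# Hypothetical sketch of feature mixing with a memory batch plus a
# self-consistency term; names and design choices are illustrative only.
import torch
import torch.nn.functional as F


class MemoryBatch:
    """FIFO buffer of recently seen (detached) encoder features."""

    def __init__(self, capacity: int = 256):
        self.capacity = capacity
        self.features: list[torch.Tensor] = []

    def update(self, batch_features: torch.Tensor) -> None:
        # batch_features: (batch, dim); store detached copies only.
        self.features.extend(batch_features.detach())
        self.features = self.features[-self.capacity:]

    def nearest(self, query: torch.Tensor) -> torch.Tensor:
        # For each query vector, pick the stored feature with the highest
        # cosine similarity, so the mix partner stays in a similar region of
        # feature space (one plausible reading of "same spatial distribution";
        # the paper's actual selection rule may differ).
        if not self.features:
            return query.detach().clone()
        bank = torch.stack(self.features)                      # (mem, dim)
        sims = F.normalize(query, dim=-1) @ F.normalize(bank, dim=-1).T
        return bank[sims.argmax(dim=-1)]                       # (batch, dim)


def mix_features(h: torch.Tensor, memory: MemoryBatch, alpha: float = 0.2) -> torch.Tensor:
    """Mixup-style interpolation between current features and memory partners."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    partner = memory.nearest(h)
    return lam * h + (1.0 - lam) * partner


def consistency_loss(logits_orig: torch.Tensor, logits_mixed: torch.Tensor) -> torch.Tensor:
    """Self-consistency term: predictions from mixed features should agree
    with predictions from the original (unmixed) features."""
    log_p_mixed = F.log_softmax(logits_mixed, dim=-1)
    p_orig = F.softmax(logits_orig.detach(), dim=-1)
    return F.kl_div(log_p_mixed, p_orig, reduction="batchmean")
```

In a training loop of this kind, one would encode a batch, mix its features with partners drawn from the memory batch, compute the task loss on both views plus the consistency term, and then push the current batch’s features into the memory; the actual FeaMix training procedure may differ from this sketch.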
Keywords: Jinan, People’s Republic of China, Asia, Computational Intelligence, Machine Learning, Shandong Women’s University