Investigators from Shandong University of Science and Technology Zero in on Computational Intelligence (Arc: a Layer Replacement Compression Method Based On Fine-grained Self-attention Distillation for Compressing Pre-trained Language Models)
Investigators discuss new findings in Machine Learning - Computational Intelligence. According to news originating from Qingdao, People's Republic of China, by NewsRx correspondents, the research stated, "The primary objective of model compression is to maintain the performance of the original model while reducing its size as much as possible. Knowledge distillation has become the mainstream method in the field of model compression due to its excellent performance."

Financial supporters for this research include the Shandong Province Key R&D Program (Soft Science Project) and the Shandong Nature Science Foundation of China.
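The brief only quotes the paper's general framing of knowledge distillation and does not describe Arc's actual loss or layer-replacement scheme. As a rough illustration of what fine-grained self-attention distillation typically involves, here is a minimal sketch in the style of TinyBERT/MiniLM attention matching: a smaller student is trained so that selected layers' self-attention distributions match those of mapped teacher layers. The layer mapping, the KL-based loss, and all function names below are illustrative assumptions, not the Arc method itself.

```python
# Minimal sketch of self-attention knowledge distillation (assumed setup,
# not the Arc paper's implementation).
import torch
import torch.nn.functional as F

def self_attention_distillation_loss(student_attns, teacher_attns, layer_map):
    """Average KL divergence between mapped teacher/student attention maps.

    student_attns, teacher_attns: lists of tensors shaped
        (batch, heads, seq_len, seq_len), each row a softmax distribution.
    layer_map: dict {student_layer_idx: teacher_layer_idx}; an assumed
        mapping from student layers to the teacher layers they imitate.
    """
    losses = []
    for s_idx, t_idx in layer_map.items():
        # F.kl_div expects log-probabilities as input and probabilities as target.
        log_s = student_attns[s_idx].clamp_min(1e-12).log()
        losses.append(F.kl_div(log_s, teacher_attns[t_idx], reduction="batchmean"))
    return torch.stack(losses).mean()

# Toy usage: a 2-layer student distilled from a 4-layer teacher.
batch, heads, seq = 2, 4, 8
teacher = [F.softmax(torch.randn(batch, heads, seq, seq), dim=-1) for _ in range(4)]
student = [F.softmax(torch.randn(batch, heads, seq, seq), dim=-1) for _ in range(2)]
loss = self_attention_distillation_loss(student, teacher, {0: 1, 1: 3})
print(loss.item())
```

In practice this attention-matching term would be combined with other distillation objectives (e.g., logit or hidden-state losses) during student training; the weighting and layer selection are design choices specific to each method.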
Keywords: Qingdao, People's Republic of China, Asia, Computational Intelligence, Machine Learning, Shandong University of Science and Technology.