SCIENCE CHINA Information Sciences, 2024, Vol. 67, Issue 1: 35-46. DOI: 10.1007/s11432-023-3823-6

Learnware: small models do big

Zhi-Hua ZHOU, Zhi-Hao TAN
Author information

  • 1. National Key Laboratory for Novel Software Technology, Nanjing University, Nanjing 210023, China

Abstract

There are complaints about current machine learning techniques, such as the requirement for huge amounts of training data and proficient training skills, the difficulty of continual learning, the risk of catastrophic forgetting, and the leakage of data privacy or proprietary information. Most research efforts address one of these issues separately, paying less attention to the fact that most of them are entangled in practice. The prevailing big-model paradigm, which has achieved impressive results in natural language processing and computer vision applications, has not yet resolved these issues, while becoming a serious source of carbon emissions. This article offers an overview of the learnware paradigm, which attempts to free users from building machine learning models from scratch, in the hope of reusing small models to do things even beyond their original purposes. The key ingredient is the specification, which enables a trained model to be adequately identified for reuse according to the requirements of future users who know nothing about the model in advance.
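The specification is what lets a future user, who knows nothing about a submitted model, locate one that fits their task. One concrete way this idea can be realized is to attach a small data sketch to each model and match it against the user's own data by a distribution distance. The toy sketch below illustrates that matching step only; the RBF-kernel/MMD choice and all function names are assumptions for illustration, not the paper's exact method:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel matrix between the rows of X and Y.
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def mmd_sq(X, Y, gamma=1.0):
    # Squared maximum mean discrepancy between two samples,
    # i.e. the distance between their kernel mean embeddings.
    return (rbf_kernel(X, X, gamma).mean()
            - 2 * rbf_kernel(X, Y, gamma).mean()
            + rbf_kernel(Y, Y, gamma).mean())

def identify_learnware(user_data, specifications, gamma=1.0):
    # Each "specification" here is a small data sketch attached to a model.
    # Return the index of the model whose sketch is closest (in MMD)
    # to the user's data distribution.
    dists = [mmd_sq(user_data, spec, gamma) for spec in specifications]
    return int(np.argmin(dists))
```

In this picture, the user never inspects the models themselves: the market compares the user's data against each model's sketch and returns the best-matching candidate, which the user may then reuse directly or adapt.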

Key words

artificial intelligence / machine learning / learnware


Funding

National Natural Science Foundation of China(62250069)

Publication year

2024