
Patent Application Titled "Regularizing Targets In Model Distillation Utilizing Past State Knowledge To Improve Teacher-Student Machine Learning Models" Published Online (USPTO 20240062057)

By a News Reporter-Staff News Editor at Robotics & Machine Learning Daily News -- According to news reporting originating from Washington, D.C., by NewsRx journalists, a patent application by the inventors Jandial, Surgan (Jammu, IN); Krishnamurthy, Balaji (Noida, IN); Puri, Nikaash (New Delhi, IN), filed on August 9, 2022, was made available online on February 22, 2024. No assignee has been recorded for this patent application.

Reporters obtained the following quote from the background information supplied by the inventors: "Recent years have seen an increase in hardware and software platforms that compress and implement learning models. In particular, many conventional systems utilize knowledge distillation to compress, miniaturize, and transfer the model parameters of a deeper and wider deep learning model, which require significant computational resources and time, to a more compact, resource-friendly student machine learning model. Indeed, conventional systems often distill information of a high-capacity teacher network (i.e., a teacher machine learning model) to a low-capacity student network (i.e., a student machine learning model) with the intent that the student network will perform similar to the teacher network, but with less computational resources and time. In order to achieve this, many conventional systems train a student machine learning model using a knowledge distillation loss to emulate the behavior of a teacher machine learning model. Although many conventional systems utilize knowledge distillation to train compact student machine learning models, many of these conventional systems have a number of shortcomings, particularly with regards to efficiently and easily distilling knowledge from a teacher machine learning model to a student machine learning model to create a compact, yet accurate student machine learning model."
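The knowledge distillation loss the background describes, training a compact student to emulate a high-capacity teacher, is commonly implemented as a temperature-softened KL divergence between the two models' output distributions (following Hinton et al.'s formulation, not necessarily the method claimed in this patent). A minimal sketch, with illustrative function names:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: a higher temperature softens the distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL divergence between the teacher's softened targets and the student's
    softened predictions, scaled by T^2 so gradients keep a comparable magnitude."""
    p = softmax(teacher_logits, temperature)  # soft targets from the teacher
    q = softmax(student_logits, temperature)  # student's soft predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl
```

When the student's logits match the teacher's, the loss is zero; any divergence yields a positive penalty, which is what drives the student to emulate the teacher's behavior during training.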

Cyborgs | Emerging Technologies | Machine Learning | Patent Application

2024

Robotics & Machine Learning Daily News


ISSN:
Year, Volume (Issue): 2024 (Mar. 8)