'Self-Balancing Mixture Of Experts' in Patent Application Approval Process (USPTO 20240273346)
The following quote was obtained by the news editors from the background information supplied by the inventors: “Machine learning models using mixture-of-expert (MOE) techniques are typically made up of N number of layers which are broadly classified as MOE layers and non-MOE layers. Various distribution strategies are used to distribute large MOE machine learning models into computing system hardware.
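To make the distinction between MOE and non-MOE layers concrete, here is a minimal sketch of a mixture-of-experts layer with top-1 routing, written in Python with NumPy. All names (`MoELayer`, `gate`, `experts`) are illustrative assumptions for this sketch and are not taken from the patent text; the patent's actual architecture and balancing mechanism may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

class MoELayer:
    """Illustrative top-1 mixture-of-experts layer: a gating matrix
    routes each input row (token) to one of several expert weight
    matrices, so only the chosen expert's parameters are used."""

    def __init__(self, d_model, n_experts):
        self.gate = rng.standard_normal((d_model, n_experts))
        self.experts = [rng.standard_normal((d_model, d_model))
                        for _ in range(n_experts)]

    def forward(self, x):
        # Gate scores decide which expert handles each row.
        scores = x @ self.gate                 # (batch, n_experts)
        choice = scores.argmax(axis=1)         # top-1 routing decision
        out = np.empty_like(x)
        for e, w in enumerate(self.experts):
            mask = choice == e
            if mask.any():
                out[mask] = x[mask] @ w        # only the chosen expert runs
        return out, choice

layer = MoELayer(d_model=8, n_experts=4)
x = rng.standard_normal((16, 8))
y, routing = layer.forward(x)
print(y.shape, routing.shape)
```

In a large model, the expert matrices of such layers are the parameters that distribution strategies spread across devices, while non-MOE layers (attention, normalization, and so on) are typically replicated or sharded by other means.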