In federated learning, different users' devices have very different computing, communication, and storage capabilities, which leads to problems such as dropouts and unfairness and poses a major challenge to existing federated learning. To address this problem, federated learning with adaptive encoders (FedAE) is proposed, in which different combinations of encoders are sent to users for local updates according to the performance of their devices, and the corresponding encoders are aggregated parameter-wise on the server side. Through this on-demand allocation, every user's device can fully exploit its own performance, ensuring fairness. FedAE performs classification with cascaded classifiers, which improves the overall accuracy of the model and allows high-performance users to obtain satisfactory accuracy early and exit early, saving computational resources. Experimental comparisons of accuracy and convergence show that FedAE provides a better-suited solution to the device heterogeneity problem.
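The three mechanisms described in the abstract, on-demand encoder allocation by device capability, server-side aggregation of each encoder over the clients that trained it, and early-exit inference through cascaded classifiers, can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the function names, the capability scores, the dummy local updates, and the confidence threshold are all assumptions introduced for the example.

```python
def assign_encoders(capability, num_encoders):
    # On-demand allocation (assumed scheme): a client with capability
    # score in (0, 1] receives the first k of num_encoders encoders.
    k = max(1, round(capability * num_encoders))
    return list(range(k))

def aggregate(global_params, client_updates):
    # Server side: average each encoder's parameters only over the
    # clients that actually hold (and updated) that encoder.
    new_params = dict(global_params)
    for enc_id in global_params:
        updates = [u[enc_id] for u in client_updates if enc_id in u]
        if updates:
            new_params[enc_id] = sum(updates) / len(updates)
    return new_params

def cascade_predict(confidences, threshold=0.9):
    # Cascaded classifiers with early exit: stop at the first stage
    # whose confidence clears the (assumed) threshold.
    for stage, conf in enumerate(confidences):
        if conf >= threshold:
            return stage
    return len(confidences) - 1

# Usage: three encoders shared among clients of differing capability.
global_params = {0: 0.0, 1: 0.0, 2: 0.0}
capabilities = [0.3, 0.7, 1.0]            # weak, medium, strong device
client_updates = []
for cap in capabilities:
    encs = assign_encoders(cap, 3)        # weak client trains fewer encoders
    client_updates.append({e: cap for e in encs})  # dummy local update
global_params = aggregate(global_params, client_updates)
```

Here each parameter is a single float for readability; in practice it would be a tensor per encoder, aggregated the same way. A strong device exercises all encoders, while a weak one still participates with a smaller subset, which is the fairness argument of the abstract.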