Medical dialogue text generation based on dynamic federated distillation
To address the problem of misleading knowledge caused by data heterogeneity in traditional federated distillation, a model for medical dialogue text generation based on dynamic knowledge fusion and client selection for distillation, FedKS, is proposed, providing more detailed modeling of the knowledge accumulation and transmission procedure. First, an effective knowledge aggregation mechanism is designed within federated learning. Second, to address the issue of misleading knowledge, a threshold-based method is proposed to optimize the update selection of the local model on each client: by calculating the performance gain of the global model on each client, it is determined whether to adopt the local model after knowledge distillation. The FedKS model can effectively address the degradation of local model performance caused by misleading knowledge, thereby achieving efficient knowledge aggregation and transmission. Experiments on multiple benchmark datasets show that FedKS accelerates training convergence and improves performance compared with existing baselines. In addition, it supports heterogeneous client models.
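The threshold-based selection described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the use of validation loss as the performance measure, the `threshold` parameter, and the client records are all assumptions introduced for the example.

```python
def should_adopt_distilled(prev_loss, distilled_loss, threshold=0.0):
    """Adopt the distilled local model only if its gain over the
    current local model exceeds the threshold (hypothetical rule:
    gain measured as reduction in validation loss)."""
    gain = prev_loss - distilled_loss  # positive gain = improvement
    return gain > threshold

# Hypothetical per-client check during one federated round:
# client 1's distilled model got worse (misleading knowledge),
# so its local model is kept unchanged.
clients = [
    {"id": 0, "prev_loss": 1.20, "distilled_loss": 1.05},  # improved
    {"id": 1, "prev_loss": 0.90, "distilled_loss": 0.95},  # misled
]
adopted = [c["id"] for c in clients
           if should_adopt_distilled(c["prev_loss"], c["distilled_loss"])]
print(adopted)  # → [0]
```

Clients whose distilled model shows no gain simply skip the update, which is how misleading knowledge is prevented from degrading local performance.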