Heterogeneous cloud-end medical dialogue federation based on bi-directional bootstrapping distillation
A new federated learning method is proposed for medical dialogue scenarios with heterogeneous data and heterogeneous models across clients. The cloud model and the end models transfer knowledge through mutual bootstrapping distillation. The end-to-cloud distillation follows a multi-teacher, single-student paradigm, in which knowledge is distilled from multiple local models into a global model; the cloud-to-end distillation follows a single-teacher, multi-student paradigm, in which knowledge is distilled from the global model back into the multiple local models. On the ReMeDi and MedDG medical dialogue datasets, the proposed method significantly outperforms classical baselines under text generation evaluation metrics, and its training is also faster.
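As a rough illustration of the two distillation directions described above, the following PyTorch sketch computes soft-label (KL-divergence) distillation losses in both directions. The loss form, the averaging of teacher logits in the multi-teacher case, and all function names are illustrative assumptions, not the paper's exact formulation.

```python
# A minimal sketch of bi-directional distillation, assuming soft-label
# (KL-divergence) distillation over model logits; names are hypothetical.
import torch
import torch.nn.functional as F

def distill_loss(student_logits, teacher_logits, temperature=2.0):
    """Soft-label distillation loss between one student and one teacher."""
    t = temperature
    student_log_probs = F.log_softmax(student_logits / t, dim=-1)
    teacher_probs = F.softmax(teacher_logits / t, dim=-1)
    # Scale by t^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(student_log_probs, teacher_probs,
                    reduction="batchmean") * t * t

def end_to_cloud_loss(global_logits, local_logits_list, temperature=2.0):
    """Multi-teacher -> single-student: the global (cloud) model learns from
    averaged soft targets of the local (end) models (averaging is an
    illustrative choice)."""
    teacher_avg = torch.stack(local_logits_list).mean(dim=0)
    return distill_loss(global_logits, teacher_avg.detach(), temperature)

def cloud_to_end_losses(local_logits_list, global_logits, temperature=2.0):
    """Single-teacher -> multi-student: each local model distills from the
    global model; returns one loss per end model."""
    return [distill_loss(l, global_logits.detach(), temperature)
            for l in local_logits_list]

# Usage with dummy logits: batch of 4 examples over a 10-class output.
global_out = torch.randn(4, 10, requires_grad=True)
local_outs = [torch.randn(4, 10, requires_grad=True) for _ in range(3)]
print(end_to_cloud_loss(global_out, local_outs))
print([loss.item() for loss in cloud_to_end_losses(local_outs, global_out)])
```

Detaching the teacher logits in each direction keeps gradients flowing only into the current student, so each phase updates one side of the federation at a time.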