Multi-view learning aims to fully exploit view consistency and view discrepancy for performance improvement. Knowledge Distillation (KD), characterized by the so-called “Teacher–Student” (T-S) learning framework, can transfer information learned from one model to another. Inspired by knowledge distillation, we propose a Multi-view Teacher–Student Network (MTS-Net), which combines knowledge distillation and multi-view learning into a unified framework. We first redefine the teacher and student for the multi-view case. MTS-Net is then built by optimizing both the view classification loss and the knowledge distillation loss in an end-to-end training manner. We further extend MTS-Net to image recognition tasks and present a multi-view Teacher–Student framework with convolutional neural networks, called MTSCNN. To the best of our knowledge, MTS-Net and MTSCNN offer a new insight into extending the Teacher–Student framework to tackle the multi-view learning problem. We theoretically verify the mechanisms of MTS-Net and MTSCNN, and comprehensive experiments demonstrate the effectiveness of the proposed methods.
© 2021 Elsevier Ltd
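As a rough illustration of the kind of objective the abstract describes (a classification loss combined with a knowledge distillation loss, trained end to end), the sketch below shows a generic KD-style training loss in PyTorch. This is not the authors' code: the temperature T, the weighting alpha, and the specific form of the distillation term (temperature-softened KL divergence, as in standard knowledge distillation) are assumptions for illustration only.

```python
# Minimal sketch of a combined classification + distillation objective.
# Hyperparameters T and alpha are illustrative, not values from the paper.
import torch
import torch.nn.functional as F

def kd_training_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Classification loss on the student's own predictions.
    ce = F.cross_entropy(student_logits, labels)
    # Distillation loss: KL divergence between temperature-softened
    # teacher and student distributions, scaled by T^2.
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # End-to-end objective: weighted sum of the two terms.
    return alpha * ce + (1.0 - alpha) * kd
```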
Keywords: Information fusion; Knowledge distillation; Multi-view learning; Teacher–Student Network
Tian Y., Sun S., Tang J.
School of Economics and Management, University of Chinese Academy of Sciences
School of Mathematical Sciences, University of Chinese Academy of Sciences
School of Business Administration, Faculty of Business Administration, Southwestern University of