Abstract
Knowledge distillation (KD), which transfers knowledge from a large teacher
model to a lightweight student model, has received great attention in deep
model compression. In addition to the supervision of the ground truth, the
vanilla KD method regards the teacher's predictions as soft labels to
supervise the training of the student model. Building on vanilla KD,
various approaches have been developed to further improve the performance
of the student model. However, few of these methods consider the
reliability of the supervision provided by the teacher model.
Supervision from erroneous predictions may mislead the training of the
student model. This article therefore tackles this problem from two
aspects: label revision, which rectifies incorrect supervision, and data
selection, which chooses appropriate samples for distillation to reduce
the impact of erroneous supervision. In the former, we propose to
rectify the teacher’s inaccurate predictions using the ground truth. In
the latter, we introduce a data selection technique to choose suitable
training samples to be supervised by the teacher, thereby reducing the
impact of incorrect predictions to some extent. Experimental results
demonstrate the effectiveness of the proposed method, which can be
further combined with other distillation approaches to enhance their
performance.
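
The abstract gives only a high-level description of the method. As a minimal sketch, assuming a standard PyTorch distillation setup, the snippet below illustrates one way the two ideas could be wired into a KD loss: erroneous teacher distributions are blended toward the one-hot ground truth (label revision), and the distillation term is applied only to samples the teacher handles reliably (data selection). The function name, the mixing weight `beta`, the threshold `tau`, and the exact selection criterion are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F


def kd_loss_lr_ds(student_logits, teacher_logits, labels,
                  T=4.0, alpha=0.5, beta=0.9, tau=0.3):
    """Sketch of a KD loss with label revision and data selection.

    `beta` (revision mixing weight) and `tau` (selection threshold)
    are assumed hyperparameters, not values from the paper.
    """
    num_classes = teacher_logits.size(1)
    t_prob = F.softmax(teacher_logits / T, dim=1)
    one_hot = F.one_hot(labels, num_classes).float()

    # Label revision: where the teacher's top-1 prediction disagrees
    # with the ground truth, pull its distribution toward the one-hot
    # target instead of distilling the erroneous soft label as-is.
    wrong = t_prob.argmax(dim=1).ne(labels).unsqueeze(1)
    revised = torch.where(wrong,
                          beta * one_hot + (1.0 - beta) * t_prob,
                          t_prob)

    # Data selection: distill only on samples where the teacher puts
    # at least `tau` probability on the true class (one plausible
    # criterion; the paper's rule may differ).
    keep = t_prob.gather(1, labels.unsqueeze(1)).squeeze(1).ge(tau).float()

    # Distillation term against the revised soft labels, masked by `keep`.
    s_log = F.log_softmax(student_logits / T, dim=1)
    kl_per_sample = F.kl_div(s_log, revised, reduction="none").sum(dim=1)
    kd = (kl_per_sample * keep).sum() / keep.sum().clamp(min=1.0) * (T * T)

    # Hard-label cross-entropy on every sample, as in vanilla KD.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```

In use, the loss would be computed per batch as `kd_loss_lr_ds(student(x), teacher(x).detach(), y)`, so gradients flow only through the student while the (possibly revised) teacher distribution acts as a fixed target.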
| Field | Value |
|---|---|
| Original language | English |
| Pages (from-to) | 1377-1388 |
| Number of pages | 12 |
| Journal | IEEE Transactions on Cognitive and Developmental Systems |
| Volume | 17 |
| Issue number | 6 |
| Early online date | 11 Apr 2025 |
| DOIs | |
| Publication status | Published - Dec 2025 |
UN SDGs
This output contributes to the following UN Sustainable Development Goals (SDGs):
- SDG 9: Industry, Innovation, and Infrastructure
User-Defined Keywords
- Image Classification
- Knowledge Distillation
- Lightweight Model