Improve Knowledge Distillation via Label Revision and Data Selection

  • Weichao Lan
  • Yiu Ming Cheung*
  • Qing Xu
  • Buhua Liu
  • Zhikai Hu
  • Mengke Li
  • Zhenghua Chen

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

4 Citations (Scopus)

Abstract

Knowledge distillation (KD), which transfers knowledge from a large teacher model to a lightweight student model, has received great attention in deep model compression. In addition to the supervision of the ground truth, the vanilla KD method regards the predictions of the teacher as soft labels to supervise the training of the student model. Building on vanilla KD, various approaches have been developed to further improve the performance of the student model. However, few of these previous methods have considered the reliability of the supervision from teacher models: supervision from erroneous predictions may mislead the training of the student model. This article therefore tackles the problem from two aspects: label revision, to rectify incorrect supervision, and data selection, to choose appropriate samples for distillation and thereby reduce the impact of erroneous supervision. For the former, we propose to rectify the teacher’s inaccurate predictions using the ground truth. For the latter, we introduce a data selection technique that chooses suitable training samples to be supervised by the teacher, reducing the impact of incorrect predictions to some extent. Experimental results demonstrate the effectiveness of the proposed method, which can be further combined with other distillation approaches to enhance their performance.
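
The abstract describes two mechanisms layered on top of vanilla KD: revising the teacher's erroneous soft labels with the ground truth, and selecting which samples receive teacher supervision. Below is a minimal PyTorch sketch of how these two ideas could be combined in a KD loss. It is an illustrative reading of the abstract only, not the paper's actual method: the function name kd_loss_with_revision, the 0.5 blending weight, and the confidence threshold are all assumptions.

```python
# Hypothetical sketch of KD with label revision and data selection,
# based only on the abstract; the paper's exact rules may differ.
import torch
import torch.nn.functional as F

def kd_loss_with_revision(student_logits, teacher_logits, targets,
                          T=4.0, alpha=0.5, conf_threshold=0.3):
    """Vanilla KD loss augmented with label revision and data selection."""
    teacher_probs = F.softmax(teacher_logits / T, dim=1)
    one_hot = F.one_hot(targets, teacher_probs.size(1)).float()

    # Label revision (assumption): where the teacher's top-1 prediction
    # disagrees with the ground truth, blend its soft label with the
    # one-hot ground-truth label to rectify the supervision.
    wrong = teacher_probs.argmax(dim=1) != targets
    revised = teacher_probs.clone()
    revised[wrong] = 0.5 * teacher_probs[wrong] + 0.5 * one_hot[wrong]

    # Data selection (assumption): distill only on samples where the
    # teacher is confident enough, limiting the impact of bad supervision.
    selected = teacher_probs.max(dim=1).values > conf_threshold

    # Soft-label KL term on the selected samples (standard T^2 scaling).
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    if selected.any():
        kd = F.kl_div(log_p_student[selected], revised[selected],
                      reduction="batchmean") * (T * T)
    else:
        kd = student_logits.new_zeros(())

    # Hard-label cross-entropy on all samples, as in vanilla KD.
    ce = F.cross_entropy(student_logits, targets)
    return alpha * kd + (1.0 - alpha) * ce
```

In a training loop, this loss would simply replace the standard KD objective, so the scheme can also be stacked on other distillation variants, as the abstract notes.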
Original language: English
Pages (from-to): 1377-1388
Number of pages: 12
Journal: IEEE Transactions on Cognitive and Developmental Systems
Volume: 17
Issue number: 6
Early online date: 11 Apr 2025
DOIs
Publication status: Published - Dec 2025

UN SDGs

This output contributes to the following UN Sustainable Development Goals (SDGs)

  1. SDG 9 - Industry, Innovation, and Infrastructure

User-Defined Keywords

  • Image Classification
  • Knowledge Distillation
  • Lightweight Model
