Abstract
Data-free knowledge distillation (DFKD) transfers knowledge from a teacher to a student without access to the real in-distribution (ID) data. A common solution is to use a generator to synthesize fake data and use them as a substitute for real ID data. However, existing works typically assume teachers are trustworthy, leaving the robustness and security of DFKD with untrusted teachers largely unexplored. In this work, we conduct the first investigation into distilling non-transferable learning (NTL) teachers using DFKD, where transferability from an ID domain to an out-of-distribution (OOD) domain is prohibited. We find that NTL teachers fool DFKD by diverting the generator's attention from the useful ID knowledge to the misleading OOD knowledge. This hinders ID knowledge transfer but prioritizes OOD knowledge transfer. To mitigate this issue, we propose Adversarial Trap Escaping (ATEsc), which benefits DFKD by identifying and filtering out OOD-like synthetic samples. Specifically, inspired by the evidence that NTL teachers show stronger adversarial robustness on OOD samples than on ID samples, we split synthetic samples into two groups according to their robustness. The fragile group is treated as ID-like data and used for normal knowledge distillation, while the robust group is seen as OOD-like data and utilized for forgetting OOD knowledge. Extensive experiments demonstrate the effectiveness of ATEsc for improving DFKD against NTL teachers. Code is released at https://github.com/tmllab/2025_ICML_ATEsc.
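The abstract's filtering idea — probe each synthetic sample's adversarial robustness under the teacher, then route fragile samples to distillation and robust samples to forgetting — can be sketched as a toy. This is an illustrative assumption-laden sketch, not the paper's implementation: the linear `teacher`, the finite-difference FGSM-style step, and the `eps` value are all hypothetical stand-ins.

```python
import math

def softmax_xent(logits, y):
    """Numerically stable cross-entropy of a logit vector against class y."""
    m = max(logits)
    return math.log(sum(math.exp(z - m) for z in logits)) + m - logits[y]

def num_grad(f, x, h=1e-4):
    """Central finite-difference gradient of scalar f at point x."""
    g = []
    for j in range(len(x)):
        xp, xm = list(x), list(x)
        xp[j] += h
        xm[j] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

def sign(v):
    return (v > 0) - (v < 0)

def fgsm_is_fragile(teacher_logits, x, eps):
    """True if an FGSM-style step of size eps flips the teacher's prediction.

    Fragile samples are treated as ID-like (kept for distillation);
    robust samples are treated as OOD-like (used for forgetting).
    """
    logits = teacher_logits(x)
    y = max(range(len(logits)), key=lambda k: logits[k])  # teacher's own label
    g = num_grad(lambda z: softmax_xent(teacher_logits(z), y), x)
    x_adv = [xi + eps * sign(gi) for xi, gi in zip(x, g)]  # one FGSM step
    adv = teacher_logits(x_adv)
    return max(range(len(adv)), key=lambda k: adv[k]) != y

# Hypothetical linear "teacher": predicts class 0 iff x[0] > 0.
teacher = lambda x: [x[0], -x[0]]

# A near-boundary (fragile) sample and a far-from-boundary (robust) sample.
batch = [[0.1, 0.0], [5.0, 0.0]]
fragile = [fgsm_is_fragile(teacher, x, eps=0.5) for x in batch]
id_like = [x for x, f in zip(batch, fragile) if f]        # -> distillation
ood_like = [x for x, f in zip(batch, fragile) if not f]   # -> forgetting
```

With these toy values, only the near-boundary sample flips under the perturbation, so it is kept as ID-like, while the robust sample is routed to the forgetting branch.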
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the 42nd International Conference on Machine Learning, ICML 2025 |
| Publisher | ML Research Press |
| Pages | 23603-23639 |
| Number of pages | 37 |
| Publication status | Published - Jul 2025 |
| Event | 42nd International Conference on Machine Learning, ICML 2025 - Vancouver Convention Center, Vancouver, Canada |
| Duration | 13 Jul 2025 → 19 Jul 2025 |
| Links | https://icml.cc/Conferences/2025 (Conference Website); https://icml.cc/virtual/2025/calendar (Conference Calendar); https://proceedings.mlr.press/v267/ (Conference Proceedings) |
Publication series
| Name | Proceedings of Machine Learning Research |
|---|---|
| Publisher | ML Research Press |
| Volume | 267 |
Conference
| Conference | 42nd International Conference on Machine Learning, ICML 2025 |
|---|---|
| Country/Territory | Canada |
| City | Vancouver |
| Period | 13/07/25 → 19/07/25 |
| Internet address | https://icml.cc/Conferences/2025 |
UN SDGs
This output contributes to the following UN Sustainable Development Goals (SDGs)
- SDG 16: Peace, Justice and Strong Institutions
Fingerprint

Dive into the research topics of 'When Data-Free Knowledge Distillation Meets Non-Transferable Teacher: Escaping Out-of-Distribution Trap is All You Need'.