Improving Knowledge Distillation via Transferring Learning Ability
2023-04-24
Long Liu, Tong Li, Hui Cheng
- github.com/brilliantcheng/slkd (official implementation, PyTorch)
Abstract
Existing knowledge distillation methods generally follow a teacher-student paradigm in which the student network learns solely from a well-trained teacher. However, this paradigm overlooks the inherent difference in learning ability between the teacher and the student networks, giving rise to the capacity-gap problem. To address this limitation, we propose a novel method called SLKD.
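
For context, the conventional teacher-student setup the abstract refers to trains the student against the teacher's softened outputs. Below is a minimal PyTorch sketch of that baseline loss (Hinton-style distillation); the names `kd_loss`, `T`, and `alpha` are illustrative, and this shows the standard approach SLKD builds on, not the SLKD method itself.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Conventional distillation loss: a weighted sum of the hard-label
    cross-entropy and the temperature-softened KL divergence between the
    student and a frozen, well-trained teacher."""
    # Soft targets: match the teacher's tempered output distribution.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradient magnitude stays comparable across T
    # Hard targets: ordinary supervised loss on the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

Because the teacher is fixed and only its outputs are imitated, this objective transfers the teacher's knowledge but not its learning ability, which is the gap the paper targets.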