- PaCKD: Pattern-Clustered Knowledge Distillation for Compressing Memory Access Prediction Models (Feb 21, 2024). Tasks: Image Classification
- ERNIE-Tiny: A Progressive Distillation Framework for Pretrained Transformer Compression (Jun 4, 2021). Tasks: Knowledge Distillation
- Towards Understanding and Improving Knowledge Distillation for Neural Machine Translation (May 14, 2023). Tasks: Knowledge Distillation, Machine Translation
- ERNIE 3.0 Tiny: Frustratingly Simple Method to Improve Task-Agnostic Distillation Generalization (Jan 9, 2023). Tasks: Knowledge Distillation, Language Modeling
- Data-Free Adversarial Distillation (Dec 23, 2019). Tasks: Knowledge Distillation, Model Compression
- ACT-Net: Asymmetric Co-Teacher Network for Semi-supervised Memory-efficient Medical Image Segmentation (Jul 5, 2022). Tasks: Image Segmentation, Knowledge Distillation
- Teach Harder, Learn Poorer: Rethinking Hard Sample Distillation for GNN-to-MLP Knowledge Distillation (Jul 20, 2024). Tasks: Knowledge Distillation
- Ensemble Modeling with Contrastive Knowledge Distillation for Sequential Recommendation (Apr 28, 2023). Tasks: Attribute, Contrastive Learning
- Data exploitation: multi-task learning of object detection and semantic segmentation on partially annotated data (Nov 7, 2023). Tasks: Knowledge Distillation, Multi-Task Learning
- Parallel Blockwise Knowledge Distillation for Deep Neural Network Compression (Dec 5, 2020). Tasks: Knowledge Distillation, Neural Network Compression
- Align-to-Distill: Trainable Attention Alignment for Knowledge Distillation in Neural Machine Translation (Mar 3, 2024). Tasks: Knowledge Distillation, Machine Translation
- DASK: Distribution Rehearsing via Adaptive Style Kernel Learning for Exemplar-Free Lifelong Person Re-Identification (Dec 12, 2024). Tasks: Exemplar-Free, Knowledge Distillation
- DAD++: Improved Data-free Test Time Adversarial Defense (Sep 10, 2023). Tasks: Adversarial Defense, Adversarial Robustness
- Ensemble Learning via Knowledge Transfer for CTR Prediction (Nov 25, 2024). Tasks: Click-Through Rate Prediction, Ensemble Learning
- Aligning (Medical) LLMs for (Counterfactual) Fairness (Aug 22, 2024). Tasks: Counterfactual, Fairness
- A Tailored Pre-Training Model for Task-Oriented Dialog Generation (Apr 24, 2020). Tasks: Knowledge Distillation, Language Modeling
- Ensemble Knowledge Distillation for Learning Improved and Efficient Networks (Sep 17, 2019). Tasks: Ensemble Learning, General Classification
- Ensemble diverse hypotheses and knowledge distillation for unsupervised cross-subject adaptation (Apr 15, 2022). Tasks: Activity Recognition, Domain Adaptation
- Patient Knowledge Distillation for BERT Model Compression (Aug 25, 2019). Tasks: Knowledge Distillation, model
- Ensemble Distillation for Robust Model Fusion in Federated Learning (Jun 12, 2020). Tasks: BIG-bench Machine Learning, Federated Learning
- Enhancing Weakly-Supervised Histopathology Image Segmentation with Knowledge Distillation on MIL-Based Pseudo-Labels (Jul 14, 2024). Tasks: Image Segmentation, Knowledge Distillation
- Enhancing TinyBERT for Financial Sentiment Analysis Using GPT-Augmented FinBERT Distillation (Sep 19, 2024). Tasks: Data Augmentation, Edge-computing
- DAdEE: Unsupervised Domain Adaptation in Early Exit PLMs (Oct 6, 2024). Tasks: Domain Adaptation, Knowledge Distillation
- Self-supervised Knowledge Distillation Using Singular Value Decomposition (Jul 18, 2018). Tasks: Knowledge Distillation, Transfer Learning
- Enhancing Scene Classification in Cloudy Image Scenarios: A Collaborative Transfer Method with Information Regulation Mechanism using Optical Cloud-Covered and SAR Remote Sensing Images (Jan 8, 2025). Tasks: Cloud Removal, Knowledge Distillation
- Enhancing New-item Fairness in Dynamic Recommender Systems (Apr 30, 2025). Tasks: Fairness, Knowledge Distillation
- D^2TV: Dual Knowledge Distillation and Target-oriented Vision Modeling for Many-to-Many Multimodal Summarization (May 22, 2023). Tasks: Knowledge Distillation
- cViL: Cross-Lingual Training of Vision-Language Models using Knowledge Distillation (Jun 7, 2022). Tasks: Knowledge Distillation, Question Answering
- Teaching MLPs to Master Heterogeneous Graph-Structured Knowledge for Efficient and Accurate Inference (Nov 21, 2024). Tasks: Graph Learning, Knowledge Distillation
- Uniformity First: Uniformity-aware Test-time Adaptation of Vision-language Models against Image Corruption (May 19, 2025). Tasks: Knowledge Distillation, Test-time Adaptation
- Meta-Learned Modality-Weighted Knowledge Distillation for Robust Multi-Modal Learning with Missing Data (May 12, 2024). Tasks: Brain Tumor Segmentation, Classification
- Customizing Synthetic Data for Data-Free Student Learning (Jul 10, 2023). Tasks: Data-free Knowledge Distillation, Knowledge Distillation
- Enhancing Low-Resource NMT with a Multilingual Encoder and Knowledge Distillation: A Case Study (Jul 9, 2024). Tasks: Knowledge Distillation, Language Modeling
- CXR Segmentation by AdaIN-based Domain Adaptation and Knowledge Distillation (Apr 13, 2021). Tasks: Domain Adaptation, Knowledge Distillation
- Boosting Cross-Domain Point Classification via Distilling Relational Priors from 2D Transformers (Jul 26, 2024). Tasks: Domain Adaptation, Domain Generalization
- Unifying Heterogeneous Classifiers with Distillation (Apr 12, 2019). Tasks: Knowledge Distillation
- Blind Knowledge Distillation for Robust Image Classification (Nov 21, 2022). Tasks: Classification, Image Classification
- Enhancing Knowledge Distillation of Large Language Models through Efficient Multi-Modal Distribution Alignment (Sep 19, 2024). Tasks: Knowledge Distillation, Model Compression
- CSE: Surface Anomaly Detection with Contrastively Selected Embedding (Mar 4, 2024). Tasks: Anomaly Detection, Knowledge Distillation
- Periodic Intra-Ensemble Knowledge Distillation for Reinforcement Learning (Feb 1, 2020). Tasks: Knowledge Distillation, MuJoCo
- Cross-View Consistency Regularisation for Knowledge Distillation (Dec 21, 2024). Tasks: Knowledge Distillation
- A Systematic Study of Knowledge Distillation for Natural Language Generation with Pseudo-Target Training (May 3, 2023). Tasks: Knowledge Distillation, Text Generation
- Cross-modal Knowledge Distillation for Vision-to-Sensor Action Recognition (Oct 8, 2021). Tasks: Action Recognition, Activity Recognition
- Cross Modality Knowledge Distillation for Multi-Modal Aerial View Object Classification (Jun 19, 2021). Tasks: Image Classification, Knowledge Distillation
- Unifying Synergies between Self-supervised Learning and Dynamic Computation (Jan 22, 2023). Tasks: Image Classification
- SELF-VS: Self-supervised Encoding Learning For Video Summarization (Mar 28, 2023). Tasks: Knowledge Distillation, Representation Learning
- TQCompressor: improving tensor decomposition methods in neural networks via permutations (Jan 29, 2024). Tasks: Knowledge Distillation, Model Compression
- Technical Report for the 5th CLVision Challenge at CVPR: Addressing the Class-Incremental with Repetition using Unlabeled Data -- 4th Place Solution (Mar 19, 2025). Tasks: Class-Incremental Learning
- Enhancing Knowledge Distillation for LLMs with Response-Priming Prompting (Dec 18, 2024). Tasks: GSM8K, Knowledge Distillation
- Enhancing Adversarial Robustness in Low-Label Regime via Adaptively Weighted Regularization and Knowledge Distillation (Aug 8, 2023). Tasks: Adversarial Robustness, Knowledge Distillation