Distilled Neural Networks for Efficient Learning to Rank (Feb 22, 2022) [CPU, Information Retrieval]
MT-PATCHER: Selective and Extendable Knowledge Distillation from Large Language Models for Machine Translation (Mar 14, 2024) [Knowledge Distillation, Machine Translation]
MuGSI: Distilling GNNs with Multi-Granularity Structural Information for Graph Classification (Jun 28, 2024) [Classification, Graph Classification]
Distilled Gradual Pruning with Pruned Fine-tuning (Feb 15, 2024) [Image Classification, Knowledge Distillation]
ST-MFNet Mini: Knowledge Distillation-Driven Frame Interpolation (Feb 16, 2023) [Knowledge Distillation, Network Pruning]
Multi-aspect Knowledge Distillation with Large Language Model (Jan 23, 2025) [Image Classification]
Distilled GPT for Source Code Summarization (Aug 28, 2023) [Code Summarization, GPU]
Adv-KD: Adversarial Knowledge Distillation for Faster Diffusion Sampling (May 31, 2024) [Denoising, Image Generation]
Distill-DBDGAN: Knowledge Distillation and Adversarial Learning Framework for Defocus Blur Detection (Feb 1, 2023) [Defocus Blur Detection, Generative Adversarial Network]
Stolen Subwords: Importance of Vocabularies for Machine Translation Model Stealing (Jan 29, 2024) [Knowledge Distillation, Machine Translation]
Multi-fidelity Neural Architecture Search with Knowledge Distillation (Jun 15, 2020) [Knowledge Distillation, Neural Architecture Search]
StrassenNets: Deep Learning with a Multiplication Budget (Dec 11, 2017) [Deep Learning, Image Classification]
DistillCSE: Distilled Contrastive Learning for Sentence Embeddings (Oct 20, 2023) [Contrastive Learning, Knowledge Distillation]
Towards a Unified Conversational Recommendation System: Multi-task Learning via Contextualized Knowledge Distillation (Oct 27, 2023) [Conversational Recommendation, Diversity]
Distillation Techniques for Pseudo-rehearsal Based Incremental Learning (Jul 8, 2018) [Incremental Learning, Knowledge Distillation]
GOTHAM: Graph Class Incremental Learning Framework under Weak Supervision (Apr 7, 2025) [Attribute, Class-Incremental Learning]
Multi-granularity for knowledge distillation (Aug 15, 2021) [Knowledge Distillation, Person Re-Identification]
uDistil-Whisper: Label-Free Data Filtering for Knowledge Distillation in Low-Data Regimes (Jul 1, 2024) [Knowledge Distillation]
Multi-Granularity Structural Knowledge Distillation for Language Model Compression (May 1, 2022) [Knowledge Distillation, Language Modeling]
WARLearn: Weather-Adaptive Representation Learning (Nov 21, 2024) [2D Object Detection, Adversarial Robustness]
Distillation Learning Guided by Image Reconstruction for One-Shot Medical Image Segmentation (Aug 7, 2024) [Data Augmentation, Image Reconstruction]
UFIN: Universal Feature Interaction Network for Multi-Domain Click-Through Rate Prediction (Nov 27, 2023) [Click-Through Rate Prediction, Knowledge Distillation]
Classification Under Misspecification: Halfspaces, Generalized Linear Models, and Connections to Evolvability (Jun 8, 2020) [Fairness, General Classification]
Spending Your Winning Lottery Better After Drawing It (Jan 8, 2021) [Knowledge Distillation]
Goldfish: An Efficient Federated Unlearning Framework (Apr 4, 2024) [Knowledge Distillation, Machine Unlearning]
Goal-Conditioned Q-Learning as Knowledge Distillation (Aug 28, 2022) [Knowledge Distillation, Q-Learning]
Curriculum-scheduled Knowledge Distillation from Multiple Pre-trained Teachers for Multi-domain Sequential Recommendation (Jan 1, 2024) [Knowledge Distillation, Recommendation Systems]
GNN's Uncertainty Quantification using Self-Distillation (Jun 24, 2025) [Knowledge Distillation, Uncertainty Quantification]
GLiRA: Black-Box Membership Inference Attack via Knowledge Distillation (May 13, 2024) [Image Classification]
GLANCE: Global to Local Architecture-Neutral Concept-based Explanations (Jul 5, 2022) [Disentanglement, Feature Importance]
GKT: A Novel Guidance-Based Knowledge Transfer Framework For Efficient Cloud-edge Collaboration LLM Deployment (May 30, 2024) [GSM8K, Knowledge Distillation]
GKD: Semi-supervised Graph Knowledge Distillation for Graph-Independent Inference (Apr 8, 2021) [Disease Prediction, Graph Construction]
Structural Knowledge Distillation: Tractably Distilling Information for Structured Predictor (Oct 10, 2020) [Dependency Parsing, Knowledge Distillation]
Revisiting Cross-Modal Knowledge Distillation: A Disentanglement Approach for RGBD Semantic Segmentation (May 30, 2025) [Autonomous Driving, Contrastive Learning]
Multilingual Neural Machine Translation with Knowledge Distillation (Feb 27, 2019) [Diversity, Knowledge Distillation]
Multilingual Non-Autoregressive Machine Translation without Knowledge Distillation (Feb 6, 2025) [Knowledge Distillation, Machine Translation]
Automated Knowledge Distillation via Monte Carlo Tree Search (Jan 1, 2023) [Image Classification]
Generative Denoise Distillation: Simple Stochastic Noises Induce Efficient Knowledge Transfer for Dense Prediction (Jan 16, 2024) [Instance Segmentation, Knowledge Distillation]
Distillation Improves Visual Place Recognition for Low Quality Images (Oct 10, 2023) [Knowledge Distillation, Quantization]
Revisiting Distillation and Incremental Classifier Learning (Jul 8, 2018) [Incremental Learning, Knowledge Distillation]
Generate, Annotate, and Learn: NLP with Synthetic Text (Jun 11, 2021) [Few-Shot Learning, Image Classification]
Warmup-Distill: Bridge the Distribution Mismatch between Teacher and Student before Knowledge Distillation (Feb 17, 2025) [Knowledge Distillation, Math]
Multimodal Fusion SLAM with Fourier Attention (Jun 22, 2025) [Knowledge Distillation, Optical Flow Estimation]
Multimodal Industrial Anomaly Detection by Crossmodal Reverse Distillation (Dec 12, 2024) [Anomaly Detection, Knowledge Distillation]
Revisiting Intermediate Layer Distillation for Compressing Language Models: An Overfitting Perspective (Feb 3, 2023) [Knowledge Distillation]
Generalizing Teacher Networks for Effective Knowledge Distillation Across Student Architectures (Jul 22, 2024) [Knowledge Distillation, Model Compression]
Generalized Knowledge Distillation via Relationship Matching (May 4, 2022) [Few-Shot Learning, Incremental Learning]
Generalization Matters: Loss Minima Flattening via Parameter Hybridization for Efficient Online Knowledge Distillation (Mar 26, 2023) [Knowledge Distillation]
Why Skip If You Can Combine: A Simple Knowledge Distillation Technique for Intermediate Layers (Oct 6, 2020) [Knowledge Distillation, Machine Translation]
Revisiting Knowledge Distillation: An Inheritance and Exploration Framework (Jul 1, 2021) [Knowledge Distillation]