Decomposed Knowledge Distillation for Class-Incremental Semantic Segmentation (Oct 12, 2022). Tags: Class-Incremental Semantic Segmentation, Knowledge Distillation
Dialogue Chain-of-Thought Distillation for Commonsense-aware Conversational Agents (Oct 13, 2023). Tags: Informativeness, Knowledge Distillation
Categorical Relation-Preserving Contrastive Knowledge Distillation for Medical Image Classification (Jul 7, 2021). Tags: Image Classification
Dice Semimetric Losses: Optimizing the Dice Score with Soft Labels (Mar 28, 2023). Tags: Knowledge Distillation
Anti-Distillation Backdoor Attacks: Backdoors Can Really Survive in Knowledge Distillation (Oct 24, 2021). Tags: Backdoor Attack, Knowledge Distillation
DiGA: Distil to Generalize and then Adapt for Domain Adaptive Semantic Segmentation (Apr 5, 2023). Tags: Data Augmentation, Knowledge Distillation
Extending global-local view alignment for self-supervised learning with remote sensing imagery (Mar 12, 2023). Tags: Change Detection, Contrastive Learning
DIOD: Self-Distillation Meets Object Discovery (Jan 1, 2024). Tags: Instance Segmentation, Knowledge Distillation
DA-Mamba: Domain Adaptive Hybrid Mamba-Transformer Based One-Stage Object Detection (Feb 16, 2025). Tags: Domain Adaptation, Knowledge Distillation
CCL: Continual Contrastive Learning for LiDAR Place Recognition (Mar 24, 2023). Tags: Autonomous Driving, Continual Learning
CLIP-KD: An Empirical Study of CLIP Model Distillation (Jul 24, 2023). Tags: Contrastive Learning, Cross-Modal Retrieval
CLIP model is an Efficient Continual Learner (Oct 6, 2022). Tags: Continual Learning, Incremental Learning
Class-relation Knowledge Distillation for Novel Class Discovery (Jul 18, 2023). Tags: Knowledge Distillation, Novel Class Discovery
Distilling Knowledge from Graph Convolutional Networks (Mar 23, 2020). Tags: Knowledge Distillation, Transfer Learning
CEKD: Cross-Modal Edge-Privileged Knowledge Distillation for Semantic Scene Understanding Using Only Thermal Images (Feb 22, 2023). Tags: Knowledge Distillation, Scene Understanding
CEN-HDR: Computationally Efficient neural Network for real-time High Dynamic Range imaging (Feb 10, 2023). Tags: Efficient Neural Network, Knowledge Distillation
Tracking-by-Trackers with a Distilled and Reinforced Model (Jul 8, 2020). Tags: Knowledge Distillation, Object Tracking
Distillation Matters: Empowering Sequential Recommenders to Match the Performance of Large Language Model (May 1, 2024). Tags: Knowledge Distillation, Language Modeling
DistillBEV: Boosting Multi-Camera 3D Object Detection with Cross-Modal Knowledge Distillation (Sep 26, 2023). Tags: 3D Object Detection, Autonomous Driving
Channel-Aware Distillation Transformer for Depth Estimation on Nano Drones (Mar 18, 2023). Tags: Autonomous Navigation, Depth Estimation
Channel Distillation: Channel-Wise Attention for Knowledge Distillation (Jun 2, 2020). Tags: Knowledge Distillation
Distilling a Powerful Student Model via Online Knowledge Distillation (Mar 26, 2021). Tags: Knowledge Distillation
Channel Gating Neural Networks (May 29, 2018). Tags: Knowledge Distillation, Network Pruning
Distilling Autoregressive Models to Obtain High-Performance Non-Autoregressive Solvers for Vehicle Routing Problems with Faster Inference Speed (Dec 19, 2023). Tags: Knowledge Distillation
Data-Free Network Quantization With Adversarial Knowledge Distillation (May 8, 2020). Tags: Knowledge Distillation, Model Compression
Channel-wise Knowledge Distillation for Dense Prediction (Nov 26, 2020). Tags: Knowledge Distillation, Dense Prediction
Distilling Knowledge from Reader to Retriever for Question Answering (Dec 8, 2020). Tags: Information Retrieval, Knowledge Distillation
CheXseg: Combining Expert Annotations with DNN-generated Saliency Maps for X-ray Segmentation (Feb 21, 2021). Tags: Image Segmentation, Knowledge Distillation
Chinese grammatical error correction based on knowledge distillation (Jul 31, 2022). Tags: Grammatical Error Correction, Knowledge Distillation
Distilling Knowledge via Intermediate Classifiers (Feb 28, 2021). Tags: Knowledge Distillation, Transfer Learning
Circumventing Outliers of AutoAugment with Knowledge Distillation (Mar 25, 2020). Tags: Data Augmentation, General Classification
Distilling Large Vision-Language Model with Out-of-Distribution Generalizability (Jul 6, 2023). Tags: Few-Shot Image Classification, Image Classification
FocusNet: Classifying Better by Focusing on Confusing Classes (Oct 14, 2021). Tags: Image Classification
Distilling Object Detectors with Feature Richness (Nov 1, 2021). Tags: Knowledge Distillation, Model Compression
Distilling Script Knowledge from Large Language Models for Constrained Language Planning (May 9, 2023). Tags: Knowledge Distillation
Class Attention Transfer Based Knowledge Distillation (Apr 25, 2023). Tags: Knowledge Distillation, Model Compression
Advancing Pre-trained Teacher: Towards Robust Feature Discrepancy for Anomaly Detection (May 3, 2024). Tags: Anomaly Detection, Attribute
Class-Balanced Distillation for Long-Tailed Visual Recognition (Apr 12, 2021). Tags: Image Classification, Knowledge Distillation
Decoupled Kullback-Leibler Divergence Loss (May 23, 2023). Tags: Adversarial Defense, Adversarial Robustness
Distill on the Go: Online knowledge distillation in self-supervised learning (Apr 20, 2021). Tags: Knowledge Distillation, Self-Supervised Learning
Data-Free Knowledge Distillation for Heterogeneous Federated Learning (May 20, 2021). Tags: Data-Free Knowledge Distillation, Federated Learning
Cloud Object Detector Adaptation by Integrating Different Source Knowledge (Dec 10, 2024). Tags: Domain Adaptation, Knowledge Distillation
Advantage-Guided Distillation for Preference Alignment in Small Language Models (Feb 25, 2025). Tags: Knowledge Distillation
DistilVPR: Cross-Modal Knowledge Distillation for Visual Place Recognition (Dec 17, 2023). Tags: Knowledge Distillation, Visual Place Recognition
3D Annotation-Free Learning by Distilling 2D Open-Vocabulary Segmentation Models for Autonomous Driving (May 24, 2024). Tags: Autonomous Driving, Knowledge Distillation
Distributed Dynamic Map Fusion via Federated Learning for Intelligent Networked Vehicles (Mar 5, 2021). Tags: Federated Learning, Knowledge Distillation
Class-Incremental Learning by Knowledge Distillation with Adaptive Feature Consolidation (Apr 2, 2022). Tags: Class-Incremental Learning
Distribution-aware Forgetting Compensation for Exemplar-Free Lifelong Person Re-identification (Apr 21, 2025). Tags: Exemplar-Free, Knowledge Distillation
Are Intermediate Layers and Labels Really Necessary? A General Language Model Distillation Method (Jun 11, 2023). Tags: Knowledge Distillation, Language Modeling
Data-Free Class-Incremental Hand Gesture Recognition (Jan 1, 2023). Tags: Class-Incremental Learning