- Attention Distillation: self-supervised vision transformer students need more guidance (Oct 3, 2022). Tags: Knowledge Distillation, Self-Supervised Learning.
- AdaDistill: Adaptive Knowledge Distillation for Deep Face Recognition (Jul 1, 2024) [code available]. Tags: Face Recognition, Knowledge Distillation.
- CrossMatch: Enhance Semi-Supervised Medical Image Segmentation with Perturbation Strategies and Knowledge Distillation (May 1, 2024) [code available]. Tags: Image Segmentation, Knowledge Distillation.
- A Sentence Speaks a Thousand Images: Domain Generalization through Distilling CLIP with Language Guidance (Sep 21, 2023) [code available]. Tags: Domain Generalization, Knowledge Distillation.
- Distilled Split Deep Neural Networks for Edge-Assisted Real-Time Systems (Oct 1, 2019) [code available]. Tags: Edge Computing, Image Classification.
- Distilling a Powerful Student Model via Online Knowledge Distillation (Mar 26, 2021) [code available]. Tags: Knowledge Distillation.
- Attention Weighted Local Descriptors (Apr 19, 2023) [code available]. Tags: 3D Reconstruction, Homography Estimation.
- Aggretriever: A Simple Approach to Aggregate Textual Representations for Robust Dense Passage Retrieval (Jul 31, 2022) [code available]. Tags: Knowledge Distillation, Language Modeling.
- Bi-directional Weakly Supervised Knowledge Distillation for Whole Slide Image Classification (Oct 7, 2022) [code available]. Tags: Classification, Image Classification.
- Domain Consistency Representation Learning for Lifelong Person Re-Identification (Sep 30, 2024) [code available]. Tags: Attribute, Knowledge Distillation.
- AGKD-BML: Defense Against Adversarial Attack by Attention Guided Knowledge Distillation and Bi-directional Metric Learning (Aug 13, 2021) [code available]. Tags: Adversarial Attack, Adversarial Robustness.
- Audio Embeddings as Teachers for Music Classification (Jun 30, 2023) [code available]. Tags: Classification, Information Retrieval.
- Distilling Holistic Knowledge with Graph Neural Networks (Aug 12, 2021) [code available]. Tags: Knowledge Distillation.
- Distilling Image Classifiers in Object Detectors (Jun 9, 2021) [code available]. Tags: Knowledge Distillation, Object.
- Audio-Visual Representation Learning via Knowledge Distillation from Speech Foundation Models (Feb 9, 2025) [code available]. Tags: Audio-Visual Speech Recognition, Automatic Speech Recognition.
- Agree to Disagree: Adaptive Ensemble Knowledge Distillation in Gradient Space (Dec 1, 2020) [code available]. Tags: Diversity, Knowledge Distillation.
- Augmentation-Free Dense Contrastive Knowledge Distillation for Efficient Semantic Segmentation (Dec 7, 2023) [code available]. Tags: Contrastive Learning, Data Augmentation.
- A semi-supervised Teacher-Student framework for surgical tool detection and localization (Aug 21, 2022) [code available]. Tags: Knowledge Distillation, Pseudo Label.
- AICSD: Adaptive Inter-Class Similarity Distillation for Semantic Segmentation (Aug 8, 2023) [code available]. Tags: Knowledge Distillation, Semantic Segmentation.
- Distilling Knowledge via Knowledge Review (Apr 19, 2021) [code available]. Tags: Instance Segmentation, Knowledge Distillation.
- Contrastive Model Inversion for Data-Free Knowledge Distillation (May 18, 2021) [code available]. Tags: Contrastive Learning, Data-free Knowledge Distillation.
- Distilling Meta Knowledge on Heterogeneous Graph for Illicit Drug Trafficker Detection on Social Media (Dec 1, 2021) [code available]. Tags: Knowledge Distillation, Marketing.
- Distilling Object Detectors with Feature Richness (Nov 1, 2021) [code available]. Tags: Knowledge Distillation, Model Compression.
- Distilling Out-of-Distribution Robustness from Vision-Language Foundation Models (Nov 2, 2023) [code available]. Tags: Data Augmentation, Domain Generalization.
- BERT-of-Theseus: Compressing BERT by Progressive Module Replacing (Feb 7, 2020) [code available]. Tags: Knowledge Distillation, Model Compression.
- Action knowledge for video captioning with graph neural networks (Mar 16, 2023) [code available]. Tags: Action Recognition, Graph Neural Network.
- AIM 2024 Challenge on UHD Blind Photo Quality Assessment (Sep 24, 2024) [code available]. Tags: 4K, Computational Efficiency.
- Distill on the Go: Online knowledge distillation in self-supervised learning (Apr 20, 2021) [code available]. Tags: Knowledge Distillation, Self-Supervised Learning.
- Contrastive Distillation on Intermediate Representations for Language Model Compression (Sep 29, 2020) [code available]. Tags: Knowledge Distillation, Language Modeling.
- Autoencoders as Cross-Modal Teachers: Can Pretrained 2D Image Transformers Help 3D Representation Learning? (Dec 16, 2022) [code available]. Tags: 3D Point Cloud Classification, Few-Shot 3D Point Cloud Classification.
- AutoGAN-Distiller: Searching to Compress Generative Adversarial Networks (Jun 15, 2020) [code available]. Tags: AutoML, Knowledge Distillation.
- DistilVPR: Cross-Modal Knowledge Distillation for Visual Place Recognition (Dec 17, 2023) [code available]. Tags: Knowledge Distillation, Visual Place Recognition.
- Contrastive Representation Distillation (Oct 23, 2019) [code available]. Tags: Contrastive Learning, Knowledge Distillation.
- Cross-Modal Fusion Distillation for Fine-Grained Sketch-Based Image Retrieval (Oct 19, 2022) [code available]. Tags: Cross-Modal Retrieval, Image Retrieval.
- Block-Wisely Supervised Neural Architecture Search With Knowledge Distillation (Jun 1, 2020) [code available]. Tags: Knowledge Distillation, Neural Architecture Search.
- Divide to Adapt: Mitigating Confirmation Bias for Domain Adaptation of Black-Box Predictors (May 28, 2022) [code available]. Tags: Domain Adaptation, Knowledge Distillation.
- Data-Free Class-Incremental Hand Gesture Recognition (Jan 1, 2023) [code available]. Tags: Class-Incremental Learning.
- DnS: Distill-and-Select for Efficient and Accurate Video Indexing and Retrieval (Jun 24, 2021) [code available]. Tags: Computational Efficiency, Knowledge Distillation.
- Black-Box Attacks on Sequential Recommenders via Data-Free Model Extraction (Sep 1, 2021) [code available]. Tags: Data Poisoning, Knowledge Distillation.
- Does Knowledge Distillation Really Work? (Jun 10, 2021) [code available]. Tags: Knowledge Distillation.
- CascadeBERT: Accelerating Inference of Pre-trained Language Models via Calibrated Complete Models Cascade (Dec 29, 2020) [code available]. Tags: Knowledge Distillation, Model Selection.
- DPM-OT: A New Diffusion Probabilistic Model Based on Optimal Transport (Jul 21, 2023) [code available]. Tags: Denoising, Knowledge Distillation.
- Avatar Knowledge Distillation: Self-ensemble Teacher Paradigm with Uncertainty (May 4, 2023) [code available]. Tags: Knowledge Distillation, Object Detection.
- A Knowledge Distillation Framework For Enhancing Ear-EEG Based Sleep Staging With Scalp-EEG Data (Oct 27, 2022) [code available]. Tags: Domain Adaptation, EEG.
- Dual Relation Knowledge Distillation for Object Detection (Feb 11, 2023) [code available]. Tags: Knowledge Distillation, Model Compression.
- Continual Collaborative Distillation for Recommender System (May 29, 2024) [code available]. Tags: Knowledge Distillation, Recommendation Systems.
- Continual All-in-One Adverse Weather Removal with Knowledge Replay on a Unified Network Structure (Mar 12, 2024) [code available]. Tags: All, Continual Learning.
- Continual evaluation for lifelong learning: Identifying the stability gap (May 26, 2022) [code available]. Tags: Continual Learning, Incremental Learning.
- Are Intermediate Layers and Labels Really Necessary? A General Language Model Distillation Method (Jun 11, 2023) [code available]. Tags: Knowledge Distillation, Language Modeling.
- Context-Aware Image Inpainting with Learned Semantic Priors (Jun 14, 2021) [code available]. Tags: Image Inpainting, Knowledge Distillation.
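Nearly every entry above is tagged Knowledge Distillation. As background orientation only (not the method of any specific paper listed), here is a minimal sketch of the classic temperature-softened distillation loss from Hinton et al. (2015); the function names and the temperature value are illustrative assumptions.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 so gradients keep their magnitude as T grows."""
    p = softmax(teacher_logits, temperature)  # teacher soft targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# Identical logits give zero loss; mismatched logits give a positive loss.
print(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # 0.0
```

In practice this term is combined with the ordinary cross-entropy on hard labels via a weighting coefficient, and many of the papers above replace or augment the logit-matching term with feature-level, relational, or contrastive objectives.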