All entries below have associated code available.

- Distilling Knowledge from Refinement in Multiple Instance Detection Networks (Apr 23, 2020). Tags: Knowledge Distillation, Multiple Instance Learning
- Distilling Knowledge via Knowledge Review (Apr 19, 2021). Tags: Instance Segmentation, Knowledge Distillation
- A Sentence Speaks a Thousand Images: Domain Generalization through Distilling CLIP with Language Guidance (Sep 21, 2023). Tags: Domain Generalization, Knowledge Distillation
- Compressing Deep Graph Neural Networks via Adversarial Knowledge Distillation (May 24, 2022). Tags: Graph Classification, Knowledge Distillation
- Complementary Relation Contrastive Distillation (Mar 29, 2021). Tags: Knowledge Distillation, Relation
- A semi-supervised Teacher-Student framework for surgical tool detection and localization (Aug 21, 2022). Tags: Knowledge Distillation, Pseudo Label
- Attention Weighted Local Descriptors (Apr 19, 2023). Tags: 3D Reconstruction, Homography Estimation
- Aggretriever: A Simple Approach to Aggregate Textual Representations for Robust Dense Passage Retrieval (Jul 31, 2022). Tags: Knowledge Distillation, Language Modeling
- Dynamic Knowledge Distillation for Pre-trained Language Models (Sep 23, 2021). Tags: Knowledge Distillation
- Domain Consistency Representation Learning for Lifelong Person Re-Identification (Sep 30, 2024). Tags: Attribute, Knowledge Distillation
- AGKD-BML: Defense Against Adversarial Attack by Attention Guided Knowledge Distillation and Bi-directional Metric Learning (Aug 13, 2021). Tags: Adversarial Attack, Adversarial Robustness
- Audio Embeddings as Teachers for Music Classification (Jun 30, 2023). Tags: Classification, Information Retrieval
- CaMEL: Mean Teacher Learning for Image Captioning (Feb 21, 2022). Tags: Image Captioning, Knowledge Distillation
- Computation-Efficient Knowledge Distillation via Uncertainty-Aware Mixup (Dec 17, 2020). Tags: Informativeness, Knowledge Distillation
- CaKDP: Category-aware Knowledge Distillation and Pruning Framework for Lightweight 3D Object Detection (Jan 1, 2024). Tags: 3D Object Detection, Knowledge Distillation
- Agree to Disagree: Adaptive Ensemble Knowledge Distillation in Gradient Space (Dec 1, 2020). Tags: Diversity, Knowledge Distillation
- Action knowledge for video captioning with graph neural networks (Mar 16, 2023). Tags: Action Recognition, Graph Neural Network
- Conformer and Blind Noisy Students for Improved Image Quality Assessment (Apr 27, 2022). Tags: Image Quality Assessment, Image Restoration
- AICSD: Adaptive Inter-Class Similarity Distillation for Semantic Segmentation (Aug 8, 2023). Tags: Knowledge Distillation, Semantic Segmentation
- CoNMix for Source-free Single and Multi-target Domain Adaptation (Nov 7, 2022). Tags: Domain Adaptation, Knowledge Distillation
- Camera clustering for scalable stream-based active distillation (Apr 16, 2024). Tags: Clustering, Knowledge Distillation
- Consistent Representation Learning for Continual Relation Extraction (Mar 5, 2022). Tags: Continual Relation Extraction, Contrastive Learning
- Designing Large Foundation Models for Efficient Training and Inference: A Survey (Sep 3, 2024). Tags: Knowledge Distillation, Model Compression
- Content-Aware GAN Compression (Apr 6, 2021). Tags: Image Generation, Image Manipulation
- Channel Gating Neural Networks (May 29, 2018). Tags: Knowledge Distillation, Network Pruning
- Distilling Large Vision-Language Model with Out-of-Distribution Generalizability (Jul 6, 2023). Tags: Few-Shot Image Classification, Image Classification
- AIM 2024 Challenge on UHD Blind Photo Quality Assessment (Sep 24, 2024). Tags: 4k, Computational Efficiency
- Continual All-in-One Adverse Weather Removal with Knowledge Replay on a Unified Network Structure (Mar 12, 2024). Tags: All, Continual Learning
- Continual Collaborative Distillation for Recommender System (May 29, 2024). Tags: Knowledge Distillation, Recommendation Systems
- Autoencoders as Cross-Modal Teachers: Can Pretrained 2D Image Transformers Help 3D Representation Learning? (Dec 16, 2022). Tags: 3D Point Cloud Classification, Few-Shot 3D Point Cloud Classification
- AutoGAN-Distiller: Searching to Compress Generative Adversarial Networks (Jun 15, 2020). Tags: AutoML, Knowledge Distillation
- Continual evaluation for lifelong learning: Identifying the stability gap (May 26, 2022). Tags: Continual Learning, Incremental Learning
- Distilling Visual Priors from Self-Supervised Learning (Aug 1, 2020). Tags: Classification, Contrastive Learning
- Distilling Dense Representations for Ranking using Tightly-Coupled Teachers (Oct 22, 2020). Tags: Knowledge Distillation
- Contrastive Model Inversion for Data-Free Knowledge Distillation (May 18, 2021). Tags: Contrastive Learning, Data-free Knowledge Distillation
- Cross-Level Distillation and Feature Denoising for Cross-Domain Few-Shot Classification (Nov 4, 2023). Tags: Classification, Cross-Domain Few-Shot
- DialoKG: Knowledge-Structure Aware Task-Oriented Dialogue Generation (Apr 19, 2022). Tags: Dialogue Generation, Knowledge Distillation
- Learning Efficient Vision Transformers via Fine-Grained Manifold Distillation (Jul 3, 2021). Tags: Knowledge Distillation, Model Compression
- Ego-Exo: Transferring Visual Representations from Third-person to First-person Videos (Apr 16, 2021). Tags: Activity Recognition, Diversity
- Eliminating Backdoor Triggers for Deep Neural Networks Using Attention Relation Graph Distillation (Apr 21, 2022). Tags: backdoor defense, Knowledge Distillation
- CascadeBERT: Accelerating Inference of Pre-trained Language Models via Calibrated Complete Models Cascade (Dec 29, 2020). Tags: Knowledge Distillation, Model Selection
- Cross-Layer Distillation with Semantic Calibration (Dec 6, 2020). Tags: Knowledge Distillation, Transfer Learning
- Avatar Knowledge Distillation: Self-ensemble Teacher Paradigm with Uncertainty (May 4, 2023). Tags: Knowledge Distillation, object-detection
- A Knowledge Distillation Framework For Enhancing Ear-EEG Based Sleep Staging With Scalp-EEG Data (Oct 27, 2022). Tags: Domain Adaptation, EEG
- Cross-category Video Highlight Detection via Set-based Learning (Aug 26, 2021). Tags: Domain Adaptation, Highlight Detection
- Creating Something from Nothing: Unsupervised Knowledge Distillation for Cross-Modal Hashing (Apr 1, 2020). Tags: Knowledge Distillation, Retrieval
- Bridging the Domain Gap: Self-Supervised 3D Scene Understanding with Foundation Models (May 15, 2023). Tags: 3D Object Detection, Image Captioning
- Distilling DETR with Visual-Linguistic Knowledge for Open-Vocabulary Object Detection (Jan 1, 2023). Tags: Knowledge Distillation, Language Modeling
- Are Intermediate Layers and Labels Really Necessary? A General Language Model Distillation Method (Jun 11, 2023). Tags: Knowledge Distillation, Language Modeling
- Bridging Cross-task Protocol Inconsistency for Distillation in Dense Object Detection (Aug 28, 2023). Tags: Binary Classification, Classification