- Adaptive Multi-Teacher Knowledge Distillation with Meta-Learning (Jun 11, 2023) · Knowledge Distillation, Meta-Learning
- [code] Cross-Modal Fusion Distillation for Fine-Grained Sketch-Based Image Retrieval (Oct 19, 2022) · Cross-Modal Retrieval, Image Retrieval
- [code] Cross-modality Data Augmentation for End-to-End Sign Language Translation (May 18, 2023) · Data Augmentation, Knowledge Distillation
- [code] Block-Wisely Supervised Neural Architecture Search With Knowledge Distillation (Jun 1, 2020) · Knowledge Distillation, Neural Architecture Search
- [code] Black-box Few-shot Knowledge Distillation (Jul 25, 2022) · Image Classification
- [code] Boosting Multi-Label Image Classification with Complementary Parallel Self-Distillation (May 23, 2022) · Image Classification
- [code] Cross-Modality Knowledge Distillation Network for Monocular 3D Object Detection (Nov 14, 2022) · 3D Object Detection, Knowledge Distillation
- [code] Curriculum Learning for Dense Retrieval Distillation (Apr 28, 2022) · Knowledge Distillation, Passage Retrieval
- [code] Data-Free Class-Incremental Hand Gesture Recognition (Jan 1, 2023) · Class-Incremental Learning
- [code] Hybrid Inverted Index Is a Robust Accelerator for Dense Retrieval (Oct 11, 2022) · Knowledge Distillation, Quantization
- [code] Bit-mask Robust Contrastive Knowledge Distillation for Unsupervised Semantic Hashing (Mar 10, 2024) · Image Retrieval, Knowledge Distillation
- [code] BiLD: Bi-directional Logits Difference Loss for Large Language Model Distillation (Jun 19, 2024) · Knowledge Distillation, Language Modeling
- [code] Cross-Layer Distillation with Semantic Calibration (Dec 6, 2020) · Knowledge Distillation, Transfer Learning
- [code] BGE M3-Embedding: Multi-Lingual, Multi-Functionality, Multi-Granularity Text Embeddings Through Self-Knowledge Distillation (Feb 5, 2024) · Knowledge Distillation, Retrieval
- [code] Bidirectional Distillation for Top-K Recommender System (Jun 5, 2021) · Knowledge Distillation, Model Compression
- [code] Beyond Self-Supervision: A Simple Yet Effective Network Distillation Alternative to Improve Backbones (Mar 10, 2021) · Knowledge Distillation, Object Detection
- [code] Bi-directional Weakly Supervised Knowledge Distillation for Whole Slide Image Classification (Oct 7, 2022) · Classification, Image Classification
- [code] BKDSNN: Enhancing the Performance of Learning-based Spiking Neural Networks Training with Blurred Knowledge Distillation (Jul 12, 2024) · Knowledge Distillation
- [code] CrossKD: Cross-Head Knowledge Distillation for Object Detection (Jun 20, 2023) · Dense Object Detection, Knowledge Distillation
- [code] Cross-Level Distillation and Feature Denoising for Cross-Domain Few-Shot Classification (Nov 4, 2023) · Classification, Cross-Domain Few-Shot
- [code] Beyond Generic: Enhancing Image Captioning with Real-World Knowledge using Vision-Language Pre-Training Model (Aug 2, 2023) · Hallucination, Image Captioning
- [code] Contrastive Representation Distillation (Oct 23, 2019) · Contrastive Learning, Knowledge Distillation
- [code] BEV-LGKD: A Unified LiDAR-Guided Knowledge Distillation Framework for BEV 3D Object Detection (Dec 1, 2022) · 3D Object Detection, Autonomous Driving
- [code] BEVDistill: Cross-Modal BEV Distillation for Multi-View 3D Object Detection (Nov 17, 2022) · 3D Object Detection, Depth Estimation
- [code] SimDistill: Simulated Multi-modal Distillation for BEV 3D Object Detection (Mar 29, 2023) · 3D Geometry, 3D Object Detection
- [code] Creating Something from Nothing: Unsupervised Knowledge Distillation for Cross-Modal Hashing (Apr 1, 2020) · Knowledge Distillation, Retrieval
- [code] Contrastive Deep Supervision (Jul 12, 2022) · Contrastive Learning, Fine-Grained Image Classification
- [code] Continual Learning for LiDAR Semantic Segmentation: Class-Incremental and Coarse-to-Fine Strategies on Sparse Data (Apr 8, 2023) · Class-Incremental Learning
- [code] Contrastive Distillation on Intermediate Representations for Language Model Compression (Sep 29, 2020) · Knowledge Distillation, Language Modeling
- [code] Continual Evaluation for Lifelong Learning: Identifying the Stability Gap (May 26, 2022) · Continual Learning, Incremental Learning
- [code] BearingPGA-Net: A Lightweight and Deployable Bearing Fault Diagnosis Network via Decoupled Knowledge Distillation and FPGA Acceleration (Jul 31, 2023) · CPU, Fault Diagnosis
- [code] Continual Learning for Image Segmentation with Dynamic Query (Nov 29, 2023) · Continual Learning, Diversity
- [code] Contrastive Model Inversion for Data-Free Knowledge Distillation (May 18, 2021) · Contrastive Learning, Data-Free Knowledge Distillation
- [code] Cross-category Video Highlight Detection via Set-based Learning (Aug 26, 2021) · Domain Adaptation, Highlight Detection
- [code] CrossMatch: Enhance Semi-Supervised Medical Image Segmentation with Perturbation Strategies and Knowledge Distillation (May 1, 2024) · Image Segmentation, Knowledge Distillation
- [code] Data-Free Knowledge Distillation for Heterogeneous Federated Learning (May 20, 2021) · Data-Free Knowledge Distillation, Federated Learning
- [code] Baby Llama: Knowledge Distillation from an Ensemble of Teachers Trained on a Small Dataset with No Performance Penalty (Aug 3, 2023) · Knowledge Distillation
- [code] Designing Large Foundation Models for Efficient Training and Inference: A Survey (Sep 3, 2024) · Knowledge Distillation, Model Compression
- [code] AIM 2024 Challenge on UHD Blind Photo Quality Assessment (Sep 24, 2024) · 4K, Computational Efficiency
- [code] Avatar Knowledge Distillation: Self-ensemble Teacher Paradigm with Uncertainty (May 4, 2023) · Knowledge Distillation, Object Detection
- [code] AdaptGuard: Defending Against Universal Attacks for Model Adaptation (Mar 19, 2023) · Knowledge Distillation, Model
- [code] Bayes Conditional Distribution Estimation for Knowledge Distillation Based on Conditional Mutual Information (Jan 16, 2024) · Knowledge Distillation
- [code] Beyond Preserved Accuracy: Evaluating Loyalty and Robustness of BERT Compression (Sep 7, 2021) · Knowledge Distillation, Quantization
- [code] Black-Box Attacks on Sequential Recommenders via Data-Free Model Extraction (Sep 1, 2021) · Data Poisoning, Knowledge Distillation
- [code] CascadeBERT: Accelerating Inference of Pre-trained Language Models via Calibrated Complete Models Cascade (Dec 29, 2020) · Knowledge Distillation, Model Selection
- [code] A Knowledge Distillation Framework for Enhancing Ear-EEG Based Sleep Staging with Scalp-EEG Data (Oct 27, 2022) · Domain Adaptation, EEG
- [code] Better Estimation of the KL Divergence Between Language Models (Apr 14, 2025) · Knowledge Distillation
- [code] BERT-of-Theseus: Compressing BERT by Progressive Module Replacing (Feb 7, 2020) · Knowledge Distillation, Model Compression
- [code] Adjoined Networks: A Training Paradigm with Applications to Network Compression (Jun 10, 2020) · Knowledge Distillation, Neural Architecture Search
- [code] Backdoor Attacks on Self-Supervised Learning (May 21, 2021) · Backdoor Attack, Inductive Bias