Knowledge Distillation Papers with Code Available

AICSD: Adaptive Inter-Class Similarity Distillation for Semantic Segmentation (Aug 8, 2023). Topics: Knowledge Distillation, Semantic Segmentation.
Context-Aware Image Inpainting with Learned Semantic Priors (Jun 14, 2021). Topics: Image Inpainting, Knowledge Distillation.
A Deep Knowledge Distillation framework for EEG assisted enhancement of single-lead ECG based sleep staging (Dec 14, 2021). Topics: ECG-based Sleep Staging, EEG.
Continual All-in-One Adverse Weather Removal with Knowledge Replay on a Unified Network Structure (Mar 12, 2024). Topics: All, Continual Learning.
CaKDP: Category-aware Knowledge Distillation and Pruning Framework for Lightweight 3D Object Detection (Jan 1, 2024). Topics: 3D Object Detection, Knowledge Distillation.
Distilling Knowledge from Graph Convolutional Networks (Mar 23, 2020). Topics: Knowledge Distillation, Transfer Learning.
Distillation-Based Training for Multi-Exit Architectures (Oct 1, 2019). Topics: Knowledge Distillation.
Distillation Matters: Empowering Sequential Recommenders to Match the Performance of Large Language Models (May 1, 2024). Topics: Knowledge Distillation, Language Modeling.
Agree to Disagree: Adaptive Ensemble Knowledge Distillation in Gradient Space (Dec 1, 2020). Topics: Diversity, Knowledge Distillation.
DistilCSE: Effective Knowledge Distillation For Contrastive Sentence Embeddings (Dec 10, 2021). Topics: Contrastive Learning, Knowledge Distillation.
Discriminator-Cooperated Feature Map Distillation for GAN Compression (Dec 29, 2022). Topics: Image Generation, Knowledge Distillation.
AGKD-BML: Defense Against Adversarial Attack by Attention Guided Knowledge Distillation and Bi-directional Metric Learning (Aug 13, 2021). Topics: Adversarial Attack, Adversarial Robustness.
Bridging Cross-task Protocol Inconsistency for Distillation in Dense Object Detection (Aug 28, 2023). Topics: Binary Classification, Classification.
DisCo: Distilled Student Models Co-training for Semi-supervised Text Mining (May 20, 2023). Topics: Extractive Summarization, Knowledge Distillation.
Disentangle and Remerge: Interventional Knowledge Distillation for Few-Shot Object Detection from A Conditional Causal Perspective (Aug 26, 2022). Topics: Few-Shot Learning, Few-Shot Object Detection.
Aggretriever: A Simple Approach to Aggregate Textual Representations for Robust Dense Passage Retrieval (Jul 31, 2022). Topics: Knowledge Distillation, Language Modeling.
BPKD: Boundary Privileged Knowledge Distillation For Semantic Segmentation (Jun 13, 2023). Topics: Knowledge Distillation, Segmentation.
Breaking Modality Gap in RGBT Tracking: Coupled Knowledge Distillation (Oct 15, 2024). Topics: Knowledge Distillation, RGB-T Tracking.
Bootstrapping meaning through listening: Unsupervised learning of spoken sentence embeddings (Oct 23, 2022). Topics: Acoustic Unit Discovery, Contrastive Learning.
AdaDistill: Adaptive Knowledge Distillation for Deep Face Recognition (Jul 1, 2024). Topics: Face Recognition, Knowledge Distillation.
Bridge Past and Future: Overcoming Information Asymmetry in Incremental Object Detection (Jul 16, 2024). Topics: Knowledge Distillation, Object Detection.
Bridging the Domain Gap: Self-Supervised 3D Scene Understanding with Foundation Models (May 15, 2023). Topics: 3D Object Detection, Image Captioning.
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter (Oct 2, 2019). Topics: Hate Speech Detection, Knowledge Distillation.
Digging into contrastive learning for robust depth estimation with diffusion models (Apr 15, 2024). Topics: Contrastive Learning, Denoising.
Boosting Light-Weight Depth Estimation Via Knowledge Distillation (May 13, 2021). Topics: Computational Efficiency, Depth Estimation.
Extending global-local view alignment for self-supervised learning with remote sensing imagery (Mar 12, 2023). Topics: Change Detection, Contrastive Learning.
ABKD: Pursuing a Proper Allocation of the Probability Mass in Knowledge Distillation via α-β-Divergence (May 7, 2025). Topics: Knowledge Distillation.
AgeFlow: Conditional Age Progression and Regression with Normalizing Flows (May 15, 2021). Topics: Attribute, Knowledge Distillation.
Boosting Multi-Label Image Classification with Complementary Parallel Self-Distillation (May 23, 2022). Topics: Image Classification.
DIOD: Self-Distillation Meets Object Discovery (Jan 1, 2024). Topics: Instance Segmentation, Knowledge Distillation.
Diffusion-Driven Data Replay: A Novel Approach to Combat Forgetting in Federated Class Continual Learning (Sep 2, 2024). Topics: Continual Learning, Contrastive Learning.
Block-Wisely Supervised Neural Architecture Search With Knowledge Distillation (Jun 1, 2020). Topics: Knowledge Distillation, Neural Architecture Search.
Blockwisely Supervised Neural Architecture Search with Knowledge Distillation (Nov 29, 2019). Topics: Knowledge Distillation, Neural Architecture Search.
A framework for benchmarking class-out-of-distribution detection and its application to ImageNet (Feb 23, 2023). Topics: Benchmarking, Knowledge Distillation.
Dice Semimetric Losses: Optimizing the Dice Score with Soft Labels (Mar 28, 2023). Topics: Knowledge Distillation.
DiGA: Distil to Generalize and then Adapt for Domain Adaptive Semantic Segmentation (Apr 5, 2023). Topics: Data Augmentation, Knowledge Distillation.
Directed Acyclic Graph Factorization Machines for CTR Prediction via Knowledge Distillation (Nov 21, 2022). Topics: Click-Through Rate Prediction, Knowledge Distillation.
Bit-mask Robust Contrastive Knowledge Distillation for Unsupervised Semantic Hashing (Mar 10, 2024). Topics: Image Retrieval, Knowledge Distillation.
BKDSNN: Enhancing the Performance of Learning-based Spiking Neural Networks Training with Blurred Knowledge Distillation (Jul 12, 2024). Topics: Knowledge Distillation.
Hybrid Inverted Index Is a Robust Accelerator for Dense Retrieval (Oct 11, 2022). Topics: Knowledge Distillation, Quantization.
Prototype-based Incremental Few-Shot Semantic Segmentation (Nov 30, 2020). Topics: Few-Shot Semantic Segmentation, Incremental Learning.
Black-Box Attacks on Sequential Recommenders via Data-Free Model Extraction (Sep 1, 2021). Topics: Data Poisoning, Knowledge Distillation.
Dialogue Chain-of-Thought Distillation for Commonsense-aware Conversational Agents (Oct 13, 2023). Topics: Informativeness, Knowledge Distillation.
A Fast Knowledge Distillation Framework for Visual Recognition (Dec 2, 2021). Topics: Image Classification.
BiLD: Bi-directional Logits Difference Loss for Large Language Model Distillation (Jun 19, 2024). Topics: Knowledge Distillation, Language Modeling.
Adversarially Robust Distillation (May 23, 2019). Topics: Adversarial Robustness, Knowledge Distillation.
Black-box Few-shot Knowledge Distillation (Jul 25, 2022). Topics: Image Classification.
DGEKT: A Dual Graph Ensemble Learning Method for Knowledge Tracing (Nov 23, 2022). Topics: Ensemble Learning, Knowledge Distillation.
DialoKG: Knowledge-Structure Aware Task-Oriented Dialogue Generation (Apr 19, 2022). Topics: Dialogue Generation, Knowledge Distillation.
Directed Acyclic Transformer for Non-Autoregressive Machine Translation (May 16, 2022). Topics: Knowledge Distillation, Machine Translation.
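Nearly every entry above builds on, or modifies, the classic soft-target distillation objective of Hinton et al. (2015). As background for reading the list, here is a minimal, dependency-free sketch of that loss; the function names and toy logits are illustrative and are not taken from any paper listed.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax: higher T yields a softer distribution."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=2.0):
    """Soft-target distillation loss: KL(teacher || student) on
    temperature-softened distributions, scaled by T^2 so gradient
    magnitudes stay comparable across temperatures."""
    p = softmax(teacher_logits, T)  # teacher's soft targets
    q = softmax(student_logits, T)  # student's soft predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl

# A student matching the teacher exactly incurs zero loss;
# any mismatch in the softened distributions is penalized.
print(kd_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # 0.0
print(kd_loss([1.0, 2.0], [2.0, 1.0]) > 0.0)       # True
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels; many of the papers above (e.g. ABKD, BiLD) replace or reweight the KL term itself.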