All entries below have code available; each is listed as title (date): task tags.

- Distillation from Heterogeneous Models for Top-K Recommendation (Mar 2, 2023): Knowledge Distillation, Recommendation Systems
- Distilled Semantics for Comprehensive Scene Understanding from Videos (Mar 31, 2020): Depth Estimation, Knowledge Distillation
- DistilCSE: Effective Knowledge Distillation For Contrastive Sentence Embeddings (Dec 10, 2021): Contrastive Learning, Knowledge Distillation
- Discriminator-Cooperated Feature Map Distillation for GAN Compression (Dec 29, 2022): Image Generation, Knowledge Distillation
- Boosting Light-Weight Depth Estimation Via Knowledge Distillation (May 13, 2021): Computational Efficiency, Depth Estimation
- Block-Wisely Supervised Neural Architecture Search With Knowledge Distillation (Jun 1, 2020): Knowledge Distillation, Neural Architecture Search
- Bridging the Domain Gap: Self-Supervised 3D Scene Understanding with Foundation Models (May 15, 2023): 3D Object Detection, Image Captioning
- Disentangle and Remerge: Interventional Knowledge Distillation for Few-Shot Object Detection from A Conditional Causal Perspective (Aug 26, 2022): Few-Shot Learning, Few-Shot Object Detection
- A Sentence Speaks a Thousand Images: Domain Generalization through Distilling CLIP with Language Guidance (Sep 21, 2023): Domain Generalization, Knowledge Distillation
- Blockwisely Supervised Neural Architecture Search with Knowledge Distillation (Nov 29, 2019): Knowledge Distillation, Neural Architecture Search
- SKDF: A Simple Knowledge Distillation Framework for Distilling Open-Vocabulary Knowledge to Open-world Object Detector (Dec 14, 2023): Knowledge Distillation, Object
- Distilling Knowledge from Graph Convolutional Networks (Mar 23, 2020): Knowledge Distillation, Transfer Learning
- A semi-supervised Teacher-Student framework for surgical tool detection and localization (Aug 21, 2022): Knowledge Distillation, Pseudo Label
- Distillation-Based Training for Multi-Exit Architectures (Oct 1, 2019): Knowledge Distillation
- BPKD: Boundary Privileged Knowledge Distillation For Semantic Segmentation (Jun 13, 2023): Knowledge Distillation, Segmentation
- Breaking Modality Gap in RGBT Tracking: Coupled Knowledge Distillation (Oct 15, 2024): Knowledge Distillation, RGB-T Tracking
- Black-box Few-shot Knowledge Distillation (Jul 25, 2022): Image Classification
- Bridge Past and Future: Overcoming Information Asymmetry in Incremental Object Detection (Jul 16, 2024): Knowledge Distillation, Object Detection
- Adversarially Robust Distillation (May 23, 2019): Adversarial Robustness, Knowledge Distillation
- Fcaformer: Forward Cross Attention in Hybrid Vision Transformer (Nov 14, 2022): Image Classification, Knowledge Distillation
- CaMEL: Mean Teacher Learning for Image Captioning (Feb 21, 2022): Image Captioning, Knowledge Distillation
- CaKDP: Category-aware Knowledge Distillation and Pruning Framework for Lightweight 3D Object Detection (Jan 1, 2024): 3D Object Detection, Knowledge Distillation
- Can LLM Watermarks Robustly Prevent Unauthorized Knowledge Distillation? (Feb 17, 2025): Knowledge Distillation, Language Modeling
- BKDSNN: Enhancing the Performance of Learning-based Spiking Neural Networks Training with Blurred Knowledge Distillation (Jul 12, 2024): Knowledge Distillation
- A Fast Knowledge Distillation Framework for Visual Recognition (Dec 2, 2021): Image Classification
- Action knowledge for video captioning with graph neural networks (Mar 16, 2023): Action Recognition, Graph Neural Network
- Black-Box Attacks on Sequential Recommenders via Data-Free Model Extraction (Sep 1, 2021): Data Poisoning, Knowledge Distillation
- Prototype-based Incremental Few-Shot Semantic Segmentation (Nov 30, 2020): Few-Shot Semantic Segmentation, Incremental Learning
- Boosting Multi-Label Image Classification with Complementary Parallel Self-Distillation (May 23, 2022): Image Classification
- Distilling DETR with Visual-Linguistic Knowledge for Open-Vocabulary Object Detection (Jan 1, 2023): Knowledge Distillation, Language Modeling
- Categorical Relation-Preserving Contrastive Knowledge Distillation for Medical Image Classification (Jul 7, 2021): Classification, Image Classification
- CCL: Continual Contrastive Learning for LiDAR Place Recognition (Mar 24, 2023): Autonomous Driving, Continual Learning
- CEN-HDR: Computationally Efficient Neural Network for Real-Time High Dynamic Range Imaging (Feb 10, 2023): Efficient Neural Network, Knowledge Distillation
- CEKD: Cross-Modal Edge-Privileged Knowledge Distillation for Semantic Scene Understanding Using Only Thermal Images (Feb 22, 2023): Knowledge Distillation, Scene Understanding
- A framework for benchmarking class-out-of-distribution detection and its application to ImageNet (Feb 23, 2023): Benchmarking, Knowledge Distillation
- A Symmetric Dual Encoding Dense Retrieval Framework for Knowledge-Intensive Visual Question Answering (Apr 26, 2023): Decoder, Knowledge Distillation
- CheXseg: Combining Expert Annotations with DNN-generated Saliency Maps for X-ray Segmentation (Feb 21, 2021): Image Segmentation, Knowledge Distillation
- CLIP-guided Federated Learning on Heterogeneous and Long-Tailed Data (Dec 14, 2023): Contrastive Learning, Federated Learning
- Circumventing Outliers of AutoAugment with Knowledge Distillation (Mar 25, 2020): Data Augmentation, General Classification
- Chinese grammatical error correction based on knowledge distillation (Jul 31, 2022): Grammatical Error Correction, Knowledge Distillation
- Class Attention Transfer Based Knowledge Distillation (Apr 25, 2023): Knowledge Distillation, Model Compression
- Distilling Large Vision-Language Model with Out-of-Distribution Generalizability (Jul 6, 2023): Few-Shot Image Classification, Image Classification
- DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter (Oct 2, 2019): Hate Speech Detection, Knowledge Distillation
- Class-Incremental Learning by Knowledge Distillation with Adaptive Feature Consolidation (Apr 2, 2022): Class-Incremental Learning
- ABKD: Pursuing a Proper Allocation of the Probability Mass in Knowledge Distillation via α-β-Divergence (May 7, 2025): Knowledge Distillation
- Distilling Object Detectors with Feature Richness (Nov 1, 2021): Knowledge Distillation, Model Compression
- A Token is Worth over 1,000 Tokens: Efficient Knowledge Distillation through Low-Rank Clone (May 19, 2025): Knowledge Distillation, Transfer Learning
- Class-incremental Novel Class Discovery (Jul 18, 2022): Incremental Learning, Knowledge Distillation
- Class-relation Knowledge Distillation for Novel Class Discovery (Jul 18, 2023): Knowledge Distillation, Novel Class Discovery
- Distilled Split Deep Neural Networks for Edge-Assisted Real-Time Systems (Oct 1, 2019): Edge Computing, Image Classification
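Most entries above tag "Knowledge Distillation". As background for readers new to the topic, here is a minimal sketch of the classic soft-target distillation objective (Hinton et al.): the student is trained on a mix of the temperature-softened teacher distribution and the hard label, with the KL term scaled by T². This is a generic illustration in plain Python, not the method of any listed paper; the temperature T=4 and weight alpha=0.7 are illustrative choices.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T flattens the distribution."""
    scaled = [z / T for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, label, T=4.0, alpha=0.7):
    """Distillation loss for one example:
    alpha * T^2 * KL(teacher_soft || student_soft)
    + (1 - alpha) * cross-entropy(label, student)."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = sum(pt * (math.log(pt) - math.log(ps)) for pt, ps in zip(p_t, p_s))
    ce = -math.log(softmax(student_logits)[label])
    return alpha * T * T * kl + (1.0 - alpha) * ce

# Toy example: a confident teacher and a less confident student on class 0.
teacher = [3.0, 0.2, 0.1]
student = [2.0, 0.5, 0.1]
loss = kd_loss(student, teacher, label=0)
```

The T² factor compensates for the 1/T² shrinkage of soft-target gradients, keeping the two loss terms on a comparable scale as T varies.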