Black-Box Attacks on Sequential Recommenders via Data-Free Model Extraction (Sep 1, 2021): Data Poisoning, Knowledge Distillation
Black-box Few-shot Knowledge Distillation (Jul 25, 2022): Image Classification [code available]
AIM 2024 Challenge on UHD Blind Photo Quality Assessment (Sep 24, 2024): 4K, Computational Efficiency [code available]
Adjoined Networks: A Training Paradigm with Applications to Network Compression (Jun 10, 2020): Knowledge Distillation, Neural Architecture Search [code available]
Block-Wisely Supervised Neural Architecture Search With Knowledge Distillation (Jun 1, 2020): Knowledge Distillation, Neural Architecture Search [code available]
Aligned Structured Sparsity Learning for Efficient Image Super-Resolution (Dec 1, 2021): Image Super-Resolution, Knowledge Distillation [code available]
Can LLM Watermarks Robustly Prevent Unauthorized Knowledge Distillation? (Feb 17, 2025): Knowledge Distillation, Language Modeling [code available]
Boosting Multi-Label Image Classification with Complementary Parallel Self-Distillation (May 23, 2022): Image Classification [code available]
Bootstrapping meaning through listening: Unsupervised learning of spoken sentence embeddings (Oct 23, 2022): Acoustic Unit Discovery, Contrastive Learning [code available]
DeepKD: A Deeply Decoupled and Denoised Knowledge Distillation Trainer (May 21, 2025): Denoising, Knowledge Distillation [code available]
Align-KD: Distilling Cross-Modal Alignment Knowledge for Mobile Vision-Language Model (Dec 2, 2024): Cross-Modal Alignment, Knowledge Distillation [code available]
Align-KD: Distilling Cross-Modal Alignment Knowledge for Mobile Vision-Language Large Model Enhancement (Jan 1, 2025): Cross-Modal Alignment, Knowledge Distillation [code available]
Breaking Modality Gap in RGBT Tracking: Coupled Knowledge Distillation (Oct 15, 2024): Knowledge Distillation, RGB-T Tracking [code available]
Bridge Past and Future: Overcoming Information Asymmetry in Incremental Object Detection (Jul 16, 2024): Knowledge Distillation, Object Detection [code available]
DARTS: Double Attention Reference-based Transformer for Super-resolution (Jul 17, 2023): Image Super-Resolution, Knowledge Distillation [code available]
Bridging the Domain Gap: Self-Supervised 3D Scene Understanding with Foundation Models (May 15, 2023): 3D Object Detection, Image Captioning [code available]
C2KD: Cross-Lingual Cross-Modal Knowledge Distillation for Multilingual Text-Video Retrieval (Oct 7, 2022): Knowledge Distillation, Retrieval [code available]
Fcaformer: Forward Cross Attention in Hybrid Vision Transformer (Nov 14, 2022): Image Classification, Knowledge Distillation [code available]
AlphaFold Distillation for Protein Design (Oct 5, 2022): Diversity, Drug Discovery [code available]
DASpeech: Directed Acyclic Transformer for Fast and High-quality Speech-to-Speech Translation (Oct 11, 2023): Decoder, fr-en [code available]
CaMEL: Mean Teacher Learning for Image Captioning (Feb 21, 2022): Image Captioning, Knowledge Distillation [code available]
AltDiffusion: A Multilingual Text-to-Image Diffusion Model (Aug 19, 2023): Blocking, Concept Alignment [code available]
Better Estimation of the KL Divergence Between Language Models (Apr 14, 2025): Knowledge Distillation [code available]
CEN-HDR: Computationally Efficient Neural Network for Real-Time High Dynamic Range Imaging (Feb 10, 2023): Efficient Neural Network, Knowledge Distillation [code available]
DialoKG: Knowledge-Structure Aware Task-Oriented Dialogue Generation (Apr 19, 2022): Dialogue Generation, Knowledge Distillation [code available]
Dice Semimetric Losses: Optimizing the Dice Score with Soft Labels (Mar 28, 2023): Knowledge Distillation [code available]
CCL: Continual Contrastive Learning for LiDAR Place Recognition (Mar 24, 2023): Autonomous Driving, Continual Learning [code available]
Categorical Relation-Preserving Contrastive Knowledge Distillation for Medical Image Classification (Jul 7, 2021): Image Classification [code available]
AMFD: Distillation via Adaptive Multimodal Fusion for Multispectral Pedestrian Detection (May 21, 2024): Knowledge Distillation, Pedestrian Detection [code available]
Adaptive Multi-Teacher Knowledge Distillation with Meta-Learning (Jun 11, 2023): Knowledge Distillation, Meta-Learning [code available]
Adaptive Multi-Teacher Multi-level Knowledge Distillation (Mar 6, 2021): Knowledge Distillation [code available]
Channel Distillation: Channel-Wise Attention for Knowledge Distillation (Jun 2, 2020): Knowledge Distillation [code available]
Understanding the Role of the Projector in Knowledge Distillation (Mar 20, 2023): Image Classification [code available]
Channel-Aware Distillation Transformer for Depth Estimation on Nano Drones (Mar 18, 2023): Autonomous Navigation, Depth Estimation [code available]
Channel Gating Neural Networks (May 29, 2018): Knowledge Distillation, Network Pruning [code available]
Channel-wise Knowledge Distillation for Dense Prediction (Nov 26, 2020): Knowledge Distillation, Prediction [code available]
CheXseg: Combining Expert Annotations with DNN-generated Saliency Maps for X-ray Segmentation (Feb 21, 2021): Image Segmentation, Knowledge Distillation [code available]
Directed Acyclic Transformer for Non-Autoregressive Machine Translation (May 16, 2022): Knowledge Distillation, Machine Translation [code available]
BERT-of-Theseus: Compressing BERT by Progressive Module Replacing (Feb 7, 2020): Knowledge Distillation, Model Compression [code available]
Class Attention Transfer Based Knowledge Distillation (Apr 25, 2023): Knowledge Distillation, Model Compression [code available]
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter (Oct 2, 2019): Hate Speech Detection, Knowledge Distillation [code available]
Class-Balanced Distillation for Long-Tailed Visual Recognition (Apr 12, 2021): Image Classification, Knowledge Distillation [code available]
Adapt Your Teacher: Improving Knowledge Distillation for Exemplar-free Continual Learning (Aug 18, 2023): Class-Incremental Learning [code available]
Class-Incremental Learning by Knowledge Distillation with Adaptive Feature Consolidation (Apr 2, 2022): Class-Incremental Learning [code available]
Curriculum Temperature for Knowledge Distillation (Nov 29, 2022): Image Classification, Knowledge Distillation [code available]
Distillation-Based Training for Multi-Exit Architectures (Oct 1, 2019): Knowledge Distillation [code available]
CLIP-guided Federated Learning on Heterogeneous and Long-Tailed Data (Dec 14, 2023): Contrastive Learning, Federated Learning [code available]
CLIP-Embed-KD: Computationally Efficient Knowledge Distillation Using Embeddings as Teachers (Apr 9, 2024): Knowledge Distillation, Zero-shot Generalization [code available]
CLIP-KD: An Empirical Study of CLIP Model Distillation (Jul 24, 2023): Contrastive Learning, Cross-Modal Retrieval [code available]
AICSD: Adaptive Inter-Class Similarity Distillation for Semantic Segmentation (Aug 8, 2023): Knowledge Distillation, Semantic Segmentation [code available]