- Can LLM Watermarks Robustly Prevent Unauthorized Knowledge Distillation? (Feb 17, 2025). Tags: Knowledge Distillation, Language Modeling.
- Distilling a Powerful Student Model via Online Knowledge Distillation (Mar 26, 2021). Tags: Knowledge Distillation. [code available]
- Lightweight Transformers for Clinical Natural Language Processing (Feb 9, 2023). Tags: Continual Learning, Knowledge Distillation. [code available]
- A Discrepancy Aware Framework for Robust Anomaly Detection (Oct 11, 2023). Tags: Anomaly Detection, Decoder. [code available]
- Distilled Semantics for Comprehensive Scene Understanding from Videos (Mar 31, 2020). Tags: Depth Estimation, Knowledge Distillation. [code available]
- Distilling Audio-Visual Knowledge by Compositional Contrastive Learning (Apr 22, 2021). Tags: Audio Tagging, Audio-Visual Learning. [code available]
- Camera clustering for scalable stream-based active distillation (Apr 16, 2024). Tags: Clustering, Knowledge Distillation. [code available]
- Contrastive Deep Supervision (Jul 12, 2022). Tags: Contrastive Learning, Fine-Grained Image Classification. [code available]
- Anomaly Detection in Video via Self-Supervised and Multi-Task Learning (Nov 15, 2020). Tags: Abnormal Event Detection In Video, Anomaly Detection. [code available]
- Decomposed Knowledge Distillation for Class-Incremental Semantic Segmentation (Oct 12, 2022). Tags: Class-Incremental Semantic Segmentation, Knowledge Distillation. [code available]
- DistillBEV: Boosting Multi-Camera 3D Object Detection with Cross-Modal Knowledge Distillation (Sep 26, 2023). Tags: 3D Object Detection, Autonomous Driving. [code available]
- Contrastive Model Inversion for Data-Free Knowledge Distillation (May 18, 2021). Tags: Contrastive Learning, Data-Free Knowledge Distillation. [code available]
- Contrastive Representation Distillation (Oct 23, 2019). Tags: Contrastive Learning, Knowledge Distillation. [code available]
- AutoGAN-Distiller: Searching to Compress Generative Adversarial Networks (Jun 15, 2020). Tags: AutoML, Knowledge Distillation. [code available]
- Distilling Autoregressive Models to Obtain High-Performance Non-Autoregressive Solvers for Vehicle Routing Problems with Faster Inference Speed (Dec 19, 2023). Tags: Knowledge Distillation. [code available]
- LQER: Low-Rank Quantization Error Reconstruction for LLMs (Feb 4, 2024). Tags: Knowledge Distillation, Quantization. [code available]
- Distilling Object Detectors via Decoupled Features (Mar 26, 2021). Tags: Image Classification. [code available]
- Deep Graph-level Anomaly Detection by Glocal Knowledge Distillation (Dec 19, 2021). Tags: Anomaly Detection, Knowledge Distillation. [code available]
- DialoKG: Knowledge-Structure Aware Task-Oriented Dialogue Generation (Apr 19, 2022). Tags: Dialogue Generation, Knowledge Distillation. [code available]
- Mask-invariant Face Recognition through Template-level Knowledge Distillation (Dec 10, 2021). Tags: Face Recognition, Knowledge Distillation. [code available]
- mCLIP: Multilingual CLIP via Cross-lingual Transfer (Jul 10, 2023). Tags: Contrastive Learning, Cross-Lingual Transfer. [code available]
- MDFlow: Unsupervised Optical Flow Learning by Reliable Mutual Knowledge Distillation (Nov 11, 2022). Tags: Blocking, Data Augmentation. [code available]
- MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks (Sep 17, 2020). Tags: Image Classification, Knowledge Distillation. [code available]
- ME-D2N: Multi-Expert Domain Decompositional Network for Cross-Domain Few-Shot Learning (Oct 11, 2022). Tags: Cross-Domain Few-Shot Learning. [code available]
- Distillation and Refinement of Reasoning in Small Language Models for Document Re-ranking (Apr 4, 2025). Tags: Document Ranking, Information Retrieval. [code available]
- Meta-DMoE: Adapting to Domain Shift by Meta-Distillation from Mixture-of-Experts (Oct 8, 2022). Tags: Domain Generalization, Knowledge Distillation. [code available]
- Meta-Learning based Degradation Representation for Blind Super-Resolution (Jul 28, 2022). Tags: Blind Super-Resolution, Knowledge Distillation. [code available]
- BERT Learns to Teach: Knowledge Distillation with Meta Learning (Jun 8, 2021). Tags: Knowledge Distillation, Meta-Learning. [code available]
- CaMEL: Mean Teacher Learning for Image Captioning (Feb 21, 2022). Tags: Image Captioning, Knowledge Distillation. [code available]
- MetricGAN-OKD: Multi-Metric Optimization of MetricGAN via Online Knowledge Distillation for Speech Enhancement (Jul 24, 2023). Tags: Knowledge Distillation, Speech Enhancement. [code available]
- Distillation-Based Training for Multi-Exit Architectures (Oct 1, 2019). Tags: Knowledge Distillation. [code available]
- CaKDP: Category-aware Knowledge Distillation and Pruning Framework for Lightweight 3D Object Detection (Jan 1, 2024). Tags: 3D Object Detection, Knowledge Distillation. [code available]
- DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter (Oct 2, 2019). Tags: Hate Speech Detection, Knowledge Distillation. [code available]
- DeepKD: A Deeply Decoupled and Denoised Knowledge Distillation Trainer (May 21, 2025). Tags: Denoising, Knowledge Distillation. [code available]
- MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition (Aug 11, 2022). Tags: Data Augmentation, Image Classification. [code available]
- ML-Doctor: Holistic Risk Assessment of Inference Attacks Against Machine Learning Models (Feb 4, 2021). Tags: Attribute, BIG-bench Machine Learning. [code available]
- Distilling Knowledge from Graph Convolutional Networks (Mar 23, 2020). Tags: Knowledge Distillation, Transfer Learning. [code available]
- MobileIQA: Exploiting Mobile-level Diverse Opinion Network For No-Reference Image Quality Assessment Using Knowledge Distillation (Sep 2, 2024). Tags: Computational Efficiency, Image Quality Assessment. [code available]
- A Knowledge Distillation Framework For Enhancing Ear-EEG Based Sleep Staging With Scalp-EEG Data (Oct 27, 2022). Tags: Domain Adaptation, EEG. [code available]
- Modality-Balanced Learning for Multimedia Recommendation (Jul 26, 2024). Tags: Collaborative Filtering, Counterfactual. [code available]
- MonoSKD: General Distillation Framework for Monocular 3D Object Detection via Spearman Correlation Coefficient (Oct 17, 2023). Tags: 3D Object Detection, GPU. [code available]
- MonoTAKD: Teaching Assistant Knowledge Distillation for Monocular 3D Object Detection (Apr 7, 2024). Tags: 3D Object Detection, Autonomous Driving. [code available]
- Mosaicking to Distill: Knowledge Distillation from Out-of-Domain Data (Oct 27, 2021). Tags: Knowledge Distillation, Semantic Segmentation. [code available]
- MPCFormer: fast, performant and private Transformer inference with MPC (Nov 2, 2022). Tags: Knowledge Distillation. [code available]
- DistilCSE: Effective Knowledge Distillation For Contrastive Sentence Embeddings (Dec 10, 2021). Tags: Contrastive Learning, Knowledge Distillation. [code available]
- Creating Something from Nothing: Unsupervised Knowledge Distillation for Cross-Modal Hashing (Apr 1, 2020). Tags: Knowledge Distillation, Retrieval. [code available]
- Multi-Label Knowledge Distillation (Aug 12, 2023). Tags: Binary Classification, Knowledge Distillation. [code available]
- Multi-Level Branched Regularization for Federated Learning (Jul 14, 2022). Tags: Federated Learning, Knowledge Distillation. [code available]
- Multimodal and multiview distillation for real-time player detection on a football field (Apr 16, 2020). Tags: Data Augmentation, Knowledge Distillation. [code available]
- DisCo: Distilled Student Models Co-training for Semi-supervised Text Mining (May 20, 2023). Tags: Extractive Summarization, Knowledge Distillation. [code available]
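Most entries above center on knowledge distillation in some form. As background (not the method of any particular paper listed), here is a minimal sketch of the standard soft-label distillation objective: a temperature-softened teacher distribution matched by the student via KL divergence, blended with ordinary cross-entropy on the hard label. All function names and parameter values are illustrative, and the code handles a single example in pure Python for clarity.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax over one example's logits.
    # Higher temperature yields a softer (more uniform) distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, label,
                      temperature=4.0, alpha=0.5):
    """Soft-label distillation for a single example.

    KL(teacher || student) at the given temperature, scaled by T^2 so the
    soft-target gradients keep roughly the same magnitude as the hard-label
    term, blended with cross-entropy on the ground-truth label.
    """
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    kl = sum(pt * (math.log(pt) - math.log(ps))
             for pt, ps in zip(p_teacher, p_student)) * temperature ** 2
    ce = -math.log(softmax(student_logits)[label])
    return alpha * kl + (1 - alpha) * ce
```

When the student's logits match the teacher's, the KL term vanishes and only the hard-label cross-entropy remains; the `alpha` knob trades off imitation of the teacher against fitting the labels.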