All papers listed below have code available.

- NormKD: Normalized Logits for Knowledge Distillation (Aug 1, 2023). Tasks: Image Classification
- BearingPGA-Net: A Lightweight and Deployable Bearing Fault Diagnosis Network via Decoupled Knowledge Distillation and FPGA Acceleration (Jul 31, 2023). Tasks: CPU, Fault Diagnosis
- f-Divergence Minimization for Sequence-Level Knowledge Distillation (Jul 27, 2023). Tasks: Knowledge Distillation
- Fitting Auditory Filterbanks with Multiresolution Neural Networks (Jul 25, 2023). Tasks: Inductive Bias, Knowledge Distillation
- MetricGAN-OKD: Multi-Metric Optimization of MetricGAN via Online Knowledge Distillation for Speech Enhancement (Jul 24, 2023). Tasks: Knowledge Distillation, Speech Enhancement
- CLIP-KD: An Empirical Study of CLIP Model Distillation (Jul 24, 2023). Tasks: Contrastive Learning, Cross-Modal Retrieval
- DPM-OT: A New Diffusion Probabilistic Model Based on Optimal Transport (Jul 21, 2023). Tasks: Denoising, Knowledge Distillation
- Reverse Knowledge Distillation: Training a Large Model using a Small One for Retinal Image Matching on Limited Data (Jul 20, 2023). Tasks: Image Registration, Keypoint Detection
- FedDefender: Client-Side Attack-Tolerant Federated Learning (Jul 18, 2023). Tasks: Federated Learning, Knowledge Distillation
- Class-relation Knowledge Distillation for Novel Class Discovery (Jul 18, 2023). Tasks: Knowledge Distillation, Novel Class Discovery
- Cumulative Spatial Knowledge Distillation for Vision Transformers (Jul 17, 2023). Tasks: Inductive Bias, Knowledge Distillation
- DARTS: Double Attention Reference-based Transformer for Super-resolution (Jul 17, 2023). Tasks: Image Super-Resolution, Knowledge Distillation
- Multimodal Distillation for Egocentric Action Recognition (Jul 14, 2023). Tasks: Action Recognition, Knowledge Distillation
- Learning to Retrieve In-Context Examples for Large Language Models (Jul 14, 2023). Tasks: In-Context Learning, Knowledge Distillation
- mCLIP: Multilingual CLIP via Cross-lingual Transfer (Jul 10, 2023). Tasks: Contrastive Learning, Cross-Lingual Transfer
- CMDFusion: Bidirectional Fusion Network with Cross-modality Knowledge Distillation for LIDAR Semantic Segmentation (Jul 9, 2023). Tasks: Autonomous Vehicles, Knowledge Distillation
- Distilling Large Vision-Language Model with Out-of-Distribution Generalizability (Jul 6, 2023). Tasks: Few-Shot Image Classification, Image Classification
- MDViT: Multi-domain Vision Transformer for Small Medical Image Segmentation Datasets (Jul 5, 2023). Tasks: Efficient ViTs, Image Segmentation
- FedDefender: Backdoor Attack Defense in Federated Learning (Jul 2, 2023). Tasks: Backdoor Attack, Data Poisoning
- Quantization Variation: A New Perspective on Training Transformers with Low-Bit Precision (Jul 1, 2023). Tasks: Knowledge Distillation, Model Compression
- Audio Embeddings as Teachers for Music Classification (Jun 30, 2023). Tasks: Classification, Information Retrieval
- NaturalInversion: Data-Free Image Synthesis Improving Real-World Consistency (Jun 29, 2023). Tasks: Image Generation, Knowledge Distillation
- Mitigating Accuracy-Robustness Trade-off via Balanced Multi-Teacher Adversarial Distillation (Jun 28, 2023). Tasks: Adversarial Robustness, Knowledge Distillation
- On information captured by neural networks: connections with memorization and generalization (Jun 28, 2023). Tasks: Informativeness, Knowledge Distillation
- Robust Spatiotemporal Traffic Forecasting with Reinforced Dynamic Adversarial Training (Jun 25, 2023). Tasks: Adversarial Robustness, Knowledge Distillation
- CrossKD: Cross-Head Knowledge Distillation for Object Detection (Jun 20, 2023). Tasks: Dense Object Detection, Knowledge Distillation
- Coaching a Teachable Student (Jun 16, 2023). Tasks: CARLA longest6, Knowledge Distillation
- BPKD: Boundary Privileged Knowledge Distillation For Semantic Segmentation (Jun 13, 2023). Tasks: Knowledge Distillation, Segmentation
- Adaptive Multi-Teacher Knowledge Distillation with Meta-Learning (Jun 11, 2023). Tasks: Knowledge Distillation, Meta-Learning
- Are Intermediate Layers and Labels Really Necessary? A General Language Model Distillation Method (Jun 11, 2023). Tasks: Knowledge Distillation, Language Modeling
- GKD: A General Knowledge Distillation Framework for Large-scale Pre-trained Language Model (Jun 11, 2023). Tasks: General Knowledge, Knowledge Distillation
- RankFormer: Listwise Learning-to-Rank Using Listwide Labels (Jun 9, 2023). Tasks: Knowledge Distillation, Learning-To-Rank
- Self-supervised Audio Teacher-Student Transformer for Both Clip-level and Frame-level Tasks (Jun 7, 2023). Tasks: Audio Classification, Audio Tagging
- Orca: Progressive Learning from Complex Explanation Traces of GPT-4 (Jun 5, 2023). Tasks: Imitation Learning, Knowledge Distillation
- I^3 Retriever: Incorporating Implicit Interaction in Pre-trained Language Models for Passage Retrieval (Jun 4, 2023). Tasks: Knowledge Distillation, Passage Retrieval
- Revisiting Data-Free Knowledge Distillation with Poisoned Teachers (Jun 4, 2023). Tasks: Backdoor Defense for Data-Free Distillation with Poisoned Teachers, Data-free Knowledge Distillation
- PlaSma: Making Small Language Models Better Procedural Knowledge Models for (Counterfactual) Planning (May 31, 2023). Tasks: Common Sense Reasoning, counterfactual
- Semi-supervised Pathological Image Segmentation via Cross Distillation of Multiple Attentions (May 30, 2023). Tasks: Decoder, Image Segmentation
- Learning to Learn from APIs: Black-Box Data-Free Meta-Learning (May 28, 2023). Tasks: Few-Shot Learning, Knowledge Distillation
- DPHuBERT: Joint Distillation and Pruning of Self-Supervised Speech Models (May 28, 2023). Tasks: Knowledge Distillation, Self-Supervised Learning
- One-Step Knowledge Distillation and Fine-Tuning in Using Large Pre-Trained Self-Supervised Learning Models for Speaker Verification (May 27, 2023). Tasks: Knowledge Distillation, Self-Supervised Learning
- FoPro-KD: Fourier Prompted Effective Knowledge Distillation for Long-Tailed Medical Image Recognition (May 27, 2023). Tasks: Image Classification
- Towards Better Entity Linking with Multi-View Enhanced Distillation (May 27, 2023). Tasks: Entity Linking, Knowledge Distillation
- Improving Knowledge Distillation via Regularizing Feature Norm and Direction (May 26, 2023). Tasks: Domain Adaptation, Knowledge Distillation
- OVO: Open-Vocabulary Occupancy (May 25, 2023). Tasks: Knowledge Distillation, Prediction
- Towards Higher Pareto Frontier in Multilingual Machine Translation (May 25, 2023). Tasks: Knowledge Distillation, Machine Translation
- VanillaKD: Revisit the Power of Vanilla Knowledge Distillation from Small Scale to Large Scale (May 25, 2023). Tasks: Data Augmentation, Knowledge Distillation
- Knowledge Diffusion for Distillation (May 25, 2023). Tasks: Denoising, Image Classification
- How to Distill your BERT: An Empirical Study on the Impact of Weight Initialisation and Distillation Objectives (May 24, 2023). Tasks: Knowledge Distillation, QNLI
- NORM: Knowledge Distillation via N-to-One Representation Matching (May 23, 2023). Tasks: Knowledge Distillation