EvDistill: Asynchronous Events to End-task Learning via Bidirectional Reconstruction-guided Cross-modal Knowledge Distillation (Nov 24, 2021). Tags: Event-based Object Segmentation, Knowledge Distillation
Improving Knowledge Distillation via Category Structure (Aug 1, 2020). Tags: Knowledge Distillation
Initialization and Regularization of Factorized Neural Layers (May 3, 2021). Tags: Knowledge Distillation, Model Compression
CTC-based Non-autoregressive Textless Speech-to-Speech Translation (Jun 11, 2024). Tags: Knowledge Distillation, Machine Translation
AIM 2024 Challenge on UHD Blind Photo Quality Assessment (Sep 24, 2024). Tags: 4K, Computational Efficiency
Expanding Scene Graph Boundaries: Fully Open-vocabulary Scene Graph Generation via Visual-Concept Alignment and Retention (Nov 18, 2023). Tags: Concept Alignment, Graph Generation
Evolving Search Space for Neural Architecture Search (Nov 22, 2020). Tags: Knowledge Distillation, Neural Architecture Search
Contrastive Deep Supervision (Jul 12, 2022). Tags: Contrastive Learning, Fine-Grained Image Classification
Contrastive Distillation on Intermediate Representations for Language Model Compression (Sep 29, 2020). Tags: Knowledge Distillation, Language Modeling
Exploring Complementary Strengths of Invariant and Equivariant Representations for Few-Shot Learning (Mar 1, 2021). Tags: Few-Shot Image Classification, Few-Shot Learning
Autoencoders as Cross-Modal Teachers: Can Pretrained 2D Image Transformers Help 3D Representation Learning? (Dec 16, 2022). Tags: 3D Point Cloud Classification, Few-Shot 3D Point Cloud Classification
Contrastive Model Inversion for Data-Free Knowledge Distillation (May 18, 2021). Tags: Contrastive Learning, Data-Free Knowledge Distillation
Contrastive Representation Distillation (Oct 23, 2019). Tags: Contrastive Learning, Knowledge Distillation
AutoGAN-Distiller: Searching to Compress Generative Adversarial Networks (Jun 15, 2020). Tags: AutoML, Knowledge Distillation
Exploring Inter-Channel Correlation for Diversity-Preserved Knowledge Distillation (Feb 8, 2022). Tags: Diversity, Knowledge Distillation
Exploring Inter-Channel Correlation for Diversity-Preserved Knowledge Distillation (Jan 1, 2021). Tags: Diversity, Knowledge Distillation
AdaptGuard: Defending Against Universal Attacks for Model Adaptation (Mar 19, 2023). Tags: Knowledge Distillation
The Augmented Image Prior: Distilling 1000 Classes by Extrapolating from a Single Image (Dec 1, 2021). Tags: Knowledge Distillation
FitNets: Hints for Thin Deep Nets (Dec 19, 2014). Tags: Knowledge Distillation
Generic-to-Specific Distillation of Masked Autoencoders (Feb 28, 2023). Tags: Decoder, Image Classification
FairDistillation: Mitigating Stereotyping in Language Models (Jul 10, 2022). Tags: Knowledge Distillation
CaMEL: Mean Teacher Learning for Image Captioning (Feb 21, 2022). Tags: Image Captioning, Knowledge Distillation
Improve Object Detection with Feature-based Knowledge Distillation: Towards Accurate and Efficient Detectors (Jan 1, 2021). Tags: Image Classification
CaKDP: Category-aware Knowledge Distillation and Pruning Framework for Lightweight 3D Object Detection (Jan 1, 2024). Tags: 3D Object Detection, Knowledge Distillation
FAMIE: A Fast Active Learning Framework for Multilingual Information Extraction (Feb 16, 2022). Tags: Active Learning, Knowledge Distillation
Fast Neural Architecture Search of Compact Semantic Segmentation Models via Auxiliary Cells (Oct 25, 2018). Tags: Depth Estimation, Depth Prediction
One Step Diffusion-based Super-Resolution with Time-Aware Distillation (Aug 14, 2024). Tags: Image Super-Resolution, Knowledge Distillation
One-Step Knowledge Distillation and Fine-Tuning in Using Large Pre-Trained Self-Supervised Learning Models for Speaker Verification (May 27, 2023). Tags: Knowledge Distillation, Self-Supervised Learning
Cumulative Spatial Knowledge Distillation for Vision Transformers (Jul 17, 2023). Tags: Inductive Bias, Knowledge Distillation
FastFormers: Highly Efficient Transformer Models for Natural Language Understanding (Oct 26, 2020). Tags: CPU, GPU
Faster ILOD: Incremental Learning for Object Detectors based on Faster RCNN (Mar 9, 2020). Tags: Incremental Learning, Knowledge Distillation
Online Knowledge Distillation for Efficient Pose Estimation (Aug 4, 2021). Tags: Knowledge Distillation, Pose Estimation
Improved Techniques for Training Adaptive Deep Networks (Aug 17, 2019). Tags: Computational Efficiency, Knowledge Distillation
Fast-Vid2Vid: Spatial-Temporal Compression for Video-to-Video Synthesis (Jul 11, 2022). Tags: GPU, Knowledge Distillation
FDCNet: Feature Drift Compensation Network for Class-Incremental Weakly Supervised Object Localization (Sep 17, 2023). Tags: Class-Incremental Learning, Incremental Learning
FCS: Feature Calibration and Separation for Non-Exemplar Class Incremental Learning (Jan 1, 2024). Tags: Class-Incremental Learning
Avatar Knowledge Distillation: Self-ensemble Teacher Paradigm with Uncertainty (May 4, 2023). Tags: Knowledge Distillation, Object Detection
On Representation Knowledge Distillation for Graph Neural Networks (Nov 9, 2021). Tags: Contrastive Learning, Knowledge Distillation
A Knowledge Distillation Framework For Enhancing Ear-EEG Based Sleep Staging With Scalp-EEG Data (Oct 27, 2022). Tags: Domain Adaptation, EEG
Improving Continual Relation Extraction by Distinguishing Analogous Semantics (May 11, 2023). Tags: Continual Relation Extraction, Knowledge Distillation
Implicit Chain of Thought Reasoning via Knowledge Distillation (Nov 2, 2023). Tags: Knowledge Distillation, Math
FedACK: Federated Adversarial Contrastive Knowledge Distillation for Cross-Lingual and Cross-Model Social Bot Detection (Mar 10, 2023). Tags: Contrastive Learning, Knowledge Distillation
FedDAT: An Approach for Foundation Model Finetuning in Multi-Modal Heterogeneous Federated Learning (Aug 21, 2023). Tags: Federated Learning, Knowledge Distillation
FedCL: Federated Multi-Phase Curriculum Learning to Synchronously Correlate User Heterogeneity (Nov 14, 2022). Tags: Federated Learning, Knowledge Distillation
Curriculum Learning for Dense Retrieval Distillation (Apr 28, 2022). Tags: Knowledge Distillation, Passage Retrieval
Creating Something from Nothing: Unsupervised Knowledge Distillation for Cross-Modal Hashing (Apr 1, 2020). Tags: Knowledge Distillation, Retrieval
Federated Knowledge Distillation (Nov 4, 2020). Tags: Federated Learning, Knowledge Distillation
KNOT: Knowledge Distillation using Optimal Transport for Solving NLP Tasks (Oct 6, 2021). Tags: Emotion Recognition, Emotion Recognition in Conversation
FedMD: Heterogenous Federated Learning via Model Distillation (Oct 8, 2019). Tags: Federated Learning, Knowledge Distillation
Improve Cross-Architecture Generalization on Dataset Distillation (Feb 20, 2024). Tags: Dataset Distillation, Knowledge Distillation
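Most papers in this list build on the classic knowledge-distillation objective: the student is trained to match the teacher's temperature-softened output distribution, with the loss scaled by T^2 so its gradient magnitude stays comparable across temperatures (Hinton et al., 2015). A minimal pure-Python sketch of that objective (the function names `softmax` and `kd_loss` are illustrative, not from any particular paper above):

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T gives a softer distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) between temperature-softened distributions,
    # scaled by T^2 as in Hinton et al. (2015). In practice this term is
    # combined with the ordinary cross-entropy on ground-truth labels.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl

# Identical logits give zero distillation loss.
print(round(kd_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]), 6))  # 0.0
```

Many of the entries above (contrastive, feature-based, relational, federated distillation) replace or augment this output-matching term with losses on intermediate representations, but the teacher-to-student transfer pattern is the same.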