- Class-Balanced Distillation for Long-Tailed Visual Recognition (Apr 12, 2021) | Image Classification, Knowledge Distillation | Code Available
- Dual Discriminator Adversarial Distillation for Data-free Model Compression (Apr 12, 2021) | Data-free Knowledge Distillation, Knowledge Distillation | Unverified
- Data-Free Knowledge Distillation with Soft Targeted Transfer Set Synthesis (Apr 10, 2021) | Data-free Knowledge Distillation, Knowledge Distillation | Unverified
- Towards Enabling Meta-Learning from Target Models (Apr 8, 2021) | Few-Shot Learning, Inductive Bias | Code Available
- GKD: Semi-supervised Graph Knowledge Distillation for Graph-Independent Inference (Apr 8, 2021) | Disease Prediction, Graph Construction | Code Available
- Distilling and Transferring Knowledge via cGAN-generated Samples for Image Classification and Regression (Apr 7, 2021) | General Classification, Image Classification | Code Available
- Content-Aware GAN Compression (Apr 6, 2021) | Image Generation, Image Manipulation | Code Available
- Compressing Visual-linguistic Model via Knowledge Distillation (Apr 5, 2021) | Image Captioning, Knowledge Distillation | Unverified
- Knowledge Distillation For Wireless Edge Learning (Apr 3, 2021) | Cloud Computing, Federated Learning | Code Available
- Topic Modeling for Maternal Health Using Reddit (Apr 1, 2021) | Knowledge Distillation | Unverified
- Dialect Identification through Adversarial Learning and Knowledge Distillation on Romanian BERT (Apr 1, 2021) | Automatic Speech Recognition (ASR) | Unverified
- Decentralized and Model-Free Federated Learning: Consensus-Based Distillation in Function Space (Apr 1, 2021) | Federated Learning, Knowledge Distillation | Unverified
- Unsupervised Domain Expansion for Visual Categorization (Apr 1, 2021) | Domain Adaptation, Knowledge Distillation | Code Available
- Students are the Best Teacher: Exit-Ensemble Distillation with Multi-Exits (Apr 1, 2021) | Classification, General Classification | Code Available
- Is Label Smoothing Truly Incompatible with Knowledge Distillation: An Empirical Study (Apr 1, 2021) | Image Classification | Unverified
- Knowledge Distillation By Sparse Representation Matching (Mar 31, 2021) | Knowledge Distillation, Representation Learning | Code Available
- Fixing the Teacher-Student Knowledge Discrepancy in Distillation (Mar 31, 2021) | Image Classification | Unverified
- HAD-Net: A Hierarchical Adversarial Knowledge Distillation Network for Improved Enhanced Tumour Segmentation Without Post-Contrast Images (Mar 30, 2021) | Knowledge Distillation, Segmentation | Code Available
- Complementary Relation Contrastive Distillation (Mar 29, 2021) | Knowledge Distillation, Relation | Code Available
- Industry Scale Semi-Supervised Learning for Natural Language Understanding (Mar 29, 2021) | Intent Classification | Unverified
- Distilling Virtual Examples for Long-tailed Recognition (Mar 28, 2021) | Knowledge Distillation, Long-tail Learning | Code Available
- Embedding Transfer with Label Relaxation for Improved Metric Learning (Mar 27, 2021) | Knowledge Distillation, Metric Learning | Code Available
- KnowRU: Knowledge Reusing via Knowledge Distillation in Multi-agent Reinforcement Learning (Mar 27, 2021) | Deep Reinforcement Learning, Knowledge Distillation | Unverified
- Distilling a Powerful Student Model via Online Knowledge Distillation (Mar 26, 2021) | Knowledge Distillation | Code Available
- Multimodal Knowledge Expansion (Mar 26, 2021) | Denoising, Knowledge Distillation | Code Available
- A Practical Survey on Faster and Lighter Transformers (Mar 26, 2021) | Knowledge Distillation, Survey | Unverified
- Distilling Object Detectors via Decoupled Features (Mar 26, 2021) | Image Classification | Code Available
- Hands-on Guidance for Distilling Object Detectors (Mar 26, 2021) | Knowledge Distillation, Object | Unverified
- Leaning Compact and Representative Features for Cross-Modality Person Re-Identification (Mar 26, 2021) | Cross-Modality Person Re-identification, Knowledge Distillation | Code Available
- Weakly-Supervised Domain Adaptation of Deep Regression Trackers via Reinforced Knowledge Distillation (Mar 26, 2021) | Domain Adaptation, Knowledge Distillation | Unverified
- Pruning-then-Expanding Model for Domain Adaptation of Neural Machine Translation (Mar 25, 2021) | Domain Adaptation, Knowledge Distillation | Code Available
- Spirit Distillation: Precise Real-time Semantic Segmentation of Road Scenes with Insufficient Data (Mar 25, 2021) | Autonomous Driving, Few-Shot Learning | Unverified
- The NLP Cookbook: Modern Recipes for Transformer based Deep Learning Architectures (Mar 23, 2021) | Information Retrieval, Knowledge Distillation | Unverified
- Student Network Learning via Evolutionary Knowledge Distillation (Mar 23, 2021) | Knowledge Distillation, Transfer Learning | Unverified
- Balanced softmax cross-entropy for incremental learning with and without memory (Mar 23, 2021) | Class-Incremental Learning | Unverified
- ROSITA: Refined BERT cOmpreSsion with InTegrAted techniques (Mar 21, 2021) | Knowledge Distillation | Code Available
- Compacting Deep Neural Networks for Internet of Things: Methods and Applications (Mar 20, 2021) | Diversity, Knowledge Distillation | Unverified
- Variational Knowledge Distillation for Disease Classification in Chest X-Rays (Mar 19, 2021) | Classification, General Classification | Unverified
- Online Lifelong Generalized Zero-Shot Learning (Mar 19, 2021) | Continual Learning, Generalized Zero-Shot Learning | Code Available
- Cost-effective Deployment of BERT Models in Serverless Environment (Mar 19, 2021) | Knowledge Distillation, Semantic Textual Similarity | Unverified
- Self-Supervised Adaptation for Video Super-Resolution (Mar 18, 2021) | Image Super-Resolution, Knowledge Distillation | Code Available
- Human-Inspired Multi-Agent Navigation using Knowledge Distillation (Mar 18, 2021) | Collision Avoidance, Knowledge Distillation | Code Available
- Similarity Transfer for Knowledge Distillation (Mar 18, 2021) | Knowledge Distillation | Unverified
- Transformer-based ASR Incorporating Time-reduction Layer and Fine-tuning with Self-Knowledge Distillation (Mar 17, 2021) | Automatic Speech Recognition (ASR) | Unverified
- Leveraging Recent Advances in Deep Learning for Audio-Visual Emotion Recognition (Mar 16, 2021) | Deep Learning, Emotion Recognition | Unverified
- Robustly Optimized and Distilled Training for Natural Language Understanding (Mar 16, 2021) | Knowledge Distillation, Machine Reading Comprehension | Unverified
- Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation (Mar 15, 2021) | Data Augmentation, Knowledge Distillation | Code Available
- Robust Model Compression Using Deep Hypotheses (Mar 13, 2021) | Binary Classification, Knowledge Distillation | Code Available
- A New Training Framework for Deep Neural Network (Mar 12, 2021) | Knowledge Distillation | Unverified
- Beyond Self-Supervision: A Simple Yet Effective Network Distillation Alternative to Improve Backbones (Mar 10, 2021) | Knowledge Distillation, Object Detection | Code Available
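The common primitive behind most of the papers listed above is knowledge distillation: a student model is trained to match a teacher's temperature-softened output distribution. A minimal, framework-free sketch of the classic Hinton-style distillation loss (pure Python, illustrative only; the function and variable names here are our own, not taken from any of the listed papers):

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T yields a softer distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL divergence between the temperature-softened teacher and student
    distributions, scaled by T^2 so gradients stay comparable across T."""
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # soft student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (T ** 2) * kl

# Identical logits give zero loss; any mismatch gives a positive loss.
print(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # 0.0
print(distillation_loss([0.5, 1.5, 0.0], [2.0, 1.0, 0.1]) > 0)  # True
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels, weighted by a mixing coefficient; the many variants above (contrastive, relational, adversarial, data-free, self-distillation) replace or augment exactly this matching term.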