Pro-KD: Progressive Distillation by Following the Footsteps of the Teacher (Oct 16, 2021) · Image Classification
Robustness Challenges in Model Distillation and Pruning for Natural Language Understanding (Oct 16, 2021) · Knowledge Distillation · Model Compression
A Short Study on Compressing Decoder-Based Language Models (Oct 16, 2021) · Decoder · Knowledge Distillation
Know your tools well: Better and faster QA with synthetic examples (Oct 16, 2021) · Diversity · Knowledge Distillation
Sparse Progressive Distillation: Resolving Overfitting under Pretrain-and-Finetune Paradigm (Oct 15, 2021) · Knowledge Distillation
From Multimodal to Unimodal Attention in Transformers using Knowledge Distillation (Oct 15, 2021) · Knowledge Distillation · Multimodal Deep Learning
Multilingual Neural Machine Translation: Can Linguistic Hierarchies Help? (Oct 15, 2021) · Knowledge Distillation · Machine Translation
Kronecker Decomposition for GPT Compression (Oct 15, 2021) · Knowledge Distillation · Language Modeling
Language Modelling via Learning to Rank (Oct 13, 2021) · Knowledge Distillation · Language Modelling
False Negative Distillation and Contrastive Learning for Personalized Outfit Recommendation (Oct 13, 2021) · Contrastive Learning · Data Augmentation
CONetV2: Efficient Auto-Channel Size Optimization for CNNs (Oct 13, 2021) · Knowledge Distillation · Neural Architecture Search
Compact CNN Models for On-device Ocular-based User Recognition in Mobile Devices (Oct 11, 2021) · Knowledge Distillation · Network Pruning · Code available
Rectifying the Data Bias in Knowledge Distillation (Oct 11, 2021) · Face Recognition · Face Verification
Towards Streaming Egocentric Action Anticipation (Oct 11, 2021) · Action Anticipation · Knowledge Distillation
Towards Data-Free Domain Generalization (Oct 9, 2021) · Data-free Knowledge Distillation · Domain Generalization
Visualizing the embedding space to explain the effect of knowledge distillation (Oct 9, 2021) · Knowledge Distillation · Code available
Cross-modal Knowledge Distillation for Vision-to-Sensor Action Recognition (Oct 8, 2021) · Action Recognition · Activity Recognition
Knowledge Distillation for Neural Transducers from Large Self-Supervised Pre-trained Models (Oct 7, 2021) · Automatic Speech Recognition (ASR) · Code available
Peer Collaborative Learning for Polyphonic Sound Event Detection (Oct 7, 2021) · Event Detection · Knowledge Distillation
Online Hyperparameter Meta-Learning with Hypergradient Distillation (Oct 6, 2021) · Hyperparameter Optimization · Knowledge Distillation
Inter-Domain Alignment for Predicting High-Resolution Brain Networks Using Teacher-Student Learning (Oct 6, 2021) · Decoder · Domain Adaptation
On the Interplay Between Sparsity, Naturalness, Intelligibility, and Prosody in Speech Synthesis (Oct 4, 2021) · Knowledge Distillation · Speech Synthesis · Code available
Student Helping Teacher: Teacher Evolution via Self-Knowledge Distillation (Oct 1, 2021) · Knowledge Distillation · Self-Knowledge Distillation
Deep Neural Compression Via Concurrent Pruning and Self-Distillation (Sep 30, 2021) · Knowledge Distillation · Language Modeling · Code available
Improving Neural Ranking via Lossless Knowledge Distillation (Sep 30, 2021) · Knowledge Distillation · Learning-To-Rank
Automated Channel Pruning with Learned Importance (Sep 29, 2021) · Denoising · GPU
Distilling GANs with Style-Mixed Triplets for X2I Translation with Limited Data (Sep 29, 2021) · Image Generation · Knowledge Distillation
Explaining Knowledge Graph Embedding via Latent Rule Learning (Sep 29, 2021) · Graph Embedding · Knowledge Distillation
SeqPATE: Differentially Private Text Generation via Knowledge Distillation (Sep 29, 2021) · Knowledge Distillation · Sentence
Not All Regions are Worthy to be Distilled: Region-aware Knowledge Distillation Towards Efficient Image-to-Image Translation (Sep 29, 2021) · Contrastive Learning
Scaling Fair Learning to Hundreds of Intersectional Groups (Sep 29, 2021) · Attribute · Fairness
Self-Slimming Vision Transformer (Sep 29, 2021) · Knowledge Distillation
Convolutional Neural Network Compression through Generalized Kronecker Product Decomposition (Sep 29, 2021) · Image Classification
Stingy Teacher: Sparse Logits Suffice to Fail Knowledge Distillation (Sep 29, 2021) · Knowledge Distillation
Generate, Annotate, and Learn: Generative Models Advance Self-Training and Knowledge Distillation (Sep 29, 2021) · Few-Shot Learning · Knowledge Distillation
To Smooth or not to Smooth? On Compatibility between Label Smoothing and Knowledge Distillation (Sep 29, 2021) · Image Classification
Adaptive Label Smoothing with Self-Knowledge (Sep 29, 2021) · Knowledge Distillation · Machine Translation
Representation Consolidation from Multiple Expert Teachers (Sep 29, 2021) · Knowledge Distillation
Source-Target Unified Knowledge Distillation for Memory-Efficient Federated Domain Adaptation on Edge Devices (Sep 29, 2021) · Domain Adaptation · Knowledge Distillation
Wakening Past Concepts without Past Data: Class-incremental Learning from Placebos (Sep 29, 2021) · Class-Incremental Learning
A Unified Knowledge Distillation Framework for Deep Directed Graphical Models (Sep 29, 2021) · Continual Learning · Federated Learning
Learning Efficient Image Super-Resolution Networks via Structure-Regularized Pruning (Sep 29, 2021) · Image Super-Resolution · Knowledge Distillation
Understanding the Success of Knowledge Distillation -- A Data Augmentation Perspective (Sep 29, 2021) · Active Learning · Data Augmentation
Self-supervised Models are Good Teaching Assistants for Vision Transformers (Sep 29, 2021) · Image Classification · Knowledge Distillation
MOBA: Multi-teacher Model Based Reinforcement Learning (Sep 29, 2021) · Decision Making · Knowledge Distillation
Fast and Efficient Once-For-All Networks for Diverse Hardware Deployment (Sep 29, 2021) · GPU
Self-Distilled Pruning Of Neural Networks (Sep 29, 2021) · Knowledge Distillation · Language Modeling
Exploiting Knowledge Distillation for Few-Shot Image Generation (Sep 29, 2021) · Diversity · Image Generation
A Comprehensive Overhaul of Distilling Unconditional GANs (Sep 29, 2021) · Knowledge Distillation
Reducing the Teacher-Student Gap via Adaptive Temperatures (Sep 29, 2021) · Knowledge Distillation