- Peer Collaborative Learning for Polyphonic Sound Event Detection (Oct 7, 2021). Tags: Event Detection, Knowledge Distillation.
- Knowledge Distillation for Neural Transducers from Large Self-Supervised Pre-trained Models (Oct 7, 2021). Tags: Automatic Speech Recognition (ASR).
- Towards Accurate Cross-Domain In-Bed Human Pose Estimation (Oct 7, 2021). Tags: Data Augmentation, Knowledge Distillation.
- Inter-Domain Alignment for Predicting High-Resolution Brain Networks Using Teacher-Student Learning (Oct 6, 2021). Tags: Decoder, Domain Adaptation. [code available]
- Online Hyperparameter Meta-Learning with Hypergradient Distillation (Oct 6, 2021). Tags: Hyperparameter Optimization, Knowledge Distillation. [code available]
- KNOT: Knowledge Distillation using Optimal Transport for Solving NLP Tasks (Oct 6, 2021). Tags: Emotion Recognition, Emotion Recognition in Conversation.
- On the Interplay Between Sparsity, Naturalness, Intelligibility, and Prosody in Speech Synthesis (Oct 4, 2021). Tags: Knowledge Distillation, Speech Synthesis. [code available]
- Student Helping Teacher: Teacher Evolution via Self-Knowledge Distillation (Oct 1, 2021). Tags: Knowledge Distillation, Self-Knowledge Distillation.
- Multilingual AMR Parsing with Noisy Knowledge Distillation (Sep 30, 2021). Tags: AMR Parsing, Knowledge Distillation. [code available]
- Prune Your Model Before Distill It (Sep 30, 2021). Tags: Knowledge Distillation, Model. [code available]
- Improving Neural Ranking via Lossless Knowledge Distillation (Sep 30, 2021). Tags: Knowledge Distillation, Learning-To-Rank. [code available]
- Deep Neural Compression Via Concurrent Pruning and Self-Distillation (Sep 30, 2021). Tags: Knowledge Distillation, Language Modeling.
- A Comprehensive Overhaul of Distilling Unconditional GANs (Sep 29, 2021). Tags: Knowledge Distillation.
- Prototypical Contrastive Predictive Coding (Sep 29, 2021). Tags: Contrastive Learning, Knowledge Distillation.
- Self-supervised Models are Good Teaching Assistants for Vision Transformers (Sep 29, 2021). Tags: Image Classification, Knowledge Distillation.
- A Unified Knowledge Distillation Framework for Deep Directed Graphical Models (Sep 29, 2021). Tags: Continual Learning, Federated Learning.
- Not All Regions are Worthy to be Distilled: Region-aware Knowledge Distillation Towards Efficient Image-to-Image Translation (Sep 29, 2021). Tags: Contrastive Learning.
- Explaining Knowledge Graph Embedding via Latent Rule Learning (Sep 29, 2021). Tags: Graph Embedding, Knowledge Distillation.
- Adaptive Label Smoothing with Self-Knowledge (Sep 29, 2021). Tags: Knowledge Distillation, Machine Translation.
- Automated Channel Pruning with Learned Importance (Sep 29, 2021). Tags: Denoising, GPU.
- Distilling GANs with Style-Mixed Triplets for X2I Translation with Limited Data (Sep 29, 2021). Tags: Image Generation, Knowledge Distillation.
- Stingy Teacher: Sparse Logits Suffice to Fail Knowledge Distillation (Sep 29, 2021). Tags: Knowledge Distillation.
- MOBA: Multi-teacher Model Based Reinforcement Learning (Sep 29, 2021). Tags: Decision Making, Knowledge Distillation.
- Fast and Efficient Once-For-All Networks for Diverse Hardware Deployment (Sep 29, 2021). Tags: GPU.
- Generate, Annotate, and Learn: Generative Models Advance Self-Training and Knowledge Distillation (Sep 29, 2021). Tags: Few-Shot Learning, Knowledge Distillation.
- Neural Architecture Search via Ensemble-based Knowledge Distillation (Sep 29, 2021). Tags: Diversity, Knowledge Distillation.
- Wakening Past Concepts without Past Data: Class-incremental Learning from Placebos (Sep 29, 2021). Tags: Class-Incremental Learning.
- Understanding the Success of Knowledge Distillation -- A Data Augmentation Perspective (Sep 29, 2021). Tags: Active Learning, Data Augmentation.
- Learning Efficient Image Super-Resolution Networks via Structure-Regularized Pruning (Sep 29, 2021). Tags: Image Super-Resolution, Knowledge Distillation.
- Representation Consolidation from Multiple Expert Teachers (Sep 29, 2021). Tags: Knowledge Distillation.
- Self-Slimming Vision Transformer (Sep 29, 2021). Tags: Knowledge Distillation.
- Self-Distilled Pruning Of Neural Networks (Sep 29, 2021). Tags: Knowledge Distillation, Language Modeling.
- SeqPATE: Differentially Private Text Generation via Knowledge Distillation (Sep 29, 2021). Tags: Knowledge Distillation, Sentence.
- Reducing the Teacher-Student Gap via Adaptive Temperatures (Sep 29, 2021). Tags: Knowledge Distillation.
- Source-Target Unified Knowledge Distillation for Memory-Efficient Federated Domain Adaptation on Edge Devices (Sep 29, 2021). Tags: Domain Adaptation, Knowledge Distillation.
- Pseudo Knowledge Distillation: Towards Learning Optimal Instance-specific Label Smoothing Regularization (Sep 29, 2021). Tags: Image Classification.
- Feature Kernel Distillation (Sep 29, 2021). Tags: Image Classification.
- Scaling Fair Learning to Hundreds of Intersectional Groups (Sep 29, 2021). Tags: Attribute, Fairness.
- Exploiting Knowledge Distillation for Few-Shot Image Generation (Sep 29, 2021). Tags: Diversity, Image Generation.
- To Smooth or not to Smooth? On Compatibility between Label Smoothing and Knowledge Distillation (Sep 29, 2021). Tags: Image Classification.
- Convolutional Neural Network Compression through Generalized Kronecker Product Decomposition (Sep 29, 2021). Tags: Image Classification.
- Deep Structured Instance Graph for Distilling Object Detectors (Sep 27, 2021). Tags: Instance Segmentation, Knowledge Distillation.
- Improving Question Answering Performance Using Knowledge Distillation and Active Learning (Sep 26, 2021). Tags: Active Learning, Knowledge Distillation. [code available]
- Partial to Whole Knowledge Distillation: Progressive Distilling Decomposed Knowledge Boosts Student Better (Sep 26, 2021). Tags: Knowledge Distillation. [code available]
- Dynamic Knowledge Distillation for Pre-trained Language Models (Sep 23, 2021). Tags: Knowledge Distillation.
- Recent Advances of Continual Learning in Computer Vision: An Overview (Sep 23, 2021). Tags: Continual Learning, Knowledge Distillation. [code available]
- Low-Latency Incremental Text-to-Speech Synthesis with Distilled Context Prediction Network (Sep 22, 2021). Tags: Knowledge Distillation, Language Modeling.
- The NiuTrans Machine Translation Systems for WMT21 (Sep 22, 2021). Tags: Knowledge Distillation, Machine Translation.
- K-AID: Enhancing Pre-trained Language Models with Domain Knowledge for Question Answering (Sep 22, 2021). Tags: CPU, Knowledge Distillation.
- KD-VLP: Improving End-to-End Vision-and-Language Pretraining with Object Knowledge Distillation (Sep 22, 2021). Tags: Cross-Modal Alignment, Knowledge Distillation.
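Most entries above build on the classic soft-target knowledge distillation objective of Hinton et al. For orientation, here is a minimal NumPy sketch of that baseline loss; the function names, the temperature T=4.0, and the mixing weight alpha=0.5 are illustrative choices, not taken from any listed paper:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T produces softer distributions."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Hinton-style distillation loss:
    alpha * CE(student, hard labels)
    + (1 - alpha) * T^2 * KL(teacher_soft || student_soft).
    The T^2 factor keeps gradient magnitudes comparable across temperatures."""
    p_s = softmax(student_logits, T)
    p_t = softmax(teacher_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    ce = -np.log(softmax(student_logits)[np.arange(len(labels)), labels] + 1e-12)
    return float(np.mean(alpha * ce + (1.0 - alpha) * (T ** 2) * kl))
```

When teacher and student logits coincide, the KL term vanishes and only the hard-label cross-entropy remains; many of the papers above (adaptive temperatures, sparse logits, multi-teacher setups) modify exactly the soft-target term in this objective.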