- MSE-Optimal Neural Network Initialization via Layer Fusion (Jan 28, 2020): General Classification, Knowledge Distillation
- Developing Multi-Task Recommendations with Long-Term Rewards via Policy Distilled Reinforcement Learning (Jan 27, 2020): Deep Reinforcement Learning, Knowledge Distillation [code available]
- Generation-Distillation for Efficient Natural Language Understanding in Low-Data Settings (Jan 25, 2020): General Classification, Knowledge Distillation
- Data Techniques For Online End-to-end Speech Recognition (Jan 24, 2020): Data Augmentation, Domain Adaptation
- Lightweight 3D Human Pose Estimation Network Training Using Teacher-Student Learning (Jan 15, 2020): 3D Human Pose Estimation, 3D Pose Estimation
- A "Network Pruning Network" Approach to Deep Model Compression (Jan 15, 2020): Knowledge Distillation, Model Compression
- Uncertainty-Aware Multi-Shot Knowledge Distillation for Image-Based Object Re-Identification (Jan 15, 2020): Knowledge Distillation, Object
- Noisy Machines: Understanding Noisy Neural Networks and Enhancing Robustness to Analog Hardware Errors Using Distillation (Jan 14, 2020): Knowledge Distillation
- AdaBERT: Task-Adaptive BERT Compression with Differentiable Neural Architecture Search (Jan 13, 2020): Knowledge Distillation, Neural Architecture Search
- Unpaired Multi-modal Segmentation via Knowledge Distillation (Jan 6, 2020): Image Segmentation, Knowledge Distillation [code available]
- Learning From Multiple Experts: Self-paced Knowledge Distillation for Long-tailed Classification (Jan 6, 2020): General Classification, Knowledge Distillation [code available]
- Learning Task-Agnostic Embedding of Multiple Black-Box Experts for Multi-Task Model Fusion (Jan 1, 2020): Knowledge Distillation [code available]
- Modeling Teacher-Student Techniques in Deep Neural Networks for Knowledge Distillation (Dec 31, 2019): Knowledge Distillation
- DeGAN: Data-Enriching GAN for Retrieving Representative Samples from a Trained Classifier (Dec 27, 2019): Data-free Knowledge Distillation, Incremental Learning
- Data-Free Adversarial Distillation (Dec 23, 2019): Knowledge Distillation, Model Compression
- The State of Knowledge Distillation for Classification (Dec 20, 2019): Classification, Data Augmentation [code available]
- Joint Architecture and Knowledge Distillation in CNN for Chinese Text Recognition (Dec 17, 2019): Handwritten Chinese Text Recognition, Knowledge Distillation [code available]
- Iterative Dual Domain Adaptation for Neural Machine Translation (Dec 16, 2019): Domain Adaptation, Knowledge Distillation
- Explaining Sequence-Level Knowledge Distillation as Data-Augmentation for Neural Machine Translation (Dec 6, 2019): Data Augmentation, Knowledge Distillation
- Acquiring Knowledge from Pre-trained Model to Neural Machine Translation (Dec 4, 2019): General Knowledge, Knowledge Distillation
- QUEST: Quantized embedding space for transferring knowledge (Dec 3, 2019): Knowledge Distillation
- Efficient Convolutional Neural Networks for Depth-Based Multi-Person Pose Estimation (Dec 2, 2019): 2D Pose Estimation, Domain Adaptation [code available]
- Online Knowledge Distillation with Diverse Peers (Dec 1, 2019): Knowledge Distillation, Transfer Learning
- Random Path Selection for Continual Learning (Dec 1, 2019): Continual Learning, Incremental Learning [code available]
- Knowledge Extraction with No Observable Data (Dec 1, 2019): Data-free Knowledge Distillation, Knowledge Distillation [code available]
- Distributed Soft Actor-Critic with Multivariate Reward Representation and Knowledge Distillation (Nov 29, 2019): Knowledge Distillation, Reinforcement Learning [code available]
- Towards Oracle Knowledge Distillation with Neural Architecture Search (Nov 29, 2019): Image Classification [code available]
- Blockwisely Supervised Neural Architecture Search with Knowledge Distillation (Nov 29, 2019): Knowledge Distillation, Neural Architecture Search
- QKD: Quantization-aware Knowledge Distillation (Nov 28, 2019): Knowledge Distillation, Quantization [code available]
- Data-Driven Compression of Convolutional Neural Networks (Nov 28, 2019): Knowledge Distillation, Model Compression
- Hearing Lips: Improving Lip Reading by Distilling Speech Recognizers (Nov 26, 2019): Knowledge Distillation, Lipreading
- Go From the General to the Particular: Multi-Domain Translation with Domain Transformation Networks (Nov 22, 2019): Decoder, General Knowledge
- Few Shot Network Compression via Cross Distillation (Nov 21, 2019): Knowledge Distillation, Model Compression [code available]
- Search to Distill: Pearls are Everywhere but not the Eyes (Nov 20, 2019): Ensemble Learning, Face Recognition [code available]
- Neural Network Pruning with Residual-Connections and Limited-Data (Nov 19, 2019): Knowledge Distillation, Network Pruning
- Towards Making Deep Transfer Learning Never Hurt (Nov 18, 2019): Knowledge Distillation [code available]
- Preparing Lessons: Improve Knowledge Distillation with Better Supervision (Nov 18, 2019): Knowledge Distillation
- Maintaining Discrimination and Fairness in Class Incremental Learning (Nov 16, 2019): Class Incremental Learning [code available]
- Data Efficient Stagewise Knowledge Distillation (Nov 15, 2019): Knowledge Distillation, Model Compression [code available]
- Knowledge Representing: Efficient, Sparse Representation of Prior Knowledge for Knowledge Distillation (Nov 13, 2019): Image Classification, Knowledge Distillation [code available]
- Learning from a Teacher using Unlabeled Data (Nov 13, 2019): Knowledge Distillation, Model Compression
- Collaborative Distillation for Top-N Recommendation (Nov 13, 2019): Collaborative Filtering, Knowledge Distillation [code available]
- Knowledge Distillation in Document Retrieval (Nov 11, 2019): Knowledge Distillation, Retrieval
- Graph Representation Learning via Multi-task Knowledge Distillation (Nov 11, 2019): Graph Representation Learning, Knowledge Distillation
- Scalable Zero-shot Entity Linking with Dense Entity Retrieval (Nov 10, 2019): Entity Embeddings, Entity Linking
- MKD: a Multi-Task Knowledge Distillation Approach for Pretrained Language Models (Nov 9, 2019): Knowledge Distillation, Multi-Task Learning [code available]
- Knowledge Distillation for Incremental Learning in Semantic Segmentation (Nov 8, 2019): Image Classification
- Deep geometric knowledge distillation with graphs (Nov 8, 2019): Knowledge Distillation
- Microsoft Research Asia's Systems for WMT19 (Nov 7, 2019): Data Augmentation, Knowledge Distillation [code available]
- Teacher-Student Training for Robust Tacotron-based TTS (Nov 7, 2019): Decoder, Knowledge Distillation
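Most of the entries above build on the same teacher-student objective: the student is trained to match the teacher's temperature-softened output distribution. A minimal NumPy sketch of that classic distillation loss (function names and the temperature value are illustrative, not taken from any listed paper):

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=4.0):
    """KL(teacher || student) between temperature-softened distributions,
    scaled by T^2 as in the standard softened-softmax formulation."""
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # student predictions
    kl = np.sum(p * (np.log(p) - np.log(q)), axis=-1)
    return float(np.mean(kl) * T * T)

# A student that matches the teacher exactly incurs (near-)zero loss;
# a mismatched student incurs a positive loss.
teacher = np.array([[2.0, 0.5, -1.0]])
student = np.array([[-1.0, 0.5, 2.0]])
assert kd_loss(teacher, teacher) < 1e-9
assert kd_loss(student, teacher) > 0.0
```

In practice this term is usually combined with the ordinary cross-entropy on hard labels, weighted by a mixing coefficient; many of the papers listed here vary exactly what the student matches (logits, features, graphs) rather than this basic recipe.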