Knowledge Distillation for Mobile Edge Computation Offloading (Apr 9, 2020). Tags: Imitation Learning, Knowledge Distillation.
LadaBERT: Lightweight Adaptation of BERT through Hybrid Model Compression (Apr 8, 2020). Tags: Blocking, Knowledge Distillation.
Towards Non-task-specific Distillation of BERT via Sentence Representation Approximation (Apr 7, 2020). Tags: Knowledge Distillation, Sentence.
Enhancing Review Comprehension with Domain-Specific Commonsense (Apr 6, 2020). Tags: Aspect Extraction, Knowledge Distillation.
Knowledge as Priors: Cross-Modal Knowledge Generalization for Datasets without Superior Knowledge (Apr 1, 2020). Tags: 3D Hand Pose Estimation, Hand Pose Estimation.
Spatio-Temporal Graph for Video Captioning with Knowledge Distillation (Mar 31, 2020). Tags: Knowledge Distillation, Object.
SS-IL: Separated Softmax for Incremental Learning (Mar 31, 2020). Tags: Class-Incremental Learning.
Analysis of Knowledge Transfer in Kernel Regime (Mar 30, 2020). Tags: Knowledge Distillation, Transfer Learning.
Squeezed Deep 6DoF Object Detection Using Knowledge Distillation (Mar 30, 2020). Tags: Knowledge Distillation, Object.
Synergic Adversarial Label Learning for Grading Retinal Diseases via Knowledge Distillation and Multi-task Learning (Mar 24, 2020). Tags: Classification, General Classification. (code available)
A Survey of Methods for Low-Power Deep Learning and Computer Vision (Mar 24, 2020). Tags: Knowledge Distillation, Quantization.
Teacher-Student chain for efficient semi-supervised histology image classification (Mar 17, 2020). Tags: Classification, General Classification.
Knowledge distillation via adaptive instance normalization (Mar 9, 2020). Tags: Knowledge Distillation, Model Compression.
Pacemaker: Intermediate Teacher Knowledge Distillation For On-The-Fly Convolutional Neural Network (Mar 9, 2020). Tags: Knowledge Distillation, Model Compression.
Explaining Knowledge Distillation by Quantifying the Knowledge (Mar 7, 2020). Tags: Knowledge Distillation.
Distilling portable Generative Adversarial Networks for Image Translation (Mar 7, 2020). Tags: Image-to-Image Translation, Knowledge Distillation.
Distill, Adapt, Distill: Training Small, In-Domain Models for Neural Machine Translation (Mar 5, 2020). Tags: Domain Adaptation, Knowledge Distillation.
An Efficient Method of Training Small Models for Regression Problems with Knowledge Distillation (Feb 28, 2020). Tags: Knowledge Distillation, Memorization.
Residual Knowledge Distillation (Feb 21, 2020). Tags: Knowledge Distillation, Model Compression.
Balancing Cost and Benefit with Tied-Multi Transformers (Feb 20, 2020). Tags: Decoder, Knowledge Distillation.
The Microsoft Toolkit of Multi-Task Deep Neural Networks for Natural Language Understanding (Feb 19, 2020). Tags: Knowledge Distillation, Multi-Task Learning.
Self-Distillation Amplifies Regularization in Hilbert Space (Feb 13, 2020). Tags: Knowledge Distillation, L2 Regularization.
Content Based Singing Voice Extraction From a Musical Mixture (Feb 12, 2020). Tags: Decoder, Deep Learning.
Meta-Learning across Meta-Tasks for Few-Shot Learning (Feb 11, 2020). Tags: Domain Adaptation, Few-Shot Learning. (code available)
Regularized Evolutionary Population-Based Training (Feb 11, 2020). Tags: Diversity, Image Classification.
Understanding and Improving Knowledge Distillation (Feb 10, 2020). Tags: Knowledge Distillation, Model Compression.
Unlabeled Data Deployment for Classification of Diabetic Retinopathy Images Using Knowledge Transfer (Feb 9, 2020). Tags: General Classification, Knowledge Distillation.
Feature-map-level Online Adversarial Knowledge Distillation (Feb 5, 2020). Tags: Knowledge Distillation.
Periodic Intra-Ensemble Knowledge Distillation for Reinforcement Learning (Feb 1, 2020). Tags: Knowledge Distillation, MuJoCo.
Search for Better Students to Learn Distilled Knowledge (Jan 30, 2020). Tags: Knowledge Distillation, Model Compression. (code available)
MSE-Optimal Neural Network Initialization via Layer Fusion (Jan 28, 2020). Tags: General Classification, Knowledge Distillation.
Developing Multi-Task Recommendations with Long-Term Rewards via Policy Distilled Reinforcement Learning (Jan 27, 2020). Tags: Deep Reinforcement Learning, Knowledge Distillation. (code available)
Generation-Distillation for Efficient Natural Language Understanding in Low-Data Settings (Jan 25, 2020). Tags: General Classification, Knowledge Distillation.
Data Techniques For Online End-to-end Speech Recognition (Jan 24, 2020). Tags: Data Augmentation, Domain Adaptation.
Lightweight 3D Human Pose Estimation Network Training Using Teacher-Student Learning (Jan 15, 2020). Tags: 3D Human Pose Estimation, 3D Pose Estimation.
A "Network Pruning Network" Approach to Deep Model Compression (Jan 15, 2020). Tags: Knowledge Distillation, Model Compression.
Uncertainty-Aware Multi-Shot Knowledge Distillation for Image-Based Object Re-Identification (Jan 15, 2020). Tags: Knowledge Distillation, Object.
Noisy Machines: Understanding Noisy Neural Networks and Enhancing Robustness to Analog Hardware Errors Using Distillation (Jan 14, 2020). Tags: Knowledge Distillation.
AdaBERT: Task-Adaptive BERT Compression with Differentiable Neural Architecture Search (Jan 13, 2020). Tags: Knowledge Distillation, Neural Architecture Search.
Learning Task-Agnostic Embedding of Multiple Black-Box Experts for Multi-Task Model Fusion (Jan 1, 2020). Tags: Knowledge Distillation. (code available)
Modeling Teacher-Student Techniques in Deep Neural Networks for Knowledge Distillation (Dec 31, 2019). Tags: Knowledge Distillation.
DeGAN: Data-Enriching GAN for Retrieving Representative Samples from a Trained Classifier (Dec 27, 2019). Tags: Data-free Knowledge Distillation, Incremental Learning.
Data-Free Adversarial Distillation (Dec 23, 2019). Tags: Knowledge Distillation, Model Compression.
The State of Knowledge Distillation for Classification (Dec 20, 2019). Tags: Classification, Data Augmentation. (code available)
Joint Architecture and Knowledge Distillation in CNN for Chinese Text Recognition (Dec 17, 2019). Tags: Handwritten Chinese Text Recognition, Knowledge Distillation. (code available)
Iterative Dual Domain Adaptation for Neural Machine Translation (Dec 16, 2019). Tags: Domain Adaptation, Knowledge Distillation.
Explaining Sequence-Level Knowledge Distillation as Data-Augmentation for Neural Machine Translation (Dec 6, 2019). Tags: Data Augmentation, Knowledge Distillation.
Acquiring Knowledge from Pre-trained Model to Neural Machine Translation (Dec 4, 2019). Tags: General Knowledge, Knowledge Distillation.
QUEST: Quantized embedding space for transferring knowledge (Dec 3, 2019). Tags: Knowledge Distillation.
Efficient Convolutional Neural Networks for Depth-Based Multi-Person Pose Estimation (Dec 2, 2019). Tags: 2D Pose Estimation, Domain Adaptation. (code available)