| Title | Date | Tasks | Code |
|---|---|---|---|
| Towards Non-task-specific Distillation of BERT via Sentence Representation Approximation | Apr 7, 2020 | Knowledge Distillation, Sentence | Unverified (0) |
| Enhancing Review Comprehension with Domain-Specific Commonsense | Apr 6, 2020 | Aspect Extraction, Knowledge Distillation | Unverified (0) |
| Temporally Distributed Networks for Fast Video Semantic Segmentation | Apr 3, 2020 | Knowledge Distillation, Real-Time Semantic Segmentation | Available (1) |
| More Grounded Image Captioning by Distilling Image-Text Matching Model | Apr 1, 2020 | Image Captioning, Image-Text Matching | Available (1) |
| Knowledge as Priors: Cross-Modal Knowledge Generalization for Datasets without Superior Knowledge | Apr 1, 2020 | 3D Hand Pose Estimation, Hand Pose Estimation | Unverified (0) |
| Creating Something from Nothing: Unsupervised Knowledge Distillation for Cross-Modal Hashing | Apr 1, 2020 | Knowledge Distillation, Retrieval | Available (1) |
| Neural Networks Are More Productive Teachers Than Human Raters: Active Mixup for Data-Efficient Knowledge Distillation from a Blackbox Model | Mar 31, 2020 | Active Learning, Knowledge Distillation | Available (1) |
| Distilled Semantics for Comprehensive Scene Understanding from Videos | Mar 31, 2020 | Depth Estimation, Knowledge Distillation | Available (1) |
| Spatio-Temporal Graph for Video Captioning with Knowledge Distillation | Mar 31, 2020 | Knowledge Distillation, Object | Unverified (0) |
| SS-IL: Separated Softmax for Incremental Learning | Mar 31, 2020 | Class-Incremental Learning | Unverified (0) |
| Regularizing Class-wise Predictions via Self-knowledge Distillation | Mar 31, 2020 | Image Classification | Unverified (0) |
| Squeezed Deep 6DoF Object Detection Using Knowledge Distillation | Mar 30, 2020 | Knowledge Distillation, Object | Available (0) |
| Analysis of Knowledge Transfer in Kernel Regime | Mar 30, 2020 | Knowledge Distillation, Transfer Learning | Unverified (0) |
| Circumventing Outliers of AutoAugment with Knowledge Distillation | Mar 25, 2020 | Data Augmentation, General Classification | Available (1) |
| A Survey of Methods for Low-Power Deep Learning and Computer Vision | Mar 24, 2020 | Knowledge Distillation, Quantization | Unverified (0) |
| Synergic Adversarial Label Learning for Grading Retinal Diseases via Knowledge Distillation and Multi-task Learning | Mar 24, 2020 | Classification, General Classification | Unverified (0) |
| Distilling Knowledge from Graph Convolutional Networks | Mar 23, 2020 | Knowledge Distillation, Transfer Learning | Available (1) |
| Collaborative Distillation for Ultra-Resolution Universal Style Transfer | Mar 18, 2020 | Decoder, GPU | Available (1) |
| Incremental Object Detection via Meta-Learning | Mar 17, 2020 | Incremental Learning, Knowledge Distillation | Available (1) |
| Teacher-Student chain for efficient semi-supervised histology image classification | Mar 17, 2020 | Classification, General Classification | Unverified (0) |
| Deformation Flow Based Two-Stream Network for Lip Reading | Mar 12, 2020 | Knowledge Distillation, Lipreading | Available (1) |
| SuperMix: Supervising the Mixing Data Augmentation | Mar 10, 2020 | Data Augmentation, General Classification | Available (1) |
| Knowledge distillation via adaptive instance normalization | Mar 9, 2020 | Knowledge Distillation, Model Compression | Unverified (0) |
| Faster ILOD: Incremental Learning for Object Detectors based on Faster RCNN | Mar 9, 2020 | Incremental Learning, Knowledge Distillation | Available (1) |
| Pacemaker: Intermediate Teacher Knowledge Distillation For On-The-Fly Convolutional Neural Network | Mar 9, 2020 | Knowledge Distillation, Model Compression | Unverified (0) |
| PoseNet3D: Learning Temporally Consistent 3D Human Pose via Knowledge Distillation | Mar 7, 2020 | 3D Human Pose Estimation, Knowledge Distillation | Available (1) |
| Distilling portable Generative Adversarial Networks for Image Translation | Mar 7, 2020 | Image-to-Image Translation, Knowledge Distillation | Unverified (0) |
| Explaining Knowledge Distillation by Quantifying the Knowledge | Mar 7, 2020 | Knowledge Distillation | Unverified (0) |
| Distill, Adapt, Distill: Training Small, In-Domain Models for Neural Machine Translation | Mar 5, 2020 | Domain Adaptation, Knowledge Distillation | Unverified (0) |
| An Efficient Method of Training Small Models for Regression Problems with Knowledge Distillation | Feb 28, 2020 | Knowledge Distillation, Memorization | Unverified (0) |
| TextBrewer: An Open-Source Knowledge Distillation Toolkit for Natural Language Processing | Feb 28, 2020 | Knowledge Distillation, Reading Comprehension | Available (2) |
| Efficient Semantic Video Segmentation with Per-frame Inference | Feb 26, 2020 | Knowledge Distillation, Optical Flow Estimation | Available (1) |
| Semi-Supervised Speech Recognition via Local Prior Matching | Feb 24, 2020 | Knowledge Distillation, Language Modeling | Available (3) |
| Residual Knowledge Distillation | Feb 21, 2020 | Knowledge Distillation, Model Compression | Unverified (0) |
| Balancing Cost and Benefit with Tied-Multi Transformers | Feb 20, 2020 | Decoder, Knowledge Distillation | Unverified (0) |
| The Microsoft Toolkit of Multi-Task Deep Neural Networks for Natural Language Understanding | Feb 19, 2020 | Knowledge Distillation, Multi-Task Learning | Unverified (0) |
| Knapsack Pruning with Inner Distillation | Feb 19, 2020 | GPU, Knowledge Distillation | Available (1) |
| Self-Distillation Amplifies Regularization in Hilbert Space | Feb 13, 2020 | Knowledge Distillation, L2 Regularization | Unverified (0) |
| Salvaging Federated Learning by Local Adaptation | Feb 12, 2020 | Federated Learning, Knowledge Distillation | Available (1) |
| Content Based Singing Voice Extraction From a Musical Mixture | Feb 12, 2020 | Decoder, Deep Learning | Available (0) |
| Meta-Learning across Meta-Tasks for Few-Shot Learning | Feb 11, 2020 | Domain Adaptation, Few-Shot Learning | Unverified (0) |
| Regularized Evolutionary Population-Based Training | Feb 11, 2020 | Diversity, Image Classification | Unverified (0) |
| Knowledge Distillation for Brain Tumor Segmentation | Feb 10, 2020 | Brain Tumor Segmentation, Knowledge Distillation | Available (1) |
| Understanding and Improving Knowledge Distillation | Feb 10, 2020 | Knowledge Distillation, Model Compression | Unverified (0) |
| Unlabeled Data Deployment for Classification of Diabetic Retinopathy Images Using Knowledge Transfer | Feb 9, 2020 | General Classification, Knowledge Distillation | Unverified (0) |
| SUOD: Toward Scalable Unsupervised Outlier Detection | Feb 8, 2020 | Knowledge Distillation, Outlier Detection | Available (1) |
| BERT-of-Theseus: Compressing BERT by Progressive Module Replacing | Feb 7, 2020 | Knowledge Distillation, Model Compression | Available (1) |
| Feature-map-level Online Adversarial Knowledge Distillation | Feb 5, 2020 | Knowledge Distillation | Unverified (0) |
| Periodic Intra-Ensemble Knowledge Distillation for Reinforcement Learning | Feb 1, 2020 | Knowledge Distillation, MuJoCo | Available (0) |
| Search for Better Students to Learn Distilled Knowledge | Jan 30, 2020 | Knowledge Distillation, Model Compression | Unverified (0) |

Code column legend: "Available (n)" reproduces the listing's "Code Available" badge and its count; "Unverified (0)" marks entries the listing showed without a verified implementation.