FEED: Feature-level Ensemble Effect for Knowledge Distillation. May 1, 2019. Tags: Knowledge Distillation, Transfer Learning. [Code unverified]
Towards a better understanding of Vector Quantized Autoencoders. May 1, 2019. Tags: Knowledge Distillation, Machine Translation. [Code unverified]
Semi-supervised Acoustic Event Detection based on tri-training. Apr 29, 2019. Tags: Event Detection, Knowledge Distillation. [Code unverified]
Segmenting the Future. Apr 24, 2019. Tags: Autonomous Driving, Decision Making. [Code available]
TextKD-GAN: Text Generation using Knowledge Distillation and Generative Adversarial Networks. Apr 23, 2019. Tags: Image Generation, Knowledge Distillation. [Code available]
Model Compression with Multi-Task Knowledge Distillation for Web-scale Question Answering System. Apr 21, 2019. Tags: Knowledge Distillation, Model Compression. [Code unverified]
Improving Multi-Task Deep Neural Networks via Knowledge Distillation for Natural Language Understanding. Apr 20, 2019. Tags: Ensemble Learning, Knowledge Distillation. [Code unverified]
Knowledge Distillation via Route Constrained Optimization. Apr 19, 2019. Tags: Face Recognition, Knowledge Distillation. [Code available]
Feature Fusion for Online Mutual Knowledge Distillation. Apr 19, 2019. Tags: Knowledge Distillation. [Code available]
Guiding CTC Posterior Spike Timings for Improved Posterior Fusion and Knowledge Distillation. Apr 17, 2019. Tags: Automatic Speech Recognition (ASR). [Code unverified]
End-to-End Speech Translation with Knowledge Distillation. Apr 17, 2019. Tags: Knowledge Distillation, Speech Recognition. [Code unverified]
Visual Relationship Detection with Language prior and Softmax. Apr 16, 2019. Tags: Knowledge Distillation, Relationship Detection. [Code available]
Automatic adaptation of object detectors to new domains using self-training. Apr 15, 2019. Tags: Domain Adaptation, Knowledge Distillation. [Code available]
Examining the Mapping Functions of Denoising Autoencoders in Singing Voice Separation. Apr 12, 2019. Tags: Decoder, Denoising. [Code unverified]
Unifying Heterogeneous Classifiers with Distillation. Apr 12, 2019. Tags: Knowledge Distillation. [Code available]
Improved training of binary networks for human pose estimation and image recognition. Apr 11, 2019. Tags: Binarization, Classification with Binary Neural Network. [Code unverified]
Variational Information Distillation for Knowledge Transfer. Apr 11, 2019. Tags: Knowledge Distillation, Transfer Learning. [Code unverified]
Knowledge Squeezed Adversarial Network Compression. Apr 10, 2019. Tags: Knowledge Distillation, Transfer Learning. [Code unverified]
Spatiotemporal Knowledge Distillation for Efficient Estimation of Aerial Video Saliency. Apr 10, 2019. Tags: GPU, Knowledge Distillation. [Code unverified]
Relational Knowledge Distillation. Apr 10, 2019. Tags: Knowledge Distillation, Metric Learning. [Code available]
Ultrafast Video Attention Prediction with Coupled Knowledge Distillation. Apr 9, 2019. Tags: CPU, GPU. [Code unverified]
Knowledge Distillation for Human Action Anticipation. Apr 9, 2019. Tags: Action Anticipation, Action Recognition. [Code unverified]
Knowledge Distillation For Recurrent Neural Network Language Modeling With Trust Regularization. Apr 8, 2019. Tags: Knowledge Distillation, Language Modeling. [Code unverified]
Long-Term Vehicle Localization by Recursive Knowledge Distillation. Apr 7, 2019. Tags: Domain Adaptation, Ensemble Learning. [Code unverified]
Token-Level Ensemble Distillation for Grapheme-to-Phoneme Conversion. Apr 6, 2019. Tags: Automatic Speech Recognition (ASR). [Code unverified]
M2KD: Multi-model and Multi-level Knowledge Distillation for Incremental Learning. Apr 3, 2019. Tags: Incremental Learning, Knowledge Distillation. [Code unverified]
Correlation Congruence for Knowledge Distillation. Apr 3, 2019. Tags: Face Recognition, Image Classification. [Code available]
A Comprehensive Overhaul of Feature Distillation. Apr 3, 2019. Tags: General Classification, Image Classification. [Code available]
Making Neural Machine Reading Comprehension Faster. Mar 29, 2019. Tags: Knowledge Distillation, Machine Reading Comprehension. [Code unverified]
Improving Route Choice Models by Incorporating Contextual Factors via Knowledge Distillation. Mar 27, 2019. Tags: Knowledge Distillation, Management. [Code unverified]
Improving Neural Architecture Search Image Classifiers via Ensemble Learning. Mar 14, 2019. Tags: Ensemble Learning, Image Classification. [Code available]
Rectified Decision Trees: Towards Interpretability, Compression and Empirical Soundness. Mar 14, 2019. Tags: Knowledge Distillation. [Code unverified]
Knowledge Adaptation for Efficient Semantic Segmentation. Mar 12, 2019. Tags: Knowledge Distillation, Segmentation. [Code unverified]
Structured Knowledge Distillation for Dense Prediction. Mar 11, 2019. Tags: Depth Estimation, General Classification. [Code available]
Refine and Distill: Exploiting Cycle-Inconsistency and Knowledge Distillation for Unsupervised Monocular Depth Estimation. Mar 11, 2019. Tags: Depth Estimation, Depth Prediction. [Code unverified]
SeizureNet: Multi-Spectral Deep Feature Learning for Seizure Type Classification. Mar 8, 2019. Tags: Classification, EEG. [Code available]
TKD: Temporal Knowledge Distillation for Active Perception. Mar 4, 2019. Tags: Knowledge Distillation, Object. [Code unverified]
Multilingual Neural Machine Translation with Knowledge Distillation. Feb 27, 2019. Tags: Diversity, Knowledge Distillation. [Code available]
Improved Knowledge Distillation via Teacher Assistant. Feb 9, 2019. Tags: Knowledge Distillation. [Code available]
MICIK: MIning Cross-Layer Inherent Similarity Knowledge for Deep Model Compression. Feb 3, 2019. Tags: Knowledge Distillation, Model Compression. [Code unverified]
Compressing GANs using Knowledge Distillation. Feb 1, 2019. Tags: Knowledge Distillation, Super-Resolution. [Code unverified]
Progressive Label Distillation: Learning Input-Efficient Deep Neural Networks. Jan 26, 2019. Tags: Knowledge Distillation, Speech Recognition. [Code unverified]
Unsupervised Learning of Neural Networks to Explain Neural Networks (extended abstract). Jan 21, 2019. Tags: Knowledge Distillation, Object. [Code unverified]
Learning Efficient Detector with Semi-supervised Adaptive Distillation. Jan 2, 2019. Tags: Image Classification. [Code available]
Stealing Neural Networks via Timing Side Channels. Dec 31, 2018. Tags: Knowledge Distillation, Reinforcement Learning. [Code unverified]
Improving the Interpretability of Deep Neural Networks with Knowledge Distillation. Dec 28, 2018. Tags: Ethics, Knowledge Distillation. [Code unverified]
Learning Student Networks via Feature Embedding. Dec 17, 2018. Tags: Knowledge Distillation. [Code unverified]
Spatial Knowledge Distillation to aid Visual Reasoning. Dec 10, 2018. Tags: Diagnostic, Knowledge Distillation. [Code unverified]
Optimizing speed/accuracy trade-off for person re-identification via knowledge distillation. Dec 7, 2018. Tags: Deep Learning, General Classification. [Code unverified]
An Embarrassingly Simple Approach for Knowledge Distillation. Dec 5, 2018. Tags: Face Recognition, Knowledge Distillation. [Code available]
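Most of the papers listed above build on the classic soft-target objective from Hinton et al.'s "Distilling the Knowledge in a Neural Network": the student is trained to match the teacher's temperature-softened output distribution. As a point of reference for the list, here is a minimal pure-Python sketch of that loss; the function names (`softmax`, `distillation_loss`) are illustrative, not taken from any of the papers above.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; a higher temperature yields a
    # softer (higher-entropy) distribution over classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # KL divergence from the student's softened distribution to the
    # teacher's, scaled by T^2 so soft-target gradients keep roughly
    # the same magnitude as hard-label gradients (Hinton et al., 2015).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl

# A student close to the teacher incurs a smaller loss than one far from it.
teacher = [2.0, 1.0, 0.1]
close_student = [2.1, 0.9, 0.2]
far_student = [0.0, 0.0, 3.0]
assert distillation_loss(close_student, teacher) < distillation_loss(far_student, teacher)
```

In practice this term is combined with the ordinary cross-entropy against ground-truth labels, weighted by a mixing coefficient; many of the listed papers replace or augment the output-level term with feature-level, relational, or adversarial matching.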