Adversarial-Based Knowledge Distillation for Multi-Model Ensemble and Noisy Data Refinement (Aug 22, 2019). Tags: Knowledge Distillation, Missing Labels
[Unverified] Improved Techniques for Training Adaptive Deep Networks (Aug 17, 2019). Tags: Computational Efficiency, Knowledge Distillation
[Code available] Language Graph Distillation for Low-Resource Machine Translation (Aug 17, 2019). Tags: Knowledge Distillation, Machine Translation
[Unverified] Knowledge distillation for semi-supervised domain adaptation (Aug 16, 2019). Tags: Domain Adaptation, Knowledge Distillation
[Unverified] Adaptive Regularization of Labels (Aug 15, 2019). Tags: Data Augmentation, Knowledge Distillation
[Unverified] Scalable Attentive Sentence-Pair Modeling via Distilled Sentence Embedding (Aug 14, 2019). Tags: Knowledge Distillation, Natural Language Understanding
[Code available] Effective Training of Convolutional Neural Networks with Low-bitwidth Weights and Activations (Aug 10, 2019). Tags: Knowledge Distillation, Quantization
[Unverified] Knowledge Consistency between Neural Networks and Beyond (Aug 5, 2019). Tags: Knowledge Distillation
[Unverified] Self-Knowledge Distillation in Natural Language Processing (Aug 2, 2019). Tags: Deep Learning, Knowledge Distillation
[Unverified] Learning Lightweight Lane Detection CNNs by Self Attention Distillation (Aug 2, 2019). Tags: Knowledge Distillation, Lane Detection
[Code available] Baidu Neural Machine Translation Systems for WMT19 (Aug 1, 2019). Tags: Data Augmentation, Domain Adaptation
[Unverified] The NiuTrans Machine Translation Systems for WMT19 (Aug 1, 2019). Tags: Knowledge Distillation, Machine Translation
[Unverified] GTCOM Neural Machine Translation Systems for WMT19 (Aug 1, 2019). Tags: Knowledge Distillation, Language Modeling
[Unverified] PANLP at MEDIQA 2019: Pre-trained Language Models, Transfer Learning and Knowledge Distillation (Aug 1, 2019). Tags: Knowledge Distillation, Re-Ranking
[Unverified] Distill-to-Label: Weakly Supervised Instance Labeling Using Knowledge Distillation (Jul 26, 2019). Tags: Breast Cancer Detection, Instance Segmentation
[Unverified] Distilled Siamese Networks for Visual Tracking (Jul 24, 2019). Tags: Knowledge Distillation, Object Tracking
[Unverified] Lifelong GAN: Continual Learning for Conditional Image Generation (Jul 23, 2019). Tags: Conditional Image Generation, Continual Learning
[Unverified] Real-Time Correlation Tracking via Joint Model Compression and Transfer (Jul 23, 2019). Tags: Computational Efficiency, CPU
[Code available] Similarity-Preserving Knowledge Distillation (Jul 23, 2019). Tags: Knowledge Distillation, Neural Network Compression
[Unverified] Highlight Every Step: Knowledge Distillation via Collaborative Teaching (Jul 23, 2019). Tags: Knowledge Distillation
[Code available] Light Multi-segment Activation for Model Compression (Jul 16, 2019). Tags: Knowledge Distillation, Model Compression
[Code available] Learn Spelling from Teachers: Transferring Knowledge from Language Models to Sequence-to-Sequence Speech Recognition (Jul 13, 2019). Tags: Knowledge Distillation, Language Modeling
[Unverified] BAM! Born-Again Multi-Task Networks for Natural Language Understanding (Jul 10, 2019). Tags: Knowledge Distillation, Natural Language Understanding
[Code available] Graph-based Knowledge Distillation by Multi-head Attention Network (Jul 4, 2019). Tags: Inductive Bias, Knowledge Distillation
[Code available] Compression of Acoustic Event Detection Models With Quantized Distillation (Jul 1, 2019). Tags: Event Detection, Knowledge Distillation
[Unverified] Reconstructing Perceived Images from Brain Activity by Visually-guided Cognitive Representation and Adversarial Learning (Jun 27, 2019). Tags: Generative Adversarial Network, Image Reconstruction
[Unverified] Essence Knowledge Distillation for Speech Recognition (Jun 26, 2019). Tags: Knowledge Distillation, Speech Recognition
[Unverified] Approximating Interactive Human Evaluation with Self-Play for Open-Domain Dialog Systems (Jun 21, 2019). Tags: Dialogue Evaluation, Knowledge Distillation
[Code available] GAN-Knowledge Distillation for one-stage Object Detection (Jun 20, 2019). Tags: Knowledge Distillation, Object Detection
[Unverified] Membership Privacy for Machine Learning Models Through Knowledge Transfer (Jun 15, 2019). Tags: BIG-bench Machine Learning, General Classification
[Unverified] Divide and Conquer: Leveraging Intermediate Feature Representations for Quantized Training of Neural Networks (Jun 14, 2019). Tags: Knowledge Distillation, Quantization
[Unverified] Scalable Syntax-Aware Language Models Using Knowledge Distillation (Jun 14, 2019). Tags: Knowledge Distillation, Language Modeling
[Unverified] Efficient Evaluation-Time Uncertainty Estimation by Improved Distillation (Jun 12, 2019). Tags: Knowledge Distillation
[Unverified] Incremental Classifier Learning Based on PEDCC-Loss and Cosine Distance (Jun 11, 2019). Tags: Incremental Learning, Knowledge Distillation
[Unverified] Distilling Object Detectors with Fine-grained Feature Imitation (Jun 9, 2019). Tags: Knowledge Distillation, Object Detection
[Code available] When Does Label Smoothing Help? (Jun 6, 2019). Tags: Image Classification
[Code available] Private Deep Learning with Teacher Ensembles (Jun 5, 2019). Tags: Deep Learning, Ensemble Learning
[Unverified] An Adaptive Random Path Selection Approach for Incremental Learning (Jun 3, 2019). Tags: Incremental Learning, Knowledge Distillation
[Code available] Deep Face Recognition Model Compression via Knowledge Transfer and Distillation (Jun 3, 2019). Tags: Face Recognition, Knowledge Distillation
[Unverified] On Knowledge distillation from complex networks for response prediction (Jun 1, 2019). Tags: Knowledge Distillation, Question Answering
[Unverified] Online Distilling from Checkpoints for Neural Machine Translation (Jun 1, 2019). Tags: Knowledge Distillation, Machine Translation
[Unverified] Knowledge Distillation via Instance Relationship Graph (Jun 1, 2019). Tags: Knowledge Distillation
[Code available] Structured Knowledge Distillation for Semantic Segmentation (Jun 1, 2019). Tags: General Classification, Image Classification
[Code available] SCAN: A Scalable Neural Networks Framework Towards Compact and Efficient Models (May 27, 2019). Tags: Knowledge Distillation
[Code available] Cross-Resolution Face Recognition via Prior-Aided Face Hallucination and Residual Knowledge Distillation (May 26, 2019). Tags: Face Hallucination, Face Recognition
[Unverified] Adversarially Robust Distillation (May 23, 2019). Tags: Adversarial Robustness, Knowledge Distillation
[Code available] Network Pruning via Transformable Architecture Search (May 23, 2019). Tags: Knowledge Distillation, Network Pruning
[Code available] Zero-Shot Knowledge Distillation in Deep Networks (May 20, 2019). Tags: Knowledge Distillation
[Code available] Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation (May 17, 2019). Tags: Knowledge Distillation
[Code available] Creating Lightweight Object Detectors with Model Compression for Deployment on Edge Devices (May 6, 2019). Tags: Knowledge Distillation, Model Compression
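Most entries above build on the same core objective: matching a student network's temperature-softened output distribution to a teacher's, as introduced in the original knowledge-distillation paper (Hinton et al., 2015). A minimal NumPy sketch of that generic soft-target loss follows; it illustrates the shared technique, not the specific method of any paper listed, and the function names and temperature value are illustrative:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax over the last axis."""
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL(teacher || student) on softened distributions, scaled by T^2
    so gradient magnitudes stay comparable across temperatures."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    kl = np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student)), axis=-1)
    return (temperature ** 2) * kl.mean()
```

In practice this soft-target term is combined with the ordinary cross-entropy on ground-truth labels via a weighting coefficient; the higher the temperature, the more the teacher's relative probabilities over wrong classes ("dark knowledge") influence the student.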