Revisiting Knowledge Distillation via Label Smoothing Regularization · Sep 25, 2019 · Knowledge Distillation, Self-Knowledge Distillation · Code Available
XD: Cross-lingual Knowledge Distillation for Polyglot Sentence Embeddings · Sep 25, 2019 · Knowledge Distillation, Language Modeling · Code Unverified
Extremely Small BERT Models from Mixed-Vocabulary Training · Sep 25, 2019 · Knowledge Distillation, Language Modeling · Code Unverified
Technical report on Conversational Question Answering · Sep 24, 2019 · Conversational Question Answering, Data Augmentation · Code Unverified
FEED: Feature-level Ensemble for Knowledge Distillation · Sep 24, 2019 · Knowledge Distillation · Code Unverified
TinyBERT: Distilling BERT for Natural Language Understanding · Sep 23, 2019 · Knowledge Distillation, Language Modeling · Code Available
Learning Lightweight Pedestrian Detector with Hierarchical Knowledge Distillation · Sep 20, 2019 · Knowledge Distillation, Pedestrian Detection · Code Unverified
Ensemble Knowledge Distillation for Learning Improved and Efficient Networks · Sep 17, 2019 · Ensemble Learning, General Classification · Code Available
Knowledge Transfer Graph for Deep Collaborative Learning · Sep 10, 2019 · Knowledge Distillation, Transfer Learning · Code Available
Accelerating Transformer Decoding via a Hybrid of Self-attention and Recurrent Neural Network · Sep 5, 2019 · Decoder, Knowledge Distillation · Code Unverified
Knowledge distillation for optimization of quantized deep neural networks · Sep 4, 2019 · Knowledge Distillation · Code Unverified
Knowledge Distillation for End-to-End Person Search · Sep 3, 2019 · Knowledge Distillation, Model Compression · Code Available
Online Sensor Hallucination via Knowledge Distillation for Multimodal Image Classification · Aug 28, 2019 · Classification, Decision Making · Code Unverified
Patient Knowledge Distillation for BERT Model Compression · Aug 25, 2019 · Knowledge Distillation, Model Compression · Code Available
Adversarial-Based Knowledge Distillation for Multi-Model Ensemble and Noisy Data Refinement · Aug 22, 2019 · Knowledge Distillation, Missing Labels · Code Unverified
Language Graph Distillation for Low-Resource Machine Translation · Aug 17, 2019 · Knowledge Distillation, Machine Translation · Code Unverified
Knowledge distillation for semi-supervised domain adaptation · Aug 16, 2019 · Domain Adaptation, Knowledge Distillation · Code Unverified
Adaptive Regularization of Labels · Aug 15, 2019 · Data Augmentation, Knowledge Distillation · Code Unverified
Scalable Attentive Sentence-Pair Modeling via Distilled Sentence Embedding · Aug 14, 2019 · Knowledge Distillation, Natural Language Understanding · Code Available
Effective Training of Convolutional Neural Networks with Low-bitwidth Weights and Activations · Aug 10, 2019 · Knowledge Distillation, Quantization · Code Unverified
Knowledge Consistency between Neural Networks and Beyond · Aug 5, 2019 · Knowledge Distillation · Code Unverified
Learning Lightweight Lane Detection CNNs by Self Attention Distillation · Aug 2, 2019 · Knowledge Distillation, Lane Detection · Code Available
Self-Knowledge Distillation in Natural Language Processing · Aug 2, 2019 · Deep Learning, Knowledge Distillation · Code Unverified
GTCOM Neural Machine Translation Systems for WMT19 · Aug 1, 2019 · Knowledge Distillation, Language Modeling · Code Unverified
The NiuTrans Machine Translation Systems for WMT19 · Aug 1, 2019 · Knowledge Distillation, Machine Translation · Code Unverified
Baidu Neural Machine Translation Systems for WMT19 · Aug 1, 2019 · Data Augmentation, Domain Adaptation · Code Unverified
PANLP at MEDIQA 2019: Pre-trained Language Models, Transfer Learning and Knowledge Distillation · Aug 1, 2019 · Knowledge Distillation, Re-Ranking · Code Unverified
Distill-to-Label: Weakly Supervised Instance Labeling Using Knowledge Distillation · Jul 26, 2019 · Breast Cancer Detection, Instance Segmentation · Code Unverified
Distilled Siamese Networks for Visual Tracking · Jul 24, 2019 · Knowledge Distillation, Object Tracking · Code Unverified
Highlight Every Step: Knowledge Distillation via Collaborative Teaching · Jul 23, 2019 · Knowledge Distillation · Code Available
Real-Time Correlation Tracking via Joint Model Compression and Transfer · Jul 23, 2019 · Computational Efficiency, CPU · Code Available
Lifelong GAN: Continual Learning for Conditional Image Generation · Jul 23, 2019 · Conditional Image Generation, Continual Learning · Code Unverified
Similarity-Preserving Knowledge Distillation · Jul 23, 2019 · Knowledge Distillation, Neural Network Compression · Code Unverified
Light Multi-segment Activation for Model Compression · Jul 16, 2019 · Knowledge Distillation, Model Compression · Code Available
Learn Spelling from Teachers: Transferring Knowledge from Language Models to Sequence-to-Sequence Speech Recognition · Jul 13, 2019 · Knowledge Distillation, Language Modeling · Code Unverified
BAM! Born-Again Multi-Task Networks for Natural Language Understanding · Jul 10, 2019 · Knowledge Distillation, Natural Language Understanding · Code Available
Graph-based Knowledge Distillation by Multi-head Attention Network · Jul 4, 2019 · Inductive Bias, Knowledge Distillation · Code Available
Compression of Acoustic Event Detection Models With Quantized Distillation · Jul 1, 2019 · Event Detection, Knowledge Distillation · Code Unverified
Reconstructing Perceived Images from Brain Activity by Visually-guided Cognitive Representation and Adversarial Learning · Jun 27, 2019 · Generative Adversarial Network, Image Reconstruction · Code Unverified
Essence Knowledge Distillation for Speech Recognition · Jun 26, 2019 · Knowledge Distillation, Speech Recognition · Code Unverified
Approximating Interactive Human Evaluation with Self-Play for Open-Domain Dialog Systems · Jun 21, 2019 · Dialogue Evaluation, Knowledge Distillation · Code Available
GAN-Knowledge Distillation for one-stage Object Detection · Jun 20, 2019 · Knowledge Distillation, Object Detection · Code Unverified
Membership Privacy for Machine Learning Models Through Knowledge Transfer · Jun 15, 2019 · BIG-bench Machine Learning, General Classification · Code Unverified
Divide and Conquer: Leveraging Intermediate Feature Representations for Quantized Training of Neural Networks · Jun 14, 2019 · Knowledge Distillation, Quantization · Code Unverified
Scalable Syntax-Aware Language Models Using Knowledge Distillation · Jun 14, 2019 · Knowledge Distillation, Language Modeling · Code Unverified
Efficient Evaluation-Time Uncertainty Estimation by Improved Distillation · Jun 12, 2019 · Knowledge Distillation · Code Unverified
Incremental Classifier Learning Based on PEDCC-Loss and Cosine Distance · Jun 11, 2019 · Incremental Learning, Knowledge Distillation · Code Unverified
Distilling Object Detectors with Fine-grained Feature Imitation · Jun 9, 2019 · Knowledge Distillation, Object Detection · Code Available
Private Deep Learning with Teacher Ensembles · Jun 5, 2019 · Deep Learning, Ensemble Learning · Code Unverified
Deep Face Recognition Model Compression via Knowledge Transfer and Distillation · Jun 3, 2019 · Face Recognition, Knowledge Distillation · Code Unverified
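For context on the list above: most of these papers extend the soft-target objective of Hinton et al. (2015), in which a student network is trained to match the teacher's temperature-softened output distribution alongside the usual hard labels. A minimal PyTorch sketch of that baseline loss follows; the temperature T and mixing weight alpha are illustrative defaults, not values taken from any paper listed here.

import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soften both output distributions with temperature T.
    soft_targets = F.softmax(teacher_logits / T, dim=-1)
    log_student = F.log_softmax(student_logits / T, dim=-1)
    # KL term, scaled by T^2 so its gradient magnitude stays
    # comparable to the cross-entropy term (standard practice).
    kl = F.kl_div(log_student, soft_targets, reduction="batchmean") * (T * T)
    # Ordinary cross-entropy on the hard labels.
    ce = F.cross_entropy(student_logits, labels)
    # alpha and T are typically tuned per task; these defaults are illustrative.
    return alpha * kl + (1.0 - alpha) * ce

The methods listed above mostly replace or augment the KL term, for example with feature-level signals (FEED), attention maps (Self Attention Distillation), or ensembles of teachers.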