([Code] = code available; [Unverified] = no verified code link.)
Random Path Selection for Continual Learning (Dec 1, 2019). Tags: Continual Learning, Incremental Learning
[Code] Knowledge Extraction with No Observable Data (Dec 1, 2019). Tags: Data-free Knowledge Distillation, Knowledge Distillation
[Code] Online Knowledge Distillation with Diverse Peers (Dec 1, 2019). Tags: Knowledge Distillation, Transfer Learning
[Code] Towards Oracle Knowledge Distillation with Neural Architecture Search (Nov 29, 2019). Tags: Image Classification
[Unverified] Distributed Soft Actor-Critic with Multivariate Reward Representation and Knowledge Distillation (Nov 29, 2019). Tags: Knowledge Distillation, Reinforcement Learning
[Code] QKD: Quantization-aware Knowledge Distillation (Nov 28, 2019). Tags: Knowledge Distillation, Quantization
[Unverified] Data-Driven Compression of Convolutional Neural Networks (Nov 28, 2019). Tags: Knowledge Distillation, Model Compression
[Unverified] Hearing Lips: Improving Lip Reading by Distilling Speech Recognizers (Nov 26, 2019). Tags: Knowledge Distillation, Lipreading
[Unverified] Few Shot Network Compression via Cross Distillation (Nov 21, 2019). Tags: Knowledge Distillation, Model Compression
[Code] Search to Distill: Pearls are Everywhere but not the Eyes (Nov 20, 2019). Tags: Ensemble Learning, Face Recognition
[Unverified] Neural Network Pruning with Residual-Connections and Limited-Data (Nov 19, 2019). Tags: Knowledge Distillation, Network Pruning
[Code] Towards Making Deep Transfer Learning Never Hurt (Nov 18, 2019). Tags: Knowledge Distillation
[Unverified] Data Efficient Stagewise Knowledge Distillation (Nov 15, 2019). Tags: Knowledge Distillation, Model Compression
[Code] Collaborative Distillation for Top-N Recommendation (Nov 13, 2019). Tags: Collaborative Filtering, Knowledge Distillation
[Unverified] Knowledge Representing: Efficient, Sparse Representation of Prior Knowledge for Knowledge Distillation (Nov 13, 2019). Tags: Image Classification, Knowledge Distillation
[Unverified] Graph Representation Learning via Multi-task Knowledge Distillation (Nov 11, 2019). Tags: Graph Representation Learning, Knowledge Distillation
[Unverified] Knowledge Distillation in Document Retrieval (Nov 11, 2019). Tags: Knowledge Distillation, Retrieval
[Unverified] MKD: a Multi-Task Knowledge Distillation Approach for Pretrained Language Models (Nov 9, 2019). Tags: Knowledge Distillation, Multi-Task Learning
[Unverified] Knowledge Distillation for Incremental Learning in Semantic Segmentation (Nov 8, 2019). Tags: Image Classification
[Unverified] Deep geometric knowledge distillation with graphs (Nov 8, 2019). Tags: Knowledge Distillation
[Code] Teacher-Student Training for Robust Tacotron-based TTS (Nov 7, 2019). Tags: Decoder, Knowledge Distillation
[Unverified] Understanding Knowledge Distillation in Non-autoregressive Machine Translation (Nov 7, 2019). Tags: Knowledge Distillation, Machine Translation
[Unverified] Microsoft Research Asia's Systems for WMT19 (Nov 7, 2019). Tags: Data Augmentation, Knowledge Distillation
[Unverified] Weakly Supervised Cross-lingual Semantic Relation Classification via Knowledge Distillation (Nov 1, 2019). Tags: Classification, Cross-Lingual Transfer
[Unverified] ESPnet How2 Speech Translation System for IWSLT 2019: Pre-training, Knowledge Distillation, and Going Deeper (Nov 1, 2019). Tags: Knowledge Distillation
[Unverified] Natural Language Generation for Effective Knowledge Distillation (Nov 1, 2019). Tags: Knowledge Distillation, Linguistic Acceptability
[Code] Distilling Pixel-Wise Feature Similarities for Semantic Segmentation (Oct 31, 2019). Tags: Knowledge Distillation, Neural Network Compression
[Unverified] A Simple but Effective BERT Model for Dialog State Tracking on Resource-Limited Systems (Oct 28, 2019). Tags: Dialogue State Tracking
[Unverified] MOD: A Deep Mixture Model with Online Knowledge Distillation for Large Scale Video Temporal Concept Localization (Oct 27, 2019). Tags: Knowledge Distillation, Video Understanding
[Code] Variational Student: Learning Compact and Sparser Networks in Knowledge Distillation Framework (Oct 26, 2019). Tags: Knowledge Distillation, Variational Inference
[Unverified] Secost: Sequential Co-supervision for Large Scale Weakly Labeled Audio Event Detection (Oct 25, 2019). Tags: Event Detection, Knowledge Distillation
[Unverified] An Empirical Study of Efficient ASR Rescoring with Transformers (Oct 24, 2019). Tags: Knowledge Distillation, Language Modeling
[Unverified] Adversarial Feature Alignment: Avoid Catastrophic Forgetting in Incremental Task Lifelong Learning (Oct 24, 2019). Tags: Continual Learning, Image Classification
[Unverified] Model Compression with Two-stage Multi-teacher Knowledge Distillation for Web Question Answering System (Oct 18, 2019). Tags: General Knowledge, Knowledge Distillation
[Unverified] A Generalized and Robust Method Towards Practical Gaze Estimation on Smart Phone (Oct 16, 2019). Tags: Gaze Estimation, Knowledge Distillation
[Unverified] VarGFaceNet: An Efficient Variable Group Convolutional Neural Network for Lightweight Face Recognition (Oct 11, 2019). Tags: Face Detection, Face Identification
[Code] Noise as a Resource for Learning in Knowledge Distillation (Oct 11, 2019). Tags: Knowledge Distillation
[Unverified] Cross-modal knowledge distillation for action recognition (Oct 10, 2019). Tags: Action Recognition, Knowledge Distillation
[Unverified] Knowledge Distillation from Internal Representations (Oct 8, 2019). Tags: Knowledge Distillation
[Unverified] Distilling BERT into Simple Neural Networks with Unlabeled Transfer Data (Oct 4, 2019). Tags: Knowledge Distillation, NER
[Unverified] On the Efficacy of Knowledge Distillation (Oct 3, 2019). Tags: Knowledge Distillation
[Unverified] AntMan: Sparse Low-Rank Compression to Accelerate RNN Inference (Oct 2, 2019). Tags: Knowledge Distillation, Low-rank Compression
[Unverified] Improving Word Embedding Factorization for Compression Using Distilled Nonlinear Neural Decomposition (Oct 2, 2019). Tags: Knowledge Distillation, Language Modeling
[Unverified] A Bayesian Optimization Framework for Neural Network Compression (Oct 1, 2019). Tags: Bayesian Optimization, Knowledge Distillation
[Unverified] Training convolutional neural networks with cheap convolutions and online distillation (Sep 28, 2019). Tags: Knowledge Distillation
[Code] Compact Trilinear Interaction for Visual Question Answering (Sep 26, 2019). Tags: Benchmarking, Knowledge Distillation
[Code] Proactive Sequence Generator via Knowledge Acquisition (Sep 25, 2019). Tags: de-en, Knowledge Distillation
[Unverified] Self-Knowledge Distillation Adversarial Attack (Sep 25, 2019). Tags: Adversarial Attack, Knowledge Distillation
[Unverified] Distilled embedding: non-linear embedding factorization using knowledge distillation (Sep 25, 2019). Tags: Knowledge Distillation, Machine Translation
[Unverified] Collaborative Inter-agent Knowledge Distillation for Reinforcement Learning (Sep 25, 2019). Tags: Decision Making, Knowledge Distillation