- End-to-End Speech-Translation with Knowledge Distillation: FBK@IWSLT2020 (Jun 4, 2020). Tags: Data Augmentation, Knowledge Distillation.
- Channel Distillation: Channel-Wise Attention for Knowledge Distillation (Jun 2, 2020). Code available. Tags: Knowledge Distillation.
- Learning Voice Representation Using Knowledge Distillation for Automatic Voice Casting (original French title: "Apprentissage automatique de représentation de voix à l'aide d'une distillation de la connaissance pour le casting vocal") (Jun 1, 2020). Tags: Knowledge Distillation.
- Online Knowledge Distillation via Collaborative Learning (Jun 1, 2020). Code available. Tags: Knowledge Distillation, Model Compression.
- ADINet: Attribute Driven Incremental Network for Retinal Image Classification (Jun 1, 2020). Tags: Attribute Classification.
- Distilling Image Dehazing With Heterogeneous Task Imitation (Jun 1, 2020). Code available. Tags: Image Classification.
- Distilling Cross-Task Knowledge via Relationship Matching (Jun 1, 2020). Code available. Tags: Knowledge Distillation.
- Block-Wisely Supervised Neural Architecture Search With Knowledge Distillation (Jun 1, 2020). Code available. Tags: Knowledge Distillation, Neural Architecture Search.
- Transferring Inductive Biases through Knowledge Distillation (May 31, 2020). Code available. Tags: Knowledge Distillation.
- Weight Squeezing: Reparameterization for Compression and Fast Inference (May 30, 2020). Tags: Knowledge Distillation, Model Compression.
- Sub-Band Knowledge Distillation Framework for Speech Enhancement (May 29, 2020). Tags: Knowledge Distillation, Speech Enhancement.
- Syntactic Structure Distillation Pretraining For Bidirectional Encoders (May 27, 2020). Tags: Knowledge Distillation, Language Modeling.
- Why distillation helps: a statistical perspective (May 21, 2020). Tags: Knowledge Distillation, Retrieval.
- Distilling Knowledge from Ensembles of Acoustic Models for Joint CTC-Attention End-to-End Speech Recognition (May 19, 2020). Code available. Tags: Automatic Speech Recognition (ASR).
- Learning from a Lightweight Teacher for Efficient Knowledge Distillation (May 19, 2020). Tags: Knowledge Distillation.
- MicroNet for Efficient Language Modeling (May 16, 2020). Code available. Tags: Knowledge Distillation, Language Modeling.
- Joint Progressive Knowledge Distillation and Unsupervised Domain Adaptation (May 16, 2020). Code available. Tags: Domain Adaptation, Knowledge Distillation.
- Incremental Learning for End-to-End Automatic Speech Recognition (May 11, 2020). Tags: Automatic Speech Recognition (ASR).
- Data-Free Network Quantization With Adversarial Knowledge Distillation (May 8, 2020). Code available. Tags: Knowledge Distillation, Model Compression.
- Distilling Knowledge from Pre-trained Language Models via Text Smoothing (May 8, 2020). Tags: Knowledge Distillation, Language Modeling.
- ProSelfLC: Progressive Self Label Correction for Training Robust Deep Neural Networks (May 7, 2020). Code available. Tags: Knowledge Distillation, Self-Knowledge Distillation.
- MAZE: Data-Free Model Stealing Attack Using Zeroth-Order Gradient Estimation (May 6, 2020). Code available. Tags: Data-Free Knowledge Distillation, Knowledge Distillation.
- Heterogeneous Knowledge Distillation using Information Flow Modeling (May 2, 2020). Code available. Tags: Knowledge Distillation.
- Improving Non-autoregressive Neural Machine Translation with Monolingual Data (May 2, 2020). Tags: Data Augmentation, Knowledge Distillation.
- Distilling Spikes: Knowledge Distillation in Spiking Neural Networks (May 1, 2020). Tags: Image Classification.
- Language Model Prior for Low-Resource Neural Machine Translation (Apr 30, 2020). Code available. Tags: Knowledge Distillation, Language Modeling.
- General Purpose Text Embeddings from Pre-trained Language Models for Scalable Inference (Apr 29, 2020). Tags: Knowledge Distillation, Quantization.
- LightPAFF: A Two-Stage Distillation Framework for Pre-training and Fine-tuning (Apr 27, 2020). Tags: Knowledge Distillation, Language Modeling.
- A Tailored Pre-Training Model for Task-Oriented Dialog Generation (Apr 24, 2020). Code available. Tags: Knowledge Distillation, Language Modeling.
- Distilling Knowledge from Refinement in Multiple Instance Detection Networks (Apr 23, 2020). Code available. Tags: Knowledge Distillation, Multiple Instance Learning.
- A Study of Non-autoregressive Model for Sequence Generation (Apr 22, 2020). Tags: Automatic Speech Recognition (ASR).
- Making Monolingual Sentence Embeddings Multilingual using Knowledge Distillation (Apr 21, 2020). Code available. Tags: Knowledge Distillation, Sentence.
- Knowledge Distillation for Multilingual Unsupervised Neural Machine Translation (Apr 21, 2020). Tags: Decoder, Knowledge Distillation.
- Role-Wise Data Augmentation for Knowledge Distillation (Apr 19, 2020). Code available. Tags: Data Augmentation, Knowledge Distillation.
- Triplet Loss for Knowledge Distillation (Apr 17, 2020). Code available. Tags: Knowledge Distillation, Metric Learning.
- Multimodal and multiview distillation for real-time player detection on a football field (Apr 16, 2020). Code available. Tags: Data Augmentation, Knowledge Distillation.
- Knowledge Distillation for Action Anticipation via Label Smoothing (Apr 16, 2020). Tags: Action Anticipation, Autonomous Driving.
- Dark Experience for General Continual Learning: a Strong, Simple Baseline (Apr 15, 2020). Code available. Tags: Class-Incremental Learning.
- Building a Multi-domain Neural Machine Translation Model using Knowledge Distillation (Apr 15, 2020). Tags: Domain Adaptation, Knowledge Distillation.
- Smart Inference for Multidigit Convolutional Neural Network based Barcode Decoding (Apr 14, 2020). Tags: Knowledge Distillation.
- Towards Robust Classification with Image Quality Assessment (Apr 14, 2020). Tags: General Classification.
- Knowledge Distillation and Student-Teacher Learning for Visual Intelligence: A Review and New Outlooks (Apr 13, 2020). Code available. Tags: Knowledge Distillation, Model Compression.
- XtremeDistil: Multi-stage Distillation for Massive Multilingual Models (Apr 12, 2020). Tags: Knowledge Distillation, Named Entity Recognition.
- KD-MRI: A knowledge distillation framework for image reconstruction and image restoration in MRI workflow (Apr 11, 2020). Code available. Tags: CPU, GPU.
- Inter-Region Affinity Distillation for Road Marking Segmentation (Apr 11, 2020). Code available. Tags: Knowledge Distillation, Lane Detection.
- Knowledge Distillation for Mobile Edge Computation Offloading (Apr 9, 2020). Tags: Imitation Learning, Knowledge Distillation.
- On the Effect of Dropping Layers of Pre-trained Transformer Models (Apr 8, 2020). Code available. Tags: Knowledge Distillation, Sentence.
- LadaBERT: Lightweight Adaptation of BERT through Hybrid Model Compression (Apr 8, 2020). Tags: Blocking, Knowledge Distillation.
- Structure-Level Knowledge Distillation For Multilingual Sequence Labeling (Apr 8, 2020). Code available. Tags: Aspect Extraction, Knowledge Distillation.
- Towards Efficient Unconstrained Palmprint Recognition via Deep Distillation Hashing (Apr 7, 2020). Code available. Tags: Knowledge Distillation.
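Nearly every entry above is tagged Knowledge Distillation. As background for readers new to the topic, here is a minimal NumPy sketch of the classic soft-label distillation loss popularized by Hinton et al.: a temperature-softened KL term between teacher and student, mixed with the usual hard-label cross-entropy. The function names and the hyperparameters `T` and `alpha` are illustrative choices of this sketch, not taken from any listed paper.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T produces a softer distribution.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Hinton-style KD loss:
    alpha * T^2 * KL(teacher_soft || student_soft) + (1 - alpha) * hard CE.
    The T^2 factor keeps the soft-term gradients on the same scale as the hard term.
    """
    p_t = softmax(teacher_logits, T)  # softened teacher probabilities
    p_s = softmax(student_logits, T)  # softened student probabilities
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    # Standard cross-entropy of the student (T = 1) against the ground-truth labels.
    hard = -np.log(softmax(student_logits)[np.arange(len(labels)), labels] + 1e-12)
    return float(np.mean(alpha * (T ** 2) * kl + (1 - alpha) * hard))
```

When teacher and student logits agree, the KL term vanishes and only the hard cross-entropy (weighted by 1 - alpha) remains; a disagreeing teacher strictly increases the loss.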