- Data-Efficient Ranking Distillation for Image Retrieval (Jul 10, 2020). Tags: Image Retrieval, Knowledge Distillation
- Knowledge Distillation Beyond Model Compression (Jul 3, 2020). Tags: Knowledge Distillation, Model
- Interactive Knowledge Distillation (Jul 3, 2020). Tags: Image Classification
- SimulSpeech: End-to-End Simultaneous Speech to Text Translation (Jul 1, 2020). Tags: Automatic Speech Recognition (ASR)
- Improving Autoregressive NMT with Non-Autoregressive Model (Jul 1, 2020). Tags: Decoder, de-en
- Xiaomi's Submissions for IWSLT 2020 Open Domain Translation Task (Jul 1, 2020). Tags: Domain Adaptation, Knowledge Distillation
- Exploring the Limits of Simple Learners in Knowledge Distillation for Document Classification with DocBERT (Jul 1, 2020). Tags: Document Classification, General Classification
- CASIA's System for IWSLT 2020 Open Domain Translation (Jul 1, 2020). Tags: Knowledge Distillation, Machine Translation
- Extracurricular Learning: Knowledge Transfer Beyond Empirical Distribution (Jun 30, 2020). Tags: Image Classification, Knowledge Distillation
- On the Demystification of Knowledge Distillation: A Residual Network Perspective (Jun 30, 2020). Tags: Knowledge Distillation, Model Compression
- Interpreting and Disentangling Feature Components of Various Complexity from DNNs (Jun 29, 2020). Tags: Knowledge Distillation
- Motion Pyramid Networks for Accurate and Efficient Cardiac Motion Estimation (Jun 28, 2020). Tags: Knowledge Distillation, Motion Estimation. [Code available]
- Diverse Knowledge Distillation (DKD): A Solution for Improving The Robustness of Ensemble Models Against Adversarial Attacks (Jun 26, 2020). Tags: Ensemble Learning, Image Classification
- Streaming Transformer ASR with Blockwise Synchronous Inference (Jun 25, 2020). Tags: Automatic Speech Recognition (ASR)
- Distilling Object Detectors with Task Adaptive Regularization (Jun 23, 2020). Tags: Knowledge Distillation, Object
- Prior knowledge distillation based on financial time series (Jun 16, 2020). Tags: Knowledge Distillation, Time Series
- Multi-fidelity Neural Architecture Search with Knowledge Distillation (Jun 15, 2020). Tags: Knowledge Distillation, Neural Architecture Search
- Pixel Invisibility: Detecting Objects Invisible in Color Images (Jun 15, 2020). Tags: Knowledge Distillation, Object Detection. [Code available]
- Ensemble Distillation for Robust Model Fusion in Federated Learning (Jun 12, 2020). Tags: BIG-bench Machine Learning, Federated Learning
- Knowledge Distillation: A Survey (Jun 9, 2020). Tags: Knowledge Distillation, Model Compression. [Code available]
- Continual Representation Learning for Biometric Identification (Jun 8, 2020). Tags: Continual Learning, Knowledge Distillation
- Classification Under Misspecification: Halfspaces, Generalized Linear Models, and Connections to Evolvability (Jun 8, 2020). Tags: Fairness, General Classification. [Code available]
- ResKD: Residual-Guided Knowledge Distillation (Jun 8, 2020). Tags: Knowledge Distillation. [Code available]
- ADMP: An Adversarial Double Masks Based Pruning Framework For Unsupervised Cross-Domain Compression (Jun 7, 2020). Tags: Domain Adaptation, Knowledge Distillation
- An Empirical Analysis of the Impact of Data Augmentation on Knowledge Distillation (Jun 6, 2020). Tags: Data Augmentation, Knowledge Distillation
- An Overview of Neural Network Compression (Jun 5, 2020). Tags: Knowledge Distillation, Model Compression
- End-to-End Speech-Translation with Knowledge Distillation: FBK@IWSLT2020 (Jun 4, 2020). Tags: Data Augmentation, Knowledge Distillation
- Distilling Image Dehazing With Heterogeneous Task Imitation (Jun 1, 2020). Tags: Image Classification
- Learning Voice Representation Using Knowledge Distillation for Automatic Voice Casting (original French title: "Apprentissage automatique de représentation de voix à l'aide d'une distillation de la connaissance pour le casting vocal") (Jun 1, 2020). Tags: Knowledge Distillation. [Code available]
- ADINet: Attribute Driven Incremental Network for Retinal Image Classification (Jun 1, 2020). Tags: Attribute, Classification
- Weight Squeezing: Reparameterization for Compression and Fast Inference (May 30, 2020). Tags: Knowledge Distillation, Model Compression
- Sub-Band Knowledge Distillation Framework for Speech Enhancement (May 29, 2020). Tags: Knowledge Distillation, Speech Enhancement
- Syntactic Structure Distillation Pretraining For Bidirectional Encoders (May 27, 2020). Tags: Knowledge Distillation, Language Modeling
- Why distillation helps: a statistical perspective (May 21, 2020). Tags: Knowledge Distillation, Retrieval
- Learning from a Lightweight Teacher for Efficient Knowledge Distillation (May 19, 2020). Tags: Knowledge Distillation
- Joint Progressive Knowledge Distillation and Unsupervised Domain Adaptation (May 16, 2020). Tags: Domain Adaptation, Knowledge Distillation
- Incremental Learning for End-to-End Automatic Speech Recognition (May 11, 2020). Tags: Automatic Speech Recognition (ASR). [Code available]
- Distilling Knowledge from Pre-trained Language Models via Text Smoothing (May 8, 2020). Tags: Knowledge Distillation, Language Modeling
- Improving Non-autoregressive Neural Machine Translation with Monolingual Data (May 2, 2020). Tags: Data Augmentation, Knowledge Distillation
- Distilling Spikes: Knowledge Distillation in Spiking Neural Networks (May 1, 2020). Tags: Image Classification
- General Purpose Text Embeddings from Pre-trained Language Models for Scalable Inference (Apr 29, 2020). Tags: Knowledge Distillation, Quantization
- LightPAFF: A Two-Stage Distillation Framework for Pre-training and Fine-tuning (Apr 27, 2020). Tags: Knowledge Distillation, Language Modeling
- A Tailored Pre-Training Model for Task-Oriented Dialog Generation (Apr 24, 2020). Tags: Knowledge Distillation, Language Modeling
- A Study of Non-autoregressive Model for Sequence Generation (Apr 22, 2020). Tags: Automatic Speech Recognition (ASR). [Code available]
- Knowledge Distillation for Multilingual Unsupervised Neural Machine Translation (Apr 21, 2020). Tags: Decoder, Knowledge Distillation
- Knowledge Distillation for Action Anticipation via Label Smoothing (Apr 16, 2020). Tags: Action Anticipation, Autonomous Driving
- Building a Multi-domain Neural Machine Translation Model using Knowledge Distillation (Apr 15, 2020). Tags: Domain Adaptation, Knowledge Distillation
- Towards Robust Classification with Image Quality Assessment (Apr 14, 2020). Tags: Classification, General Classification
- Smart Inference for Multidigit Convolutional Neural Network based Barcode Decoding (Apr 14, 2020). Tags: Knowledge Distillation
- XtremeDistil: Multi-stage Distillation for Massive Multilingual Models (Apr 12, 2020). Tags: Knowledge Distillation, Named Entity Recognition