Entries marked [Code] have code available.

Large-Scale Generative Data-Free Distillation | Dec 10, 2020 | Knowledge Distillation, Model Compression
On Knowledge Distillation for Direct Speech Translation | Dec 9, 2020 | Automatic Speech Recognition (ASR)
Model Compression Using Optimal Transport | Dec 7, 2020 | Image Classification
[Code] Parallel Blockwise Knowledge Distillation for Deep Neural Network Compression | Dec 5, 2020 | Knowledge Distillation, Neural Network Compression
[Code] Reciprocal Supervised Learning Improves Neural Machine Translation | Dec 5, 2020 | Image Classification
Multi-head Knowledge Distillation for Model Compression | Dec 5, 2020 | Image Classification
Meta-KD: A Meta Knowledge Distillation Framework for Language Model Compression across Domains | Dec 2, 2020 | Knowledge Distillation, Language Modeling
Self-Supervised Generative Adversarial Compression | Dec 1, 2020 | Image Classification
Solvable Model for Inheriting the Regularization through Knowledge Distillation | Dec 1, 2020 | Knowledge Distillation, Transfer Learning
Query Distillation: BERT-based Distillation for Ensemble Ranking | Dec 1, 2020 | Knowledge Distillation
Classification Under Misspecification: Halfspaces, Generalized Linear Models, and Evolvability | Dec 1, 2020 | Classification, Fairness
Reverse-engineering recurrent neural network solutions to a hierarchical inference task for mice | Dec 1, 2020 | Knowledge Distillation, Model Compression
A Selective Survey on Versatile Knowledge Distillation Paradigm for Neural Network Models | Nov 30, 2020 | Knowledge Distillation, Model Compression
Real-time Spatio-temporal Action Localization via Learning Motion Representation | Nov 30, 2020 | Action Classification, Action Localization
Adaptive Multiplane Image Generation from a Single Internet Picture | Nov 26, 2020 | Depth Estimation, Image Generation
torchdistill: A Modular, Configuration-Driven Framework for Knowledge Distillation | Nov 25, 2020 | Image Classification, Instance Segmentation
Generative Adversarial Simulator | Nov 23, 2020 | Data-free Knowledge Distillation, Knowledge Distillation
MixMix: All You Need for Data-Free Compression Are Feature and Data Mixing | Nov 19, 2020 | Knowledge Distillation
[Code] A Knowledge Distillation Ensemble Framework for Predicting Short and Long-term Hospitalisation Outcomes from Electronic Health Records Data | Nov 18, 2020 | Decision Making, ICU Admission
Privileged Knowledge Distillation for Online Action Detection | Nov 18, 2020 | Action Detection, Knowledge Distillation
Effectiveness of Arbitrary Transfer Sets for Data-free Knowledge Distillation | Nov 18, 2020 | Data-free Knowledge Distillation, Knowledge Distillation
Generalized Continual Zero-Shot Learning | Nov 17, 2020 | Continual Learning, Knowledge Distillation
Deep Serial Number: Computational Watermarking for DNN Intellectual Property Protection | Nov 17, 2020 | Knowledge Distillation
Digging Deeper into CRNN Model in Chinese Text Images Recognition | Nov 17, 2020 | Denoising, Knowledge Distillation
[Code] Online Ensemble Model Compression using Knowledge Distillation | Nov 15, 2020 | Knowledge Distillation
[Code] Real-Time Decentralized Knowledge Transfer at the Edge | Nov 11, 2020 | Knowledge Distillation, Transfer Learning
[Code] EGAD: Evolving Graph Representation Learning with Self-Attention and Knowledge Distillation for Live Video Streaming Events | Nov 11, 2020 | Graph Representation Learning, Knowledge Distillation
[Code] Distill2Vec: Dynamic Graph Representation Learning with Knowledge Distillation | Nov 11, 2020 | Graph Representation Learning, Knowledge Distillation
On Estimating the Training Cost of Conversational Recommendation Systems | Nov 10, 2020 | Conversational Recommendation, Knowledge Distillation
[Code] Knowledge Distillation for Singing Voice Detection | Nov 9, 2020 | Information Retrieval, Knowledge Distillation
Ensemble Knowledge Distillation for CTR Prediction | Nov 8, 2020 | Click-Through Rate Prediction, Knowledge Distillation
[Code] Robustness and Diversity Seeking Data-Free Knowledge Distillation | Nov 7, 2020 | Data-free Knowledge Distillation, Diversity
Human-Like Active Learning: Machines Simulating the Human Learning Process | Nov 7, 2020 | Active Learning
Channel Planting for Deep Neural Networks using Knowledge Distillation | Nov 4, 2020 | Knowledge Distillation, Network Pruning
On Self-Distilling Graph Neural Network | Nov 4, 2020 | Graph Embedding, Graph Neural Network
Paralinguistic Privacy Protection at the Edge | Nov 4, 2020 | Knowledge Distillation
A Comprehensive Study of Class Incremental Learning Algorithms for Visual Tasks | Nov 3, 2020 | Class-Incremental Learning
[Code] Distilling Knowledge by Mimicking Features | Nov 3, 2020 | Knowledge Distillation, Object Detection
Learning to Maximize Speech Quality Directly Using MOS Prediction for Neural Text-to-Speech | Nov 2, 2020 | Knowledge Distillation, Speech Synthesis
[Code] Data-free Knowledge Distillation for Segmentation using Data-Enriching GAN | Nov 2, 2020 | Data-free Knowledge Distillation, Diversity
The NiuTrans Machine Translation Systems for WMT20 | Nov 1, 2020 | Knowledge Distillation, Machine Translation
IIE’s Neural Machine Translation Systems for WMT20 | Nov 1, 2020 | Domain Adaptation, Knowledge Distillation
HW-TSC’s Participation in the WMT 2020 News Translation Shared Task | Nov 1, 2020 | Knowledge Distillation, Translation
High Performance Natural Language Processing | Nov 1, 2020 | Knowledge Distillation, Quantization
Using the Past Knowledge to Improve Sentiment Classification | Nov 1, 2020 | Classification, Knowledge Distillation
Distilling Structured Knowledge for Text-Based Relational Reasoning | Nov 1, 2020 | Contrastive Learning, Knowledge Distillation
Fast End-to-end Coreference Resolution for Korean | Nov 1, 2020 | Coreference Resolution
Bridging the Gap between Prior and Posterior Knowledge Selection for Knowledge-Grounded Dialogue Generation | Nov 1, 2020 | Decoder, Dialogue Generation
FedED: Federated Learning via Ensemble Distillation for Medical Relation Extraction | Nov 1, 2020 | Federated Learning, Knowledge Distillation
MixKD: Towards Efficient Distillation of Large-scale Language Models | Nov 1, 2020 | Data Augmentation, Knowledge Distillation