- ProxylessKD: Direct Knowledge Distillation with Inherited Classifier for Face Recognition (Oct 31, 2020) [Face Recognition, Knowledge Distillation]
- Cross-lingual Machine Reading Comprehension with Language Branch Knowledge Distillation (Oct 27, 2020) [Knowledge Distillation, Machine Reading Comprehension]
- Activation Map Adaptation for Effective Knowledge Distillation (Oct 26, 2020) [Knowledge Distillation, Model Compression]
- Empowering Knowledge Distillation via Open Set Recognition for Robust 3D Point Cloud Classification (Oct 25, 2020) [3D Point Cloud Classification, General Classification]
- Two-stage Textual Knowledge Distillation for End-to-End Spoken Language Understanding (Oct 25, 2020) [Automatic Speech Recognition (ASR)]
- [Code] Pre-trained Summarization Distillation (Oct 24, 2020) [Knowledge Distillation, Machine Translation]
- [Code] Improved Synthetic Training for Reading Comprehension (Oct 24, 2020) [Knowledge Distillation, Machine Reading Comprehension]
- Iterative Graph Self-Distillation (Oct 23, 2020) [Contrastive Learning, Graph Learning]
- Generating Long Financial Report using Conditional Variational Autoencoders with Knowledge Distillation (Oct 23, 2020) [Decoder, Knowledge Distillation]
- Knowledge Distillation for Improved Accuracy in Spoken Question Answering (Oct 21, 2020) [Automatic Speech Recognition (ASR)]
- Contextualized Attention-based Knowledge Transfer for Spoken Conversational Question Answering (Oct 21, 2020) [Audio Signal Processing, Conversational Question Answering]
- Knowledge Distillation in Wide Neural Networks: Risk Bound, Data Efficiency and Imperfect Teacher (Oct 20, 2020) [Knowledge Distillation, Model Compression]
- Fast Video Salient Object Detection via Spatiotemporal Knowledge Distillation (Oct 20, 2020) [Knowledge Distillation, Object]
- Edge Bias in Federated Learning and its Solution by Buffered Knowledge Distillation (Oct 20, 2020) [Federated Learning, Knowledge Distillation]
- Noisy Neural Network Compression for Analog Storage Devices (Oct 19, 2020) [Knowledge Distillation, Model Compression]
- Comparing Fisher Information Regularization with Distillation for DNN Quantization (Oct 19, 2020) [Knowledge Distillation, Quantization]
- Infusing Sequential Information into Conditional Masked Translation Model with Self-Review Mechanism (Oct 19, 2020) [Decoder, Knowledge Distillation]
- [Code] AutoADR: Automatic Model Design for Ad Relevance (Oct 14, 2020) [AutoML, Knowledge Distillation]
- MulDE: Multi-teacher Knowledge Distillation for Low-dimensional Knowledge Graph Embeddings (Oct 14, 2020) [Graph Embedding, Knowledge Distillation]
- Dual-mode ASR: Unify and Improve Streaming ASR with Full-context Modeling (Oct 12, 2020) [Automatic Speech Recognition (ASR)]
- Collective Wisdom: Improving Low-resource Neural Machine Translation using Adaptive Knowledge Distillation (Oct 12, 2020) [Knowledge Distillation, Low Resource Neural Machine Translation]
- Structural Knowledge Distillation: Tractably Distilling Information for Structured Predictor (Oct 10, 2020) [Dependency Parsing, Knowledge Distillation]
- [Code] Adversarial Self-Supervised Data-Free Distillation for Text Classification (Oct 10, 2020) [Classification, General Classification]
- Distilling a Deep Neural Network into a Takagi-Sugeno-Kang Fuzzy Inference System (Oct 10, 2020) [General Classification, Knowledge Distillation]
- Locally Linear Region Knowledge Distillation (Oct 9, 2020) [Knowledge Distillation]
- Be Your Own Best Competitor! Multi-Branched Adversarial Knowledge Transfer (Oct 9, 2020) [Decoder, Image Classification]
- Galileo at SemEval-2020 Task 12: Multi-lingual Learning for Offensive Language Identification using Pre-trained Language Models (Oct 7, 2020) [Knowledge Distillation]
- DiPair: Fast and Accurate Distillation for Trillion-Scale Text Matching and Pair Modeling (Oct 7, 2020) [Knowledge Distillation, Question Answering]
- Deep Representation Learning of Patient Data from Electronic Health Records (EHR): A Systematic Review (Oct 6, 2020) [Deep Learning]
- Why Skip If You Can Combine: A Simple Knowledge Distillation Technique for Intermediate Layers (Oct 6, 2020) [Knowledge Distillation, Machine Translation]
- [Code] A Survey on Deep Neural Network Compression: Challenges, Overview, and Solutions (Oct 5, 2020) [Knowledge Distillation, Miscellaneous]
- Towards Cross-modality Medical Image Segmentation with Online Mutual Knowledge Distillation (Oct 4, 2020) [Cardiac Segmentation, Image Segmentation]
- Neighbourhood Distillation: On the benefits of non end-to-end distillation (Oct 2, 2020) [Knowledge Distillation, Neural Architecture Search]
- Online Knowledge Distillation via Multi-branch Diversity Enhancement (Oct 2, 2020) [Diversity, Image Classification]
- WeChat Neural Machine Translation Systems for WMT20 (Oct 1, 2020) [Knowledge Distillation, Machine Translation]
- Improved Knowledge Distillation via Full Kernel Matrix Transfer (Sep 30, 2020) [Knowledge Distillation, Model Compression]
- [Code] Pea-KD: Parameter-efficient and Accurate Knowledge Distillation on BERT (Sep 30, 2020) [Knowledge Distillation, Model Compression]
- Stochastic Precision Ensemble: Self-Knowledge Distillation for Quantized Deep Neural Networks (Sep 30, 2020) [Image Classification]
- Distillation of Weighted Automata from Recurrent Neural Networks using a Spectral Approach (Sep 28, 2020) [Knowledge Distillation, Language Modelling]
- Kernel Based Progressive Distillation for Adder Neural Networks (Sep 28, 2020) [Knowledge Distillation]
- Pea-KD: Parameter-efficient and accurate Knowledge Distillation (Sep 28, 2020) [Knowledge Distillation, Model Compression]
- TernaryBERT: Distillation-aware Ultra-low Bit BERT (Sep 27, 2020) [Knowledge Distillation, Quantization]
- [Code] Multi-Frame to Single-Frame: Knowledge Distillation for 3D Object Detection (Sep 24, 2020) [3D Object Detection, Autonomous Driving]
- Sim-to-Real Transfer in Deep Reinforcement Learning for Robotics: a Survey (Sep 24, 2020) [Deep Reinforcement Learning, Domain Adaptation]
- Open-set Short Utterance Forensic Speaker Verification using Teacher-Student Network with Explicit Inductive Bias (Sep 21, 2020) [Inductive Bias, Knowledge Distillation]
- EI-MTD: Moving Target Defense for Edge Intelligence against Adversarial Attacks (Sep 19, 2020) [Knowledge Distillation, Scheduling]
- Weight Distillation: Transferring the Knowledge in Neural Network Parameters (Sep 19, 2020) [Knowledge Distillation, Machine Translation]
- Introspective Learning by Distilling Knowledge from Online Self-explanation (Sep 19, 2020) [Knowledge Distillation]
- Efficient Transformer-based Large Scale Language Representations using Hardware-friendly Block Structured Pruning (Sep 17, 2020) [Edge Computing, Knowledge Distillation]
- Mimic and Conquer: Heterogeneous Tree Structure Distillation for Syntactic NLP (Sep 16, 2020) [Knowledge Distillation]