Be Your Own Best Competitor! Multi-Branched Adversarial Knowledge Transfer | Oct 9, 2020 | Decoder, image-classification
[Unverified] DiPair: Fast and Accurate Distillation for Trillion-Scale Text Matching and Pair Modeling | Oct 7, 2020 | Knowledge Distillation, Question Answering
[Unverified] Galileo at SemEval-2020 Task 12: Multi-lingual Learning for Offensive Language Identification using Pre-trained Language Models | Oct 7, 2020 | All, Knowledge Distillation
[Unverified] Improving Efficient Neural Ranking Models with Cross-Architecture Knowledge Distillation | Oct 6, 2020 | Knowledge Distillation, Passage Ranking
[Code Available] Deep Representation Learning of Patient Data from Electronic Health Records (EHR): A Systematic Review | Oct 6, 2020 | Articles, Deep Learning
[Unverified] Why Skip If You Can Combine: A Simple Knowledge Distillation Technique for Intermediate Layers | Oct 6, 2020 | Knowledge Distillation, Machine Translation
[Code Available] A Survey on Deep Neural Network Compression: Challenges, Overview, and Solutions | Oct 5, 2020 | Knowledge Distillation, Miscellaneous
[Unverified] Improving Neural Topic Models using Knowledge Distillation | Oct 5, 2020 | Knowledge Distillation, Topic Models
[Code Available] Self-training Improves Pre-training for Natural Language Understanding | Oct 5, 2020 | Data Augmentation, Few-Shot Learning
[Code Available] Lifelong Language Knowledge Distillation | Oct 5, 2020 | Knowledge Distillation, Language Modelling
[Code Available] Towards Cross-modality Medical Image Segmentation with Online Mutual Knowledge Distillation | Oct 4, 2020 | Cardiac Segmentation, Image Segmentation
[Unverified] Neighbourhood Distillation: On the benefits of non end-to-end distillation | Oct 2, 2020 | Knowledge Distillation, Neural Architecture Search
[Unverified] Online Knowledge Distillation via Multi-branch Diversity Enhancement | Oct 2, 2020 | Diversity, image-classification
[Unverified] WeChat Neural Machine Translation Systems for WMT20 | Oct 1, 2020 | Knowledge Distillation, Machine Translation
[Unverified] Improved Knowledge Distillation via Full Kernel Matrix Transfer | Sep 30, 2020 | Knowledge Distillation, Model Compression
[Code Available] Stochastic Precision Ensemble: Self-Knowledge Distillation for Quantized Deep Neural Networks | Sep 30, 2020 | image-classification, Image Classification
[Unverified] Pea-KD: Parameter-efficient and Accurate Knowledge Distillation on BERT | Sep 30, 2020 | Knowledge Distillation, Model Compression
[Unverified] TinyGAN: Distilling BigGAN for Conditional Image Generation | Sep 29, 2020 | Conditional Image Generation, Image Generation
[Code Available] Contrastive Distillation on Intermediate Representations for Language Model Compression | Sep 29, 2020 | Knowledge Distillation, Language Modeling
[Code Available] Pea-KD: Parameter-efficient and accurate Knowledge Distillation | Sep 28, 2020 | Knowledge Distillation, Model Compression
[Unverified] Kernel Based Progressive Distillation for Adder Neural Networks | Sep 28, 2020 | Knowledge Distillation
[Unverified] Distillation of Weighted Automata from Recurrent Neural Networks using a Spectral Approach | Sep 28, 2020 | Knowledge Distillation, Language Modelling
[Unverified] TernaryBERT: Distillation-aware Ultra-low Bit BERT | Sep 27, 2020 | Knowledge Distillation, Quantization
[Code Available] N-LTP: An Open-source Neural Language Technology Platform for Chinese | Sep 24, 2020 | Chinese Word Segmentation, Dependency Parsing
[Code Available] Sim-to-Real Transfer in Deep Reinforcement Learning for Robotics: a Survey | Sep 24, 2020 | Deep Reinforcement Learning, Domain Adaptation
[Unverified] Multi-Frame to Single-Frame: Knowledge Distillation for 3D Object Detection | Sep 24, 2020 | 3D Object Detection, Autonomous Driving
[Unverified] Open-set Short Utterance Forensic Speaker Verification using Teacher-Student Network with Explicit Inductive Bias | Sep 21, 2020 | Inductive Bias, Knowledge Distillation
[Unverified] EI-MTD: Moving Target Defense for Edge Intelligence against Adversarial Attacks | Sep 19, 2020 | Knowledge Distillation, Scheduling
[Unverified] Weight Distillation: Transferring the Knowledge in Neural Network Parameters | Sep 19, 2020 | Knowledge Distillation, Machine Translation
[Unverified] Introspective Learning by Distilling Knowledge from Online Self-explanation | Sep 19, 2020 | Knowledge Distillation
[Unverified] Densely Guided Knowledge Distillation using Multiple Teacher Assistants | Sep 18, 2020 | Knowledge Distillation, Model Compression
[Code Available] Efficient Transformer-based Large Scale Language Representations using Hardware-friendly Block Structured Pruning | Sep 17, 2020 | Edge-computing, Knowledge Distillation
[Unverified] MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks | Sep 17, 2020 | Image Classification, Knowledge Distillation
[Code Available] S2SD: Simultaneous Similarity-based Self-Distillation for Deep Metric Learning | Sep 17, 2020 | Knowledge Distillation, Metric Learning
[Code Available] Mimic and Conquer: Heterogeneous Tree Structure Distillation for Syntactic NLP | Sep 16, 2020 | Knowledge Distillation
[Unverified] Simplified TinyBERT: Knowledge Distillation for Document Retrieval | Sep 16, 2020 | Document Ranking, Knowledge Distillation
[Code Available] Noisy Self-Knowledge Distillation for Text Summarization | Sep 15, 2020 | Knowledge Distillation, Self-Knowledge Distillation
[Code Available] Collaborative Distillation in the Parameter and Spectrum Domains for Video Action Recognition | Sep 15, 2020 | Action Recognition, Knowledge Distillation
[Unverified] Autoregressive Knowledge Distillation through Imitation Learning | Sep 15, 2020 | Imitation Learning, Knowledge Distillation
[Code Available] SSKD: Self-Supervised Knowledge Distillation for Cross Domain Adaptive Person Re-Identification | Sep 13, 2020 | Clustering, Domain Adaptive Person Re-Identification
[Unverified] BoostingBERT: Integrating Multi-Class Boosting into BERT for NLP Tasks | Sep 13, 2020 | Ensemble Learning, Knowledge Distillation
[Unverified] DualDE: Dually Distilling Knowledge Graph Embedding for Faster and Cheaper Reasoning | Sep 13, 2020 | Graph Embedding, Knowledge Distillation
[Unverified] Extending Label Smoothing Regularization with Self-Knowledge Distillation | Sep 11, 2020 | Knowledge Distillation, Self-Knowledge Distillation
[Unverified] On the Orthogonality of Knowledge Distillation with Other Techniques: From an Ensemble Perspective | Sep 9, 2020 | Data Augmentation, Efficient Neural Network
[Unverified] Simulating Unknown Target Models for Query-Efficient Black-box Attacks | Sep 2, 2020 | Knowledge Distillation, Meta-Learning
[Code Available] SAIL: Self-Augmented Graph Contrastive Learning | Sep 2, 2020 | Contrastive Learning, Knowledge Distillation
[Unverified] Lifelong Object Detection | Sep 2, 2020 | Knowledge Distillation, Lifelong learning
[Unverified] Classification of Diabetic Retinopathy Using Unlabeled Data and Knowledge Distillation | Sep 1, 2020 | Classification, General Classification
[Unverified] Semantics-aware Adaptive Knowledge Distillation for Sensor-to-Vision Action Recognition | Sep 1, 2020 | Action Recognition, Image Generation
[Code Available] Automatic Assignment of Radiology Examination Protocols Using Pre-trained Language Models with Knowledge Distillation | Sep 1, 2020 | Data Augmentation, Knowledge Distillation