- Federated Knowledge Distillation (Nov 4, 2020). Tags: Federated Learning, Knowledge Distillation.
- Paralinguistic Privacy Protection at the Edge (Nov 4, 2020) [code available]. Tags: CPU, Knowledge Distillation.
- On Self-Distilling Graph Neural Network (Nov 4, 2020). Tags: Graph Embedding, Graph Neural Network.
- Channel Planting for Deep Neural Networks using Knowledge Distillation (Nov 4, 2020). Tags: Knowledge Distillation, Network Pruning.
- Domain Adaptive Knowledge Distillation for Driving Scene Semantic Segmentation (Nov 3, 2020). Tags: Autonomous Driving, Knowledge Distillation.
- A Comprehensive Study of Class Incremental Learning Algorithms for Visual Tasks (Nov 3, 2020) [code available]. Tags: Class Incremental Learning.
- Distilling Knowledge by Mimicking Features (Nov 3, 2020). Tags: Knowledge Distillation, Object Detection.
- Data-free Knowledge Distillation for Segmentation using Data-Enriching GAN (Nov 2, 2020) [code available]. Tags: Data-free Knowledge Distillation, Diversity.
- Learning to Maximize Speech Quality Directly Using MOS Prediction for Neural Text-to-Speech (Nov 2, 2020) [code available]. Tags: Knowledge Distillation, Speech Synthesis.
- HW-TSC’s Participation in the WMT 2020 News Translation Shared Task (Nov 1, 2020). Tags: Knowledge Distillation, Translation.
- The NiuTrans Machine Translation Systems for WMT20 (Nov 1, 2020). Tags: Knowledge Distillation, Machine Translation.
- IIE’s Neural Machine Translation Systems for WMT20 (Nov 1, 2020). Tags: Domain Adaptation, Knowledge Distillation.
- High Performance Natural Language Processing (Nov 1, 2020). Tags: Knowledge Distillation, Quantization.
- Fast End-to-end Coreference Resolution for Korean (Nov 1, 2020). Tags: Coreference Resolution.
- Using the Past Knowledge to Improve Sentiment Classification (Nov 1, 2020). Tags: Classification, Knowledge Distillation.
- FedED: Federated Learning via Ensemble Distillation for Medical Relation Extraction (Nov 1, 2020). Tags: Federated Learning, Knowledge Distillation.
- Bridging the Gap between Prior and Posterior Knowledge Selection for Knowledge-Grounded Dialogue Generation (Nov 1, 2020). Tags: Decoder, Dialogue Generation.
- Distilling Structured Knowledge for Text-Based Relational Reasoning (Nov 1, 2020). Tags: Contrastive Learning, Knowledge Distillation.
- MixKD: Towards Efficient Distillation of Large-scale Language Models (Nov 1, 2020). Tags: Data Augmentation, Knowledge Distillation.
- ProxylessKD: Direct Knowledge Distillation with Inherited Classifier for Face Recognition (Oct 31, 2020). Tags: Face Recognition, Knowledge Distillation.
- Cross-lingual Machine Reading Comprehension with Language Branch Knowledge Distillation (Oct 27, 2020). Tags: Knowledge Distillation, Machine Reading Comprehension.
- Activation Map Adaptation for Effective Knowledge Distillation (Oct 26, 2020). Tags: Knowledge Distillation, Model Compression.
- FastFormers: Highly Efficient Transformer Models for Natural Language Understanding (Oct 26, 2020). Tags: CPU, GPU.
- Empowering Knowledge Distillation via Open Set Recognition for Robust 3D Point Cloud Classification (Oct 25, 2020) [code available]. Tags: 3D Point Cloud Classification, General Classification.
- Two-stage Textual Knowledge Distillation for End-to-End Spoken Language Understanding (Oct 25, 2020). Tags: Automatic Speech Recognition (ASR).
- Multi-Task Learning with Shared Encoder for Non-Autoregressive Machine Translation (Oct 24, 2020) [code available]. Tags: Knowledge Distillation, Machine Translation.
- Pre-trained Summarization Distillation (Oct 24, 2020) [code available]. Tags: Knowledge Distillation, Machine Translation.
- Improved Synthetic Training for Reading Comprehension (Oct 24, 2020) [code available]. Tags: Knowledge Distillation, Machine Reading Comprehension.
- Iterative Graph Self-Distillation (Oct 23, 2020). Tags: Contrastive Learning, Graph Learning.
- Generating Long Financial Report using Conditional Variational Autoencoders with Knowledge Distillation (Oct 23, 2020). Tags: Decoder, Knowledge Distillation.
- Distilling Dense Representations for Ranking using Tightly-Coupled Teachers (Oct 22, 2020). Tags: Knowledge Distillation.
- Knowledge Distillation for BERT Unsupervised Domain Adaptation (Oct 22, 2020) [code available]. Tags: Domain Adaptation, General Classification.
- Knowledge Distillation for Improved Accuracy in Spoken Question Answering (Oct 21, 2020) [code available]. Tags: Automatic Speech Recognition (ASR).
- Contextualized Attention-based Knowledge Transfer for Spoken Conversational Question Answering (Oct 21, 2020). Tags: Audio Signal Processing, Conversational Question Answering.
- Fast Video Salient Object Detection via Spatiotemporal Knowledge Distillation (Oct 20, 2020). Tags: Knowledge Distillation, Object.
- Knowledge Distillation in Wide Neural Networks: Risk Bound, Data Efficiency and Imperfect Teacher (Oct 20, 2020). Tags: Knowledge Distillation, Model Compression.
- Edge Bias in Federated Learning and its Solution by Buffered Knowledge Distillation (Oct 20, 2020). Tags: Federated Learning, Knowledge Distillation.
- Noisy Neural Network Compression for Analog Storage Devices (Oct 19, 2020). Tags: Knowledge Distillation, Model Compression.
- Comparing Fisher Information Regularization with Distillation for DNN Quantization (Oct 19, 2020). Tags: Knowledge Distillation, Quantization.
- Infusing Sequential Information into Conditional Masked Translation Model with Self-Review Mechanism (Oct 19, 2020). Tags: Decoder, Knowledge Distillation.
- Reducing the Teacher-Student Gap via Spherical Knowledge Distillation (Oct 15, 2020) [code available]. Tags: Knowledge Distillation.
- Task Decoupled Knowledge Distillation For Lightweight Face Detectors (Oct 14, 2020) [code available]. Tags: Face Detection, Knowledge Distillation.
- AutoADR: Automatic Model Design for Ad Relevance (Oct 14, 2020) [code available]. Tags: AutoML, Knowledge Distillation.
- MulDE: Multi-teacher Knowledge Distillation for Low-dimensional Knowledge Graph Embeddings (Oct 14, 2020). Tags: Graph Embedding, Knowledge Distillation.
- Dual-mode ASR: Unify and Improve Streaming ASR with Full-context Modeling (Oct 12, 2020). Tags: Automatic Speech Recognition (ASR).
- Collective Wisdom: Improving Low-resource Neural Machine Translation using Adaptive Knowledge Distillation (Oct 12, 2020). Tags: Knowledge Distillation, Low Resource Neural Machine Translation.
- Structural Knowledge Distillation: Tractably Distilling Information for Structured Predictor (Oct 10, 2020). Tags: Dependency Parsing, Knowledge Distillation.
- Adversarial Self-Supervised Data-Free Distillation for Text Classification (Oct 10, 2020) [code available]. Tags: Classification, General Classification.
- Distilling a Deep Neural Network into a Takagi-Sugeno-Kang Fuzzy Inference System (Oct 10, 2020). Tags: General Classification, Knowledge Distillation.
- Locally Linear Region Knowledge Distillation (Oct 9, 2020). Tags: Knowledge Distillation.
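Almost every entry in this list builds on the same basic objective: a student network is trained to match the temperature-softened outputs of a teacher in addition to the ground-truth labels. As background for reading these titles, here is a minimal NumPy sketch of that classic distillation loss (Hinton-style); the function name, temperature `T`, and mixing weight `alpha` are illustrative defaults, not values taken from any paper listed above.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Classic distillation objective:
    alpha * CE(student, hard labels)
    + (1 - alpha) * T^2 * KL(teacher_soft || student_soft)."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    # KL divergence between softened teacher and student distributions
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    # Standard cross-entropy on hard labels (T = 1)
    probs = softmax(student_logits)
    hard = -np.log(probs[np.arange(len(labels)), labels] + 1e-12)
    return float(np.mean(alpha * hard + (1 - alpha) * (T ** 2) * kl))
```

The `T ** 2` factor keeps the gradient magnitude of the soft term roughly independent of the chosen temperature, so `alpha` stays meaningful when `T` is tuned.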