One General Teacher for Multi-Data Multi-Task: A New Knowledge Distillation Framework for Discourse Relation Analysis | Nov 16, 2021 | Knowledge Distillation, Multi-Task Learning | Unverified
Self-Distilled Pruning of Neural Networks | Nov 16, 2021 | Knowledge Distillation, Language Modeling | Unverified
Making Small Language Models Better Few-Shot Learners | Nov 16, 2021 | Few-Shot Learning, Knowledge Distillation | Unverified
Feature Structure Distillation for BERT Transferring | Nov 16, 2021 | Knowledge Distillation | Unverified
Multi-stage Distillation Framework for Cross-Lingual Semantic Similarity Matching | Nov 16, 2021 | Contrastive Learning, Knowledge Distillation | Unverified
Learning to Teach with Student Feedback | Nov 16, 2021 | Knowledge Distillation | Unverified
Sparse Progressive Distillation: Resolving Overfitting under Pretrain-and-Finetune Paradigm | Nov 16, 2021 | Knowledge Distillation | Unverified
Aligned Weight Regularizers for Pruning Pretrained Neural Networks | Nov 16, 2021 | Knowledge Distillation, Language Modeling | Unverified
NVIDIA NeMo Neural Machine Translation Systems for English-German and English-Russian News and Biomedical Tasks at WMT21 | Nov 16, 2021 | Data Augmentation, Knowledge Distillation | Unverified
Synthetic Unknown Class Learning for Learning Unknowns | Nov 15, 2021 | Diversity, Knowledge Distillation | Unverified
Robust and Accurate Object Detection via Self-Knowledge Distillation | Nov 14, 2021 | Adversarial Robustness, Knowledge Distillation | Code Available
Facial Landmark Points Detection Using Knowledge Distillation-Based Neural Networks | Nov 13, 2021 | Face Alignment, Facial Landmark Detection | Code Available
Learning Interpretation with Explainable Knowledge Distillation | Nov 12, 2021 | Knowledge Distillation, Model Compression | Unverified
Domain Generalization on Efficient Acoustic Scene Classification using Residual Normalization | Nov 12, 2021 | Acoustic Scene Classification, Classification | Unverified
Incremental Meta-Learning via Episodic Replay Distillation for Few-Shot Image Recognition | Nov 9, 2021 | Continual Learning, Knowledge Distillation | Code Available
On Representation Knowledge Distillation for Graph Neural Networks | Nov 9, 2021 | Contrastive Learning, Knowledge Distillation | Code Available
A Survey on Green Deep Learning | Nov 8, 2021 | Deep Learning, Knowledge Distillation | Unverified
Class Token and Knowledge Distillation for Multi-head Self-Attention Speaker Verification Systems | Nov 6, 2021 | Knowledge Distillation, Philosophy | Unverified
Oracle Teacher: Leveraging Target Information for Better Knowledge Distillation of CTC Models | Nov 5, 2021 | Knowledge Distillation, Machine Translation | Unverified
Visualizing the Emergence of Intermediate Visual Patterns in DNNs | Nov 5, 2021 | Knowledge Distillation | Unverified
DVFL: A Vertical Federated Learning Method for Dynamic Data | Nov 5, 2021 | Federated Learning, Knowledge Distillation | Unverified
AUTOKD: Automatic Knowledge Distillation Into A Student Architecture Family | Nov 5, 2021 | Bayesian Optimization, Knowledge Distillation | Unverified
A methodology for training homomorphic encryption friendly neural networks | Nov 5, 2021 | Knowledge Distillation, Privacy Preserving | Unverified
Leveraging Advantages of Interactive and Non-Interactive Models for Vector-Based Cross-Lingual Information Retrieval | Nov 3, 2021 | Computational Efficiency, Cross-Lingual Information Retrieval | Unverified
LTD: Low Temperature Distillation for Robust Adversarial Training | Nov 3, 2021 | Knowledge Distillation | Unverified
Knowledge Cross-Distillation for Membership Privacy | Nov 2, 2021 | Inference Attack, Knowledge Distillation | Unverified
The LMU Munich System for the WMT 2021 Large-Scale Multilingual Machine Translation Shared Task | Nov 1, 2021 | Data Augmentation, Knowledge Distillation | Unverified
The NiuTrans System for the WMT 2021 Efficiency Task | Nov 1, 2021 | GPU, Knowledge Distillation | Unverified
NVIDIA NeMo’s Neural Machine Translation Systems for English-German and English-Russian News and Biomedical Tasks at WMT21 | Nov 1, 2021 | Data Augmentation, Knowledge Distillation | Unverified
Papago’s Submission for the WMT21 Quality Estimation Shared Task | Nov 1, 2021 | Knowledge Distillation, Multi-Task Learning | Unverified
The Mininglamp Machine Translation System for WMT21 | Nov 1, 2021 | Knowledge Distillation, Machine Translation | Unverified
HW-TSC’s Participation in the WMT 2021 News Translation Shared Task | Nov 1, 2021 | de-en, Knowledge Distillation | Unverified
HW-TSC’s Participation in the WMT 2021 Large-Scale Multilingual Translation Task | Nov 1, 2021 | Knowledge Distillation, Translation | Unverified
TenTrans Large-Scale Multilingual Machine Translation System for WMT21 | Nov 1, 2021 | Knowledge Distillation, Machine Translation | Unverified
Efficient Machine Translation with Model Pruning and Quantization | Nov 1, 2021 | CPU, Decoder | Unverified
AUTOSUMM: Automatic Model Creation for Text Summarization | Nov 1, 2021 | Abstractive Text Summarization, Deep Learning | Unverified
Students Who Study Together Learn Better: On the Importance of Collective Knowledge Distillation for Domain Transfer in Fact Verification | Nov 1, 2021 | Fact Verification, Knowledge Distillation | Unverified
Universal-KD: Attention-based Output-Grounded Intermediate Layer Knowledge Distillation | Nov 1, 2021 | Knowledge Distillation | Unverified
Exploring Non-Autoregressive Text Style Transfer | Nov 1, 2021 | Contrastive Learning, Knowledge Distillation | Code Available
Collaborative Learning of Bidirectional Decoders for Unsupervised Text Style Transfer | Nov 1, 2021 | Attribute, Decoder | Code Available
deepQuest-py: Large and Distilled Models for Quality Estimation | Nov 1, 2021 | Knowledge Distillation, Sentence | Code Available
PDALN: Progressive Domain Adaptation over a Pre-trained Model for Low-Resource Cross-Domain Named Entity Recognition | Nov 1, 2021 | Cross-Domain Named Entity Recognition, Data Augmentation | Unverified
Domain-Lifelong Learning for Dialogue State Tracking via Knowledge Preservation Networks | Nov 1, 2021 | Dialogue State Tracking, Diversity | Code Available
GAML-BERT: Improving BERT Early Exiting by Gradient Aligned Mutual Learning | Nov 1, 2021 | Knowledge Distillation | Unverified
Improving Stance Detection with Multi-Dataset Learning and Knowledge Distillation | Nov 1, 2021 | Knowledge Distillation, Stance Detection | Code Available
Mutual-Learning Improves End-to-End Speech Translation | Nov 1, 2021 | Knowledge Distillation, Machine Translation | Unverified
RW-KD: Sample-wise Loss Terms Re-Weighting for Knowledge Distillation | Nov 1, 2021 | Knowledge Distillation | Unverified
Combining Curriculum Learning and Knowledge Distillation for Dialogue Generation | Nov 1, 2021 | Dialogue Generation, Knowledge Distillation | Unverified
Distilling Knowledge for Empathy Detection | Nov 1, 2021 | Knowledge Distillation | Code Available
Multilingual Neural Machine Translation: Can Linguistic Hierarchies Help? | Nov 1, 2021 | Knowledge Distillation, Machine Translation | Unverified
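Nearly every entry above is tagged Knowledge Distillation, and most build on the same soft-target objective, so a minimal sketch of that common baseline may help orient readers. This is an illustrative PyTorch rendering of the classic temperature-scaled distillation loss, not the method of any specific paper listed; the function name and default hyperparameters are assumptions.

```python
# Minimal sketch of the soft-target distillation objective that the
# Knowledge Distillation tag above refers to. Illustrative only.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Blend a soft-target KL term (teacher -> student) with the usual
    hard-label cross-entropy. `temperature` softens both distributions;
    `alpha` weights the two terms. Both names are assumed, not standard."""
    # Soft targets: KL divergence between temperature-scaled distributions.
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    # Hard targets: standard cross-entropy against the gold labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Example usage with random logits (batch of 8, 10 classes):
# s = torch.randn(8, 10); t = torch.randn(8, 10); y = torch.randint(0, 10, (8,))
# loss = distillation_loss(s, t.detach(), y)
```

Many of the listed papers vary exactly these knobs, e.g. the temperature in "LTD: Low Temperature Distillation for Robust Adversarial Training" or per-sample weighting of the two loss terms in "RW-KD: Sample-wise Loss Terms Re-Weighting for Knowledge Distillation".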