Neural Architecture Search via Ensemble-based Knowledge Distillation (Sep 29, 2021). Tags: Diversity, Knowledge Distillation.
Feature Kernel Distillation (Sep 29, 2021). Tags: Image Classification. [Unverified]
Pseudo Knowledge Distillation: Towards Learning Optimal Instance-specific Label Smoothing Regularization (Sep 29, 2021). Tags: Image Classification. [Unverified]
Prototypical Contrastive Predictive Coding (Sep 29, 2021). Tags: Contrastive Learning, Knowledge Distillation. [Unverified]
Improving Question Answering Performance Using Knowledge Distillation and Active Learning (Sep 26, 2021). Tags: Active Learning, Knowledge Distillation. [Unverified]
Partial to Whole Knowledge Distillation: Progressive Distilling Decomposed Knowledge Boosts Student Better (Sep 26, 2021). Tags: Knowledge Distillation. [Code available]
Recent Advances of Continual Learning in Computer Vision: An Overview (Sep 23, 2021). Tags: Continual Learning, Knowledge Distillation. [Unverified]
KD-VLP: Improving End-to-End Vision-and-Language Pretraining with Object Knowledge Distillation (Sep 22, 2021). Tags: Cross-Modal Alignment, Knowledge Distillation. [Unverified]
The NiuTrans Machine Translation Systems for WMT21 (Sep 22, 2021). Tags: Knowledge Distillation, Machine Translation. [Code available]
K-AID: Enhancing Pre-trained Language Models with Domain Knowledge for Question Answering (Sep 22, 2021). Tags: CPU, Knowledge Distillation. [Unverified]
Low-Latency Incremental Text-to-Speech Synthesis with Distilled Context Prediction Network (Sep 22, 2021). Tags: Knowledge Distillation, Language Modeling. [Unverified]
Knowledge Distillation with Noisy Labels for Natural Language Understanding (Sep 21, 2021). Tags: Knowledge Distillation, Natural Language Understanding. [Unverified]
RAIL-KD: RAndom Intermediate Layer Mapping for Knowledge Distillation (Sep 21, 2021). Tags: Knowledge Distillation. [Unverified]
Releasing Graph Neural Networks with Differential Privacy Guarantees (Sep 18, 2021). Tags: Knowledge Distillation, Privacy Preserving. [Unverified]
Towards Full Utilization on Mask Task for Distilling PLMs into NMT (Sep 17, 2021). Tags: Knowledge Distillation, Machine Translation. [Code available]
Label Assignment Distillation for Object Detection (Sep 16, 2021). Tags: Knowledge Distillation, Object Detection. [Unverified]
New Perspective on Progressive GANs Distillation for One-class Novelty Detection (Sep 15, 2021). Tags: Decoder, Generative Adversarial Network. [Unverified]
AligNART: Non-autoregressive Neural Machine Translation by Jointly Learning to Estimate Alignment and Translate (Sep 14, 2021). Tags: Decoder, Knowledge Distillation. [Unverified]
Multihop: Leveraging Complex Models to Learn Accurate Simple Models (Sep 14, 2021). Tags: Explainable Artificial Intelligence, Knowledge Distillation. [Unverified]
A Note on Knowledge Distillation Loss Function for Object Classification (Sep 14, 2021). Tags: Knowledge Distillation, Model Compression. [Unverified]
Secure Your Ride: Real-time Matching Success Rate Prediction for Passenger-Driver Pairs (Sep 14, 2021). Tags: Decision Making, Knowledge Distillation. [Unverified]
UniMS: A Unified Framework for Multimodal Summarization with Knowledge Distillation (Sep 13, 2021). Tags: Abstractive Text Summarization, Decoder. [Unverified]
KroneckerBERT: Learning Kronecker Decomposition for Pre-trained Language Models via Knowledge Distillation (Sep 13, 2021). Tags: Knowledge Distillation, Language Modeling. [Unverified]
On the Efficiency of Subclass Knowledge Distillation in Classification Tasks (Sep 12, 2021). Tags: Binary Classification, Classification. [Unverified]
Federated Ensemble Model-based Reinforcement Learning in Edge Computing (Sep 12, 2021). Tags: Autonomous Driving, Continuous Control. [Unverified]
Towards Developing a Multilingual and Code-Mixed Visual Question Answering System by Knowledge Distillation (Sep 10, 2021). Tags: Knowledge Distillation, Question Answering. [Unverified]
Learning to Teach with Student Feedback (Sep 10, 2021). Tags: Knowledge Distillation. [Unverified]
Dual Correction Strategy for Ranking Distillation in Top-N Recommender System (Sep 8, 2021). Tags: Knowledge Distillation, Recommendation Systems. [Unverified]
CAM-loss: Towards Learning Spatially Discriminative Feature Representations (Sep 3, 2021). Tags: Few-Shot Learning, Image Classification. [Code available]
Complementary Calibration: Boosting General Continual Learning with Collaborative Distillation and Self-Supervision (Sep 3, 2021). Tags: Continual Learning, Contrastive Learning. [Unverified]
Decoupled Transformer for Scalable Inference in Open-domain Question Answering (Sep 1, 2021). Tags: Knowledge Distillation, Machine Reading Comprehension. [Code available]
Catastrophic Interference in Reinforcement Learning: A Solution Based on Context Division and Knowledge Distillation (Sep 1, 2021). Tags: Deep Reinforcement Learning, General Reinforcement Learning. [Unverified]
Knowledge Distillation with BERT for Image Tag-Based Privacy Prediction (Sep 1, 2021). Tags: Knowledge Distillation, TAG. [Code available]
FedKD: Communication Efficient Federated Learning via Knowledge Distillation (Aug 30, 2021). Tags: Federated Learning, Knowledge Distillation. [Unverified]
Lipschitz Continuity Guided Knowledge Distillation (Aug 29, 2021). Tags: Knowledge Distillation, Model Compression. [Unverified]
Distilling the Knowledge of Large-scale Generative Models into Retrieval Models for Efficient Open-domain Conversation (Aug 28, 2021). Tags: Knowledge Distillation, Retrieval. [Unverified]
CoCo DistillNet: a Cross-layer Correlation Distillation Network for Pathological Gastric Cancer Segmentation (Aug 27, 2021). Tags: Image Segmentation, Knowledge Distillation. [Code available]
SIGN: Spatial-information Incorporated Generative Network for Generalized Zero-shot Semantic Segmentation (Aug 27, 2021). Tags: Knowledge Distillation, Segmentation. [Unverified]
Efficient training of lightweight neural networks using Online Self-Acquired Knowledge Distillation (Aug 26, 2021). Tags: Density Estimation, Knowledge Distillation. [Unverified]
Deploying a BERT-based Query-Title Relevance Classifier in a Production System: a View from the Trenches (Aug 23, 2021). Tags: CPU, Data Augmentation. [Unverified]
Personalised Federated Learning: A Combinational Approach (Aug 22, 2021). Tags: Federated Learning, Knowledge Distillation. [Unverified]
Boosting of Head Pose Estimation by Knowledge Distillation (Aug 20, 2021). Tags: Head Pose Estimation, Knowledge Distillation. [Unverified]
G-DetKD: Towards General Distillation Framework for Object Detectors via Contrastive and Semantic-guided Feature Imitation (Aug 17, 2021). Tags: Knowledge Distillation, Object Detection. [Unverified]
BERT Learns to Teach: Knowledge Distillation with Meta Learning (Aug 17, 2021). Tags: Knowledge Distillation, Meta-Learning. [Unverified]
Online Continual Learning For Visual Food Classification (Aug 15, 2021). Tags: Classification, Continual Learning. [Unverified]
Multi-granularity for knowledge distillation (Aug 15, 2021). Tags: Knowledge Distillation, Person Re-Identification. [Unverified]
PAIR: Leveraging Passage-Centric Similarity Relation for Improving Dense Passage Retrieval (Aug 13, 2021). Tags: Knowledge Distillation, Natural Questions. [Code available]
Learning from Matured Dumb Teacher for Fine Generalization (Aug 12, 2021). Tags: Image Classification. [Unverified]
Preventing Catastrophic Forgetting and Distribution Mismatch in Knowledge Distillation via Synthetic Data (Aug 11, 2021). Tags: Knowledge Distillation, Model Compression. [Unverified]
Lifelong Intent Detection via Multi-Strategy Rebalancing (Aug 10, 2021). Tags: Intent Detection, Knowledge Distillation. [Unverified]