- Segmentation with mixed supervision: Confidence maximization helps knowledge distillation (Sep 21, 2021) · Image Segmentation, Knowledge Distillation
- RAIL-KD: RAndom Intermediate Layer Mapping for Knowledge Distillation (Sep 21, 2021) · Knowledge Distillation · [code]
- Knowledge Distillation with Noisy Labels for Natural Language Understanding (Sep 21, 2021) · Knowledge Distillation, Natural Language Understanding
- Releasing Graph Neural Networks with Differential Privacy Guarantees (Sep 18, 2021) · Knowledge Distillation, Privacy Preserving
- Towards Full Utilization on Mask Task for Distilling PLMs into NMT (Sep 17, 2021) · Knowledge Distillation, Machine Translation · [code]
- Distilling Linguistic Context for Language Model Compression (Sep 17, 2021) · Knowledge Distillation, Language Modeling
- Label Assignment Distillation for Object Detection (Sep 16, 2021) · Knowledge Distillation, Object Detection · [code]
- The NiuTrans System for WNGT 2020 Efficiency Task (Sep 16, 2021) · Decoder, Knowledge Distillation
- The NiuTrans System for the WMT21 Efficiency Task (Sep 16, 2021) · GPU, Knowledge Distillation · [code]
- New Perspective on Progressive GANs Distillation for One-class Novelty Detection (Sep 15, 2021) · Decoder, Generative Adversarial Network · [code]
- EfficientBERT: Progressively Searching Multilayer Perceptron via Warm-up Knowledge Distillation (Sep 15, 2021) · Data Augmentation, Knowledge Distillation
- Secure Your Ride: Real-time Matching Success Rate Prediction for Passenger-Driver Pairs (Sep 14, 2021) · Decision Making, Knowledge Distillation · [code]
- Multi-Scale Aligned Distillation for Low-Resolution Detection (Sep 14, 2021) · Knowledge Distillation, Object Detection
- Multihop: Leveraging Complex Models to Learn Accurate Simple Models (Sep 14, 2021) · Explainable Artificial Intelligence, Knowledge Distillation · [code]
- A Note on Knowledge Distillation Loss Function for Object Classification (Sep 14, 2021) · Knowledge Distillation, Model Compression
- AligNART: Non-autoregressive Neural Machine Translation by Jointly Learning to Estimate Alignment and Translate (Sep 14, 2021) · Decoder, Knowledge Distillation
- UniMS: A Unified Framework for Multimodal Summarization with Knowledge Distillation (Sep 13, 2021) · Abstractive Text Summarization, Decoder
- How to Select One Among All? An Extensive Empirical Study Towards the Robustness of Knowledge Distillation in Natural Language Understanding (Sep 13, 2021) · Adversarial Robustness, All
- KroneckerBERT: Learning Kronecker Decomposition for Pre-trained Language Models via Knowledge Distillation (Sep 13, 2021) · Knowledge Distillation, Language Modeling · [code]
- On the Efficiency of Subclass Knowledge Distillation in Classification Tasks (Sep 12, 2021) · Binary Classification, Classification
- Federated Ensemble Model-based Reinforcement Learning in Edge Computing (Sep 12, 2021) · Autonomous Driving, Continuous Control
- Learning to Teach with Student Feedback (Sep 10, 2021) · Knowledge Distillation
- Towards Developing a Multilingual and Code-Mixed Visual Question Answering System by Knowledge Distillation (Sep 10, 2021) · Knowledge Distillation, Question Answering
- LibFewShot: A Comprehensive Library for Few-shot Learning (Sep 10, 2021) · Data Augmentation, Few-Shot Image Classification
- Dual Correction Strategy for Ranking Distillation in Top-N Recommender System (Sep 8, 2021) · Knowledge Distillation, Recommendation Systems · [code]
- Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution (Sep 7, 2021) · Image Classification · [code]
- Beyond Preserved Accuracy: Evaluating Loyalty and Robustness of BERT Compression (Sep 7, 2021) · Knowledge Distillation, Quantization · [code]
- Complementary Calibration: Boosting General Continual Learning with Collaborative Distillation and Self-Supervision (Sep 3, 2021) · Continual Learning, Contrastive Learning · [code]
- CAM-loss: Towards Learning Spatially Discriminative Feature Representations (Sep 3, 2021) · Few-Shot Learning, Image Classification · [code]
- Knowledge Distillation with BERT for Image Tag-Based Privacy Prediction (Sep 1, 2021) · Knowledge Distillation, TAG
- Decoupled Transformer for Scalable Inference in Open-domain Question Answering (Sep 1, 2021) · Knowledge Distillation, Machine Reading Comprehension
- Black-Box Attacks on Sequential Recommenders via Data-Free Model Extraction (Sep 1, 2021) · Data Poisoning, Knowledge Distillation
- Catastrophic Interference in Reinforcement Learning: A Solution Based on Context Division and Knowledge Distillation (Sep 1, 2021) · Deep Reinforcement Learning, General Reinforcement Learning · [code]
- FedKD: Communication Efficient Federated Learning via Knowledge Distillation (Aug 30, 2021) · Federated Learning, Knowledge Distillation · [code]
- Lipschitz Continuity Guided Knowledge Distillation (Aug 29, 2021) · Knowledge Distillation, Model Compression
- Distilling the Knowledge of Large-scale Generative Models into Retrieval Models for Efficient Open-domain Conversation (Aug 28, 2021) · Knowledge Distillation, Retrieval
- SIGN: Spatial-information Incorporated Generative Network for Generalized Zero-shot Semantic Segmentation (Aug 27, 2021) · Knowledge Distillation, Segmentation · [code]
- CoCo DistillNet: a Cross-layer Correlation Distillation Network for Pathological Gastric Cancer Segmentation (Aug 27, 2021) · Image Segmentation, Knowledge Distillation
- Efficient training of lightweight neural networks using Online Self-Acquired Knowledge Distillation (Aug 26, 2021) · Density Estimation, Knowledge Distillation
- Cross-category Video Highlight Detection via Set-based Learning (Aug 26, 2021) · Domain Adaptation, Highlight Detection
- PocketNet: Extreme Lightweight Face Recognition Network using Neural Architecture Search and Multi-Step Knowledge Distillation (Aug 24, 2021) · Face Recognition, Knowledge Distillation · [code]
- Deploying a BERT-based Query-Title Relevance Classifier in a Production System: a View from the Trenches (Aug 23, 2021) · CPU, Data Augmentation · [code]
- Efficient Medical Image Segmentation Based on Knowledge Distillation (Aug 23, 2021) · Image Segmentation, Knowledge Distillation
- Personalised Federated Learning: A Combinational Approach (Aug 22, 2021) · Federated Learning, Knowledge Distillation · [code]
- Supervised Compression for Resource-Constrained Edge Computing Systems (Aug 21, 2021) · Data Compression, Edge Computing
- Boosting of Head Pose Estimation by Knowledge Distillation (Aug 20, 2021) · Head Pose Estimation, Knowledge Distillation · [code]
- Revisiting Adversarial Robustness Distillation: Robust Soft Labels Make Student Better (Aug 18, 2021) · Adversarial Robustness, Knowledge Distillation
- Learning Conditional Knowledge Distillation for Degraded-Reference Image Quality Assessment (Aug 18, 2021) · Image Quality Assessment, Image Restoration · [code]
- BERT Learns to Teach: Knowledge Distillation with Meta Learning (Aug 17, 2021) · Knowledge Distillation, Meta-Learning · [code]
- G-DetKD: Towards General Distillation Framework for Object Detectors via Contrastive and Semantic-guided Feature Imitation (Aug 17, 2021) · Knowledge Distillation, Object Detection