Symbolic Knowledge Distillation: from General Language Models to Commonsense Models. Oct 14, 2021. Knowledge Distillation, Knowledge Graphs
FocusNet: Classifying Better by Focusing on Confusing Classes. Oct 14, 2021. Classification, Image Classification
Object DGCNN: 3D Object Detection using Dynamic Graphs. Oct 13, 2021. 2D Object Detection, 3D Object Detection
Towards Accurate Cross-Domain In-Bed Human Pose Estimation. Oct 7, 2021. Data Augmentation, Knowledge Distillation
KNOT: Knowledge Distillation using Optimal Transport for Solving NLP Tasks. Oct 6, 2021. Emotion Recognition, Emotion Recognition in Conversation
Prune Your Model Before Distill It. Sep 30, 2021. Knowledge Distillation, model
Multilingual AMR Parsing with Noisy Knowledge Distillation. Sep 30, 2021. AMR Parsing, Knowledge Distillation
Deep Structured Instance Graph for Distilling Object Detectors. Sep 27, 2021. Instance Segmentation, Knowledge Distillation
Dynamic Knowledge Distillation for Pre-trained Language Models. Sep 23, 2021. Knowledge Distillation
Segmentation with mixed supervision: Confidence maximization helps knowledge distillation. Sep 21, 2021. Image Segmentation, Knowledge Distillation
Distilling Linguistic Context for Language Model Compression. Sep 17, 2021. Knowledge Distillation, Language Modeling
The NiuTrans System for the WMT21 Efficiency Task. Sep 16, 2021. GPU, Knowledge Distillation
The NiuTrans System for WNGT 2020 Efficiency Task. Sep 16, 2021. Decoder, Knowledge Distillation
EfficientBERT: Progressively Searching Multilayer Perceptron via Warm-up Knowledge Distillation. Sep 15, 2021. Data Augmentation, Knowledge Distillation
Multi-Scale Aligned Distillation for Low-Resolution Detection. Sep 14, 2021. Knowledge Distillation, Object Detection
How to Select One Among All? An Extensive Empirical Study Towards the Robustness of Knowledge Distillation in Natural Language Understanding. Sep 13, 2021. Adversarial Robustness, All
Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution. Sep 7, 2021. Image Classification
Beyond Preserved Accuracy: Evaluating Loyalty and Robustness of BERT Compression. Sep 7, 2021. Knowledge Distillation, Quantization
Black-Box Attacks on Sequential Recommenders via Data-Free Model Extraction. Sep 1, 2021. Data Poisoning, Knowledge Distillation
Cross-category Video Highlight Detection via Set-based Learning. Aug 26, 2021. Domain Adaptation, Highlight Detection
PocketNet: Extreme Lightweight Face Recognition Network using Neural Architecture Search and Multi-Step Knowledge Distillation. Aug 24, 2021. Face Recognition, Knowledge Distillation
Efficient Medical Image Segmentation Based on Knowledge Distillation. Aug 23, 2021. Image Segmentation, Knowledge Distillation
Supervised Compression for Resource-Constrained Edge Computing Systems. Aug 21, 2021. Data Compression, Edge Computing
Learning Conditional Knowledge Distillation for Degraded-Reference Image Quality Assessment. Aug 18, 2021. Image Quality Assessment, Image Restoration
Revisiting Adversarial Robustness Distillation: Robust Soft Labels Make Student Better. Aug 18, 2021. Adversarial Robustness, Knowledge Distillation
AGKD-BML: Defense Against Adversarial Attack by Attention Guided Knowledge Distillation and Bi-directional Metric Learning. Aug 13, 2021. Adversarial Attack, Adversarial Robustness
Distilling Holistic Knowledge with Graph Neural Networks. Aug 12, 2021. Knowledge Distillation
Transferring Knowledge Distillation for Multilingual Social Event Detection. Aug 6, 2021. Cross-Lingual Word Embeddings, Event Detection
Knowledge Distillation from BERT Transformer to Speech Transformer for Intent Classification. Aug 5, 2021. Automatic Speech Recognition (ASR)
Learning Compatible Embeddings. Aug 4, 2021. Knowledge Distillation, Retrieval
Online Knowledge Distillation for Efficient Pose Estimation. Aug 4, 2021. Knowledge Distillation, Pose Estimation
Hierarchical Self-supervised Augmented Knowledge Distillation. Jul 29, 2021. Knowledge Distillation, Representation Learning
Consensual Collaborative Training And Knowledge Distillation Based Facial Expression Recognition Under Noisy Annotations. Jul 10, 2021. Facial Expression Recognition (FER)
Categorical Relation-Preserving Contrastive Knowledge Distillation for Medical Image Classification. Jul 7, 2021. Classification, Image Classification
VidLanKD: Improving Language Understanding via Video-Distilled Knowledge Transfer. Jul 6, 2021. Image Retrieval, Knowledge Distillation
Split-and-Bridge: Adaptable Class Incremental Learning within a Single Neural Network. Jul 3, 2021. Class Incremental Learning
Learning Efficient Vision Transformers via Fine-Grained Manifold Distillation. Jul 3, 2021. Knowledge Distillation, Model Compression
DnS: Distill-and-Select for Efficient and Accurate Video Indexing and Retrieval. Jun 24, 2021. Computational Efficiency, Knowledge Distillation
SSUL: Semantic Segmentation with Unknown Label for Exemplar-based Class-Incremental Learning. Jun 22, 2021. Class Incremental Learning
Structured Sparse R-CNN for Direct Scene Graph Generation. Jun 21, 2021. Graph Construction, Graph Generation
Context-Aware Image Inpainting with Learned Semantic Priors. Jun 14, 2021. Image Inpainting, Knowledge Distillation
Does Knowledge Distillation Really Work? Jun 10, 2021. Knowledge Distillation
Distilling Image Classifiers in Object Detectors. Jun 9, 2021. Knowledge Distillation, Object
BERT Learns to Teach: Knowledge Distillation with Meta Learning. Jun 8, 2021. Knowledge Distillation, Meta-Learning
XtremeDistilTransformers: Task Transfer for Task-agnostic Distillation. Jun 8, 2021. Knowledge Distillation, NER
Zero-Shot Knowledge Distillation from a Decision-Based Black-Box Model. Jun 7, 2021. Knowledge Distillation
Preservation of the Global Knowledge by Not-True Distillation in Federated Learning. Jun 6, 2021. Continual Learning, Federated Learning
Bidirectional Distillation for Top-K Recommender System. Jun 5, 2021. Knowledge Distillation, Model Compression
Towards Quantifiable Dialogue Coherence Evaluation. Jun 1, 2021. Coherence Evaluation, Dialogue Evaluation
Transformer-Based Source-Free Domain Adaptation. May 28, 2021. Domain Adaptation, Knowledge Distillation