Language Model Prior for Low-Resource Neural Machine Translation (Apr 30, 2020): Knowledge Distillation, Language Modeling [code]
Distilling Knowledge from Refinement in Multiple Instance Detection Networks (Apr 23, 2020): Knowledge Distillation, Multiple Instance Learning [code]
Making Monolingual Sentence Embeddings Multilingual using Knowledge Distillation (Apr 21, 2020): Knowledge Distillation, Sentence [code]
Role-Wise Data Augmentation for Knowledge Distillation (Apr 19, 2020): Data Augmentation, Knowledge Distillation [code]
Triplet Loss for Knowledge Distillation (Apr 17, 2020): Knowledge Distillation, Metric Learning [code]
Multimodal and multiview distillation for real-time player detection on a football field (Apr 16, 2020): Data Augmentation, Knowledge Distillation [code]
Dark Experience for General Continual Learning: a Strong, Simple Baseline (Apr 15, 2020): Class Incremental Learning [code]
Inter-Region Affinity Distillation for Road Marking Segmentation (Apr 11, 2020): Knowledge Distillation, Lane Detection [code]
KD-MRI: A knowledge distillation framework for image reconstruction and image restoration in MRI workflow (Apr 11, 2020): CPU, GPU [code]
Structure-Level Knowledge Distillation For Multilingual Sequence Labeling (Apr 8, 2020): Aspect Extraction, Knowledge Distillation [code]
On the Effect of Dropping Layers of Pre-trained Transformer Models (Apr 8, 2020): Knowledge Distillation, Sentence [code]
Towards Efficient Unconstrained Palmprint Recognition via Deep Distillation Hashing (Apr 7, 2020): Knowledge Distillation [code]
Temporally Distributed Networks for Fast Video Semantic Segmentation (Apr 3, 2020): Knowledge Distillation, Real-Time Semantic Segmentation [code]
More Grounded Image Captioning by Distilling Image-Text Matching Model (Apr 1, 2020): Image Captioning, Image-Text Matching [code]
Creating Something from Nothing: Unsupervised Knowledge Distillation for Cross-Modal Hashing (Apr 1, 2020): Knowledge Distillation, Retrieval [code]
Regularizing Class-wise Predictions via Self-knowledge Distillation (Mar 31, 2020): Image Classification [code]
Neural Networks Are More Productive Teachers Than Human Raters: Active Mixup for Data-Efficient Knowledge Distillation from a Blackbox Model (Mar 31, 2020): Active Learning, Knowledge Distillation [code]
Distilled Semantics for Comprehensive Scene Understanding from Videos (Mar 31, 2020): Depth Estimation, Knowledge Distillation [code]
Circumventing Outliers of AutoAugment with Knowledge Distillation (Mar 25, 2020): Data Augmentation, General Classification [code]
Distilling Knowledge from Graph Convolutional Networks (Mar 23, 2020): Knowledge Distillation, Transfer Learning [code]
Collaborative Distillation for Ultra-Resolution Universal Style Transfer (Mar 18, 2020): Decoder, GPU [code]
Incremental Object Detection via Meta-Learning (Mar 17, 2020): Incremental Learning, Knowledge Distillation [code]
Deformation Flow Based Two-Stream Network for Lip Reading (Mar 12, 2020): Knowledge Distillation, Lipreading [code]
SuperMix: Supervising the Mixing Data Augmentation (Mar 10, 2020): Data Augmentation, General Classification [code]
Faster ILOD: Incremental Learning for Object Detectors based on Faster RCNN (Mar 9, 2020): Incremental Learning, Knowledge Distillation [code]
PoseNet3D: Learning Temporally Consistent 3D Human Pose via Knowledge Distillation (Mar 7, 2020): 3D Human Pose Estimation, Knowledge Distillation [code]
Efficient Semantic Video Segmentation with Per-frame Inference (Feb 26, 2020): Knowledge Distillation, Optical Flow Estimation [code]
Knapsack Pruning with Inner Distillation (Feb 19, 2020): GPU, Knowledge Distillation [code]
Salvaging Federated Learning by Local Adaptation (Feb 12, 2020): Federated Learning, Knowledge Distillation [code]
Knowledge Distillation for Brain Tumor Segmentation (Feb 10, 2020): Brain Tumor Segmentation, Knowledge Distillation [code]
SUOD: Toward Scalable Unsupervised Outlier Detection (Feb 8, 2020): Knowledge Distillation, Outlier Detection [code]
BERT-of-Theseus: Compressing BERT by Progressive Module Replacing (Feb 7, 2020): Knowledge Distillation, Model Compression [code]
Learning From Multiple Experts: Self-paced Knowledge Distillation for Long-tailed Classification (Jan 6, 2020): General Classification, Knowledge Distillation [code]
Unpaired Multi-modal Segmentation via Knowledge Distillation (Jan 6, 2020): Image Segmentation, Knowledge Distillation [code]
Blockwisely Supervised Neural Architecture Search with Knowledge Distillation (Nov 29, 2019): Knowledge Distillation, Neural Architecture Search [code]
Go From the General to the Particular: Multi-Domain Translation with Domain Transformation Networks (Nov 22, 2019): Decoder, General Knowledge [code]
Preparing Lessons: Improve Knowledge Distillation with Better Supervision (Nov 18, 2019): Knowledge Distillation [code]
Maintaining Discrimination and Fairness in Class Incremental Learning (Nov 16, 2019): Class Incremental Learning [code]
Learning from a Teacher using Unlabeled Data (Nov 13, 2019): Knowledge Distillation, Model Compression [code]
Data Diversification: A Simple Strategy For Neural Machine Translation (Nov 5, 2019): Knowledge Distillation, Machine Translation [code]
Contrastive Representation Distillation (Oct 23, 2019): Contrastive Learning, Knowledge Distillation [code]
FedMD: Heterogenous Federated Learning via Model Distillation (Oct 8, 2019): Federated Learning, Knowledge Distillation [code]
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter (Oct 2, 2019): Hate Speech Detection, Knowledge Distillation [code]
Distilled Split Deep Neural Networks for Edge-Assisted Real-Time Systems (Oct 1, 2019): Edge Computing, Image Classification [code]
Distillation-Based Training for Multi-Exit Architectures (Oct 1, 2019): Knowledge Distillation [code]
Improved Techniques for Training Adaptive Deep Networks (Aug 17, 2019): Computational Efficiency, Knowledge Distillation [code]
When Does Label Smoothing Help? (Jun 6, 2019): Image Classification [code]
Adversarially Robust Distillation (May 23, 2019): Adversarial Robustness, Knowledge Distillation [code]
Knowledge Distillation via Route Constrained Optimization (Apr 19, 2019): Face Recognition, Knowledge Distillation [code]
Fast Neural Architecture Search of Compact Semantic Segmentation Models via Auxiliary Cells (Oct 25, 2018): Depth Estimation, Depth Prediction [code]
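Nearly every paper above builds on the same starting point: the soft-target distillation loss of Hinton et al. (2015), which trains a student network to match a teacher's temperature-softened output distribution while still fitting the hard labels. For readers new to the topic, here is a minimal NumPy sketch of that loss; all function and variable names are illustrative and not taken from any listed paper.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax, stabilized by subtracting the row max."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft term: cross-entropy between teacher and student distributions at
    # temperature T, scaled by T^2 so its gradients stay comparable to the
    # hard-label term (as in Hinton et al., 2015).
    p_teacher = softmax(teacher_logits, T)
    log_p_student = np.log(softmax(student_logits, T))
    soft = -(p_teacher * log_p_student).sum(axis=-1).mean() * (T ** 2)
    # Hard term: standard cross-entropy with the ground-truth labels.
    log_p = np.log(softmax(student_logits))
    hard = -log_p[np.arange(len(labels)), labels].mean()
    return alpha * soft + (1.0 - alpha) * hard

# Toy usage: batch of 8 examples, 10 classes.
rng = np.random.default_rng(0)
student = rng.normal(size=(8, 10))
teacher = rng.normal(size=(8, 10))
labels = rng.integers(0, 10, size=8)
loss = distillation_loss(student, teacher, labels)
```

The later entries in the list replace or augment this logit-matching term (with contrastive objectives, intermediate-feature affinities, and so on), but the teacher-student blending with `alpha` and the temperature `T` reappear throughout.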