- CheXseg: Combining Expert Annotations with DNN-generated Saliency Maps for X-ray Segmentation (Feb 21, 2021) [Image Segmentation, Knowledge Distillation]
- Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching (Feb 5, 2021) [General Knowledge, Knowledge Distillation]
- ML-Doctor: Holistic Risk Assessment of Inference Attacks Against Machine Learning Models (Feb 4, 2021) [Attribute, BIG-bench Machine Learning]
- Rethinking Soft Labels for Knowledge Distillation: A Bias-Variance Tradeoff Perspective (Feb 1, 2021) [Knowledge Distillation]
- Memory-Efficient Semi-Supervised Continual Learning: The World is its Own Replay Buffer (Jan 23, 2021) [Continual Learning, Knowledge Distillation]
- SEED: Self-supervised Distillation For Visual Representation (Jan 12, 2021) [Knowledge Distillation, Self-Supervised Learning]
- Knowledge Distillation in Iterative Generative Models for Improved Sampling Speed (Jan 7, 2021) [Denoising, Image Generation]
- Self-Mutual Distillation Learning for Continuous Sign Language Recognition (Jan 1, 2021) [Knowledge Distillation, Sign Language Recognition]
- Exploring Inter-Channel Correlation for Diversity-Preserved Knowledge Distillation (Jan 1, 2021) [Diversity, Knowledge Distillation]
- Improve Object Detection with Feature-based Knowledge Distillation: Towards Accurate and Efficient Detectors (Jan 1, 2021) [Image Classification]
- Unified Mandarin TTS Front-end Based on Distilled BERT Model (Dec 31, 2020) [Knowledge Distillation, Language Modeling]
- CascadeBERT: Accelerating Inference of Pre-trained Language Models via Calibrated Complete Models Cascade (Dec 29, 2020) [Knowledge Distillation, Model Selection]
- Learning Light-Weight Translation Models from Deep Transformer (Dec 27, 2020) [Knowledge Distillation, Machine Translation]
- Invariant Teacher and Equivariant Student for Unsupervised 3D Human Pose Estimation (Dec 17, 2020) [3D Human Pose Estimation, Knowledge Distillation]
- Computation-Efficient Knowledge Distillation via Uncertainty-Aware Mixup (Dec 17, 2020) [Informativeness, Knowledge Distillation]
- Progressive Network Grafting for Few-Shot Knowledge Distillation (Dec 9, 2020) [Knowledge Distillation, Model Compression]
- Distilling Knowledge from Reader to Retriever for Question Answering (Dec 8, 2020) [Information Retrieval, Knowledge Distillation]
- DE-RRD: A Knowledge Distillation Framework for Recommender System (Dec 8, 2020) [Knowledge Distillation, Model Compression]
- Cross-Layer Distillation with Semantic Calibration (Dec 6, 2020) [Knowledge Distillation, Transfer Learning]
- What Makes a "Good" Data Augmentation in Knowledge Distillation -- A Statistical Perspective (Dec 5, 2020) [Active Learning, Data Augmentation]
- Going Beyond Classification Accuracy Metrics in Model Compression (Dec 3, 2020) [Classification, Edge Computing]
- Multi-level Knowledge Distillation via Knowledge Alignment and Correlation (Dec 1, 2020) [Contrastive Learning, Knowledge Distillation]
- Agree to Disagree: Adaptive Ensemble Knowledge Distillation in Gradient Space (Dec 1, 2020) [Diversity, Knowledge Distillation]
- Knowledge Base Embedding By Cooperative Knowledge Distillation (Dec 1, 2020) [Knowledge Distillation, Representation Learning]
- Task-Oriented Feature Distillation (Dec 1, 2020) [3D Classification, General Classification]
- KD-Lib: A PyTorch library for Knowledge Distillation, Pruning and Quantization (Nov 30, 2020) [Knowledge Distillation, Model Compression]
- Prototype-based Incremental Few-Shot Semantic Segmentation (Nov 30, 2020) [Few-Shot Semantic Segmentation, Incremental Learning]
- Channel-wise Knowledge Distillation for Dense Prediction (Nov 26, 2020) [Knowledge Distillation, Prediction]
- Multiresolution Knowledge Distillation for Anomaly Detection (Nov 22, 2020) [Anomaly Detection, Anomaly Localization]
- Evolving Search Space for Neural Architecture Search (Nov 22, 2020) [Knowledge Distillation, Neural Architecture Search]
- Head Network Distillation: Splitting Distilled Deep Neural Networks for Resource-Constrained Edge Computing Systems (Nov 20, 2020) [Edge Computing, Image Classification]
- KD3A: Unsupervised Multi-Source Decentralized Domain Adaptation via Knowledge Distillation (Nov 19, 2020) [Domain Adaptation, Knowledge Distillation]
- Anomaly Detection in Video via Self-Supervised and Multi-Task Learning (Nov 15, 2020) [Abnormal Event Detection In Video, Anomaly Detection]
- Federated Knowledge Distillation (Nov 4, 2020) [Federated Learning, Knowledge Distillation]
- Domain Adaptive Knowledge Distillation for Driving Scene Semantic Segmentation (Nov 3, 2020) [Autonomous Driving, Knowledge Distillation]
- FastFormers: Highly Efficient Transformer Models for Natural Language Understanding (Oct 26, 2020) [CPU, GPU]
- Multi-Task Learning with Shared Encoder for Non-Autoregressive Machine Translation (Oct 24, 2020) [Knowledge Distillation, Machine Translation]
- Distilling Dense Representations for Ranking using Tightly-Coupled Teachers (Oct 22, 2020) [Knowledge Distillation]
- Knowledge Distillation for BERT Unsupervised Domain Adaptation (Oct 22, 2020) [Domain Adaptation, General Classification]
- Reducing the Teacher-Student Gap via Spherical Knowledge Distillation (Oct 15, 2020) [Knowledge Distillation]
- Task Decoupled Knowledge Distillation For Lightweight Face Detectors (Oct 14, 2020) [Face Detection, Knowledge Distillation]
- Improving Efficient Neural Ranking Models with Cross-Architecture Knowledge Distillation (Oct 6, 2020) [Knowledge Distillation, Passage Ranking]
- Improving Neural Topic Models using Knowledge Distillation (Oct 5, 2020) [Knowledge Distillation, Topic Models]
- Lifelong Language Knowledge Distillation (Oct 5, 2020) [Knowledge Distillation, Language Modeling]
- Self-training Improves Pre-training for Natural Language Understanding (Oct 5, 2020) [Data Augmentation, Few-Shot Learning]
- Contrastive Distillation on Intermediate Representations for Language Model Compression (Sep 29, 2020) [Knowledge Distillation, Language Modeling]
- TinyGAN: Distilling BigGAN for Conditional Image Generation (Sep 29, 2020) [Conditional Image Generation, Image Generation]
- Densely Guided Knowledge Distillation using Multiple Teacher Assistants (Sep 18, 2020) [Knowledge Distillation, Model Compression]
- MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks (Sep 17, 2020) [Image Classification, Knowledge Distillation]
- S2SD: Simultaneous Similarity-based Self-Distillation for Deep Metric Learning (Sep 17, 2020) [Knowledge Distillation, Metric Learning]