Collaborative Teacher-Student Learning via Multiple Knowledge Transfer (Jan 21, 2021) [Knowledge Distillation, Model Compression]
Bridging the gap between Human Action Recognition and Online Action Detection (Jan 21, 2021) [Action Detection, Action Recognition]
Deep Epidemiological Modeling by Black-box Knowledge Distillation: An Accurate Deep Learning Model for COVID-19 (Jan 20, 2021) [Diversity, Knowledge Distillation]
Learning to Augment for Data-Scarce Domain BERT Knowledge Distillation (Jan 20, 2021) [Knowledge Distillation]
Incremental Knowledge Based Question Answering (Jan 18, 2021) [Incremental Learning, Knowledge Distillation]
Knowledge Distillation Methods for Efficient Unsupervised Adaptation Across Multiple Domains (Jan 18, 2021) [Domain Adaptation, Image Classification]
Mining Data Impressions from Deep Models as Substitute for the Unavailable Training Data (Jan 15, 2021) [Adversarial Robustness, Continual Learning]
KDLSQ-BERT: A Quantized Bert Combining Knowledge Distillation with Learned Step Size Quantization (Jan 15, 2021) [Knowledge Distillation, Language Modelling]
Interpretable discovery of new semiconductors with machine learning (Jan 12, 2021) [BIG-bench Machine Learning, Knowledge Distillation]
Resolution-Based Distillation for Efficient Histology Image Classification (Jan 11, 2021) [Classification, Computational Efficiency]
Adversarially Robust and Explainable Model Compression with On-Device Personalization for Text Classification (Jan 10, 2021) [Adversarial Robustness, General Classification]
Spending Your Winning Lottery Better After Drawing It (Jan 8, 2021) [Knowledge Distillation]
MSD: Saliency-aware Knowledge Distillation for Multimodal Understanding (Jan 6, 2021) [Knowledge Distillation, Meta-Learning] [Code available]
Label Augmentation via Time-based Knowledge Distillation for Financial Anomaly Detection (Jan 5, 2021) [Anomaly Detection, Knowledge Distillation]
Knowledge distillation via softmax regression representation learning (Jan 1, 2021) [Knowledge Distillation, Model Compression]
Robust Overfitting may be mitigated by properly learned smoothening (Jan 1, 2021) [Knowledge Distillation]
FLAR: A Unified Prototype Framework for Few-Sample Lifelong Active Recognition (Jan 1, 2021) [Knowledge Distillation, Lifelong Learning]
Understanding Adversarial Attacks on Autoencoders (Jan 1, 2021) [Compressive Sensing, Knowledge Distillation]
Understanding Knowledge Distillation (Jan 1, 2021) [Knowledge Distillation]
Long Live the Lottery: The Existence of Winning Tickets in Lifelong Learning (Jan 1, 2021) [Class-Incremental Learning]
Disentanglement, Visualization and Analysis of Complex Features in DNNs (Jan 1, 2021) [Disentanglement, Knowledge Distillation]
Unpaired Learning for Deep Image Deraining With Rain Direction Regularizer (Jan 1, 2021) [Knowledge Distillation, Rain Removal]
Explicit Connection Distillation (Jan 1, 2021) [Image Classification]
Rethinking Soft Labels for Knowledge Distillation: A Bias–Variance Tradeoff Perspective (Jan 1, 2021) [Knowledge Distillation]
Student Customized Knowledge Distillation: Bridging the Gap Between Student and Teacher (Jan 1, 2021) [Image Classification]
Learning from deep model via exploring local targets (Jan 1, 2021) [Knowledge Distillation]
Improving De-Raining Generalization via Neural Reorganization (Jan 1, 2021) [Knowledge Distillation]
Can Students Outperform Teachers in Knowledge Distillation based Model Compression? (Jan 1, 2021) [Knowledge Distillation, Model Compression]
Contextual Knowledge Distillation for Transformer Compression (Jan 1, 2021) [Knowledge Distillation, Language Modeling]
Don't be picky, all students in the right family can learn from good teachers (Jan 1, 2021) [Bayesian Optimization]
Active Learning for Lane Detection: A Knowledge Distillation Approach (Jan 1, 2021) [2D Object Detection, Active Learning]
Knowledge Distillation based Ensemble Learning for Neural Machine Translation (Jan 1, 2021) [Ensemble Learning, Knowledge Distillation]
Distilling Global and Local Logits With Densely Connected Relations (Jan 1, 2021) [Image Classification]
Kernel Methods in Hyperbolic Spaces (Jan 1, 2021) [Few-Shot Learning, Image Classification] [Code available]
Fully Synthetic Data Improves Neural Machine Translation with Knowledge Distillation (Dec 31, 2020) [Knowledge Distillation, Machine Translation]
Towards Zero-Shot Knowledge Distillation for Natural Language Processing (Dec 31, 2020) [Knowledge Distillation, Model Compression]
Knowledge Distillation with Adaptive Asymmetric Label Sharpening for Semi-supervised Fracture Detection in Chest X-rays (Dec 30, 2020) [Fracture Detection, Knowledge Distillation]
Understanding and Improving Lexical Choice in Non-Autoregressive Translation (Dec 29, 2020) [Knowledge Distillation, Translation]
ALP-KD: Attention-Based Layer Projection for Knowledge Distillation (Dec 27, 2020) [Knowledge Distillation]
Towards a Universal Continuous Knowledge Base (Dec 25, 2020) [Knowledge Distillation, Text Classification]
Future-Guided Incremental Transformer for Simultaneous Translation (Dec 23, 2020) [Knowledge Distillation, Translation]
AttentionLite: Towards Efficient Self-Attention Models for Vision (Dec 21, 2020) [Knowledge Distillation]
Diverse Knowledge Distillation for End-to-End Person Search (Dec 21, 2020) [Human Detection, Knowledge Distillation]
Towards Understanding Ensemble, Knowledge Distillation and Self-Distillation in Deep Learning (Dec 17, 2020) [Deep Learning, Knowledge Distillation]
Distilling Optimal Neural Networks: Rapid Search in Diverse Spaces (Dec 16, 2020) [GPU, Knowledge Distillation]
Wasserstein Contrastive Representation Distillation (Dec 15, 2020) [Contrastive Learning, Knowledge Distillation]
LRC-BERT: Latent-representation Contrastive Knowledge Distillation for Natural Language Understanding (Dec 14, 2020) [Contrastive Learning, Knowledge Distillation]
Periocular Embedding Learning with Consistent Knowledge Distillation from Face (Dec 12, 2020) [Knowledge Distillation, Prediction]
Improving Task-Agnostic BERT Distillation with Layer Mapping Search (Dec 11, 2020) [Knowledge Distillation]
Reinforced Multi-Teacher Selection for Knowledge Distillation (Dec 11, 2020) [GPU, Knowledge Distillation]