Adversarially Robust and Explainable Model Compression with On-Device Personalization for Text Classification (Jan 10, 2021). Adversarial Robustness, General Classification.
Spending Your Winning Lottery Better After Drawing It (Jan 8, 2021). Knowledge Distillation.
Knowledge Distillation in Iterative Generative Models for Improved Sampling Speed (Jan 7, 2021). Denoising, Image Generation. [Code Available]
MSD: Saliency-aware Knowledge Distillation for Multimodal Understanding (Jan 6, 2021). Knowledge Distillation, Meta-Learning. [Code Available]
Label Augmentation via Time-based Knowledge Distillation for Financial Anomaly Detection (Jan 5, 2021). Anomaly Detection, Knowledge Distillation.
Self-Mutual Distillation Learning for Continuous Sign Language Recognition (Jan 1, 2021). Knowledge Distillation, Sign Language Recognition.
FLAR: A Unified Prototype Framework for Few-Sample Lifelong Active Recognition (Jan 1, 2021). Knowledge Distillation, Lifelong Learning. [Code Available]
Unpaired Learning for Deep Image Deraining With Rain Direction Regularizer (Jan 1, 2021). Knowledge Distillation, Rain Removal.
Kernel Methods in Hyperbolic Spaces (Jan 1, 2021). Few-Shot Learning, Image Classification.
Exploring Inter-Channel Correlation for Diversity-Preserved Knowledge Distillation (Jan 1, 2021). Diversity, Knowledge Distillation.
Active Learning for Lane Detection: A Knowledge Distillation Approach (Jan 1, 2021). 2D Object Detection, Active Learning. [Code Available]
Student Customized Knowledge Distillation: Bridging the Gap Between Student and Teacher (Jan 1, 2021). Image Classification.
Improving De-Raining Generalization via Neural Reorganization (Jan 1, 2021). Knowledge Distillation.
Distilling Global and Local Logits With Densely Connected Relations (Jan 1, 2021). Image Classification.
Rethinking Soft Labels for Knowledge Distillation: A Bias–Variance Tradeoff Perspective (Jan 1, 2021). Knowledge Distillation. [Code Available]
Disentanglement, Visualization and Analysis of Complex Features in DNNs (Jan 1, 2021). Disentanglement, Knowledge Distillation.
Improve Object Detection with Feature-based Knowledge Distillation: Towards Accurate and Efficient Detectors (Jan 1, 2021). Image Classification.
Can Students Outperform Teachers in Knowledge Distillation based Model Compression? (Jan 1, 2021). Knowledge Distillation, Model Compression. [Code Available]
Contextual Knowledge Distillation for Transformer Compression (Jan 1, 2021). Knowledge Distillation, Language Modeling.
Explicit Connection Distillation (Jan 1, 2021). Image Classification.
Knowledge distillation via softmax regression representation learning (Jan 1, 2021). Knowledge Distillation, Model Compression.
Knowledge Distillation based Ensemble Learning for Neural Machine Translation (Jan 1, 2021). Ensemble Learning, Knowledge Distillation.
Learning from deep model via exploring local targets (Jan 1, 2021). Knowledge Distillation.
Understanding Adversarial Attacks on Autoencoders (Jan 1, 2021). Compressive Sensing, Knowledge Distillation.
Long Live the Lottery: The Existence of Winning Tickets in Lifelong Learning (Jan 1, 2021). Class Incremental Learning.
Robust Overfitting may be mitigated by properly learned smoothening (Jan 1, 2021). Knowledge Distillation.
Don't be picky, all students in the right family can learn from good teachers (Jan 1, 2021). Bayesian Optimization.
Understanding Knowledge Distillation (Jan 1, 2021). Knowledge Distillation.
Unified Mandarin TTS Front-end Based on Distilled BERT Model (Dec 31, 2020). Knowledge Distillation, Language Modeling.
Towards Zero-Shot Knowledge Distillation for Natural Language Processing (Dec 31, 2020). Knowledge Distillation, Model Compression. [Code Available]
Fully Synthetic Data Improves Neural Machine Translation with Knowledge Distillation (Dec 31, 2020). Knowledge Distillation, Machine Translation.
Knowledge Distillation with Adaptive Asymmetric Label Sharpening for Semi-supervised Fracture Detection in Chest X-rays (Dec 30, 2020). Fracture Detection, Knowledge Distillation.
Understanding and Improving Lexical Choice in Non-Autoregressive Translation (Dec 29, 2020). Knowledge Distillation, Translation.
CascadeBERT: Accelerating Inference of Pre-trained Language Models via Calibrated Complete Models Cascade (Dec 29, 2020). Knowledge Distillation, Model Selection.
Learning Light-Weight Translation Models from Deep Transformer (Dec 27, 2020). Knowledge Distillation, Machine Translation. [Code Available]
ALP-KD: Attention-Based Layer Projection for Knowledge Distillation (Dec 27, 2020). Knowledge Distillation. [Code Available]
Towards a Universal Continuous Knowledge Base (Dec 25, 2020). Knowledge Distillation, Text Classification.
Future-Guided Incremental Transformer for Simultaneous Translation (Dec 23, 2020). Knowledge Distillation, Translation.
AttentionLite: Towards Efficient Self-Attention Models for Vision (Dec 21, 2020). Knowledge Distillation.
Diverse Knowledge Distillation for End-to-End Person Search (Dec 21, 2020). Human Detection, Knowledge Distillation.
Computation-Efficient Knowledge Distillation via Uncertainty-Aware Mixup (Dec 17, 2020). Informativeness, Knowledge Distillation.
Invariant Teacher and Equivariant Student for Unsupervised 3D Human Pose Estimation (Dec 17, 2020). 3D Human Pose Estimation, Knowledge Distillation. [Code Available]
Towards Understanding Ensemble, Knowledge Distillation and Self-Distillation in Deep Learning (Dec 17, 2020). Deep Learning, Knowledge Distillation. [Code Available]
Distilling Optimal Neural Networks: Rapid Search in Diverse Spaces (Dec 16, 2020). GPU, Knowledge Distillation.
Wasserstein Contrastive Representation Distillation (Dec 15, 2020). Contrastive Learning, Knowledge Distillation.
LRC-BERT: Latent-representation Contrastive Knowledge Distillation for Natural Language Understanding (Dec 14, 2020). Contrastive Learning, Knowledge Distillation.
Periocular Embedding Learning with Consistent Knowledge Distillation from Face (Dec 12, 2020). Knowledge Distillation, Prediction.
Improving Task-Agnostic BERT Distillation with Layer Mapping Search (Dec 11, 2020). Knowledge Distillation.
Reinforced Multi-Teacher Selection for Knowledge Distillation (Dec 11, 2020). GPU, Knowledge Distillation.
Large-Scale Generative Data-Free Distillation (Dec 10, 2020). Knowledge Distillation, Model Compression.