Parser-Free Virtual Try-on via Distilling Appearance Flows (Mar 8, 2021) [code available] Tags: Human Parsing, Knowledge Distillation
Semantic-aware Knowledge Distillation for Few-Shot Class-Incremental Learning (Mar 6, 2021) [no verified code] Tags: Class-Incremental Learning
Adaptive Multi-Teacher Multi-level Knowledge Distillation (Mar 6, 2021) [code available] Tags: Knowledge Distillation
Teachers Do More Than Teach: Compressing Image-to-Image Models (Mar 5, 2021) [code available] Tags: Knowledge Distillation
Distributed Dynamic Map Fusion via Federated Learning for Intelligent Networked Vehicles (Mar 5, 2021) [code available] Tags: Federated Learning, Knowledge Distillation
Deep Neural Network Models Compression (Mar 4, 2021) [no verified code] Tags: Knowledge Distillation, Quantization
Extract the Knowledge of Graph Neural Networks and Go Beyond it: An Effective Knowledge Distillation Framework (Mar 4, 2021) [code available] Tags: Knowledge Distillation, Node Classification
General Instance Distillation for Object Detection (Mar 3, 2021) [code available] Tags: Knowledge Distillation, Model Compression
Feature-Align Network with Knowledge Distillation for Efficient Denoising (Mar 2, 2021) [no verified code] Tags: Decoder, Denoising
Exploring Complementary Strengths of Invariant and Equivariant Representations for Few-Shot Learning (Mar 1, 2021) [code available] Tags: Few-Shot Image Classification, Few-Shot Learning
Embedded Knowledge Distillation in Depth-Level Dynamic Neural Network (Mar 1, 2021) [no verified code] Tags: Dynamic Neural Networks, Knowledge Distillation
Training Generative Adversarial Networks in One Stage (Feb 28, 2021) [code available] Tags: Data-Free Knowledge Distillation, Image Generation
Alignment Knowledge Distillation for Online Streaming Attention-based Speech Recognition (Feb 28, 2021) [no verified code] Tags: Automatic Speech Recognition (ASR)
Distilling Knowledge via Intermediate Classifiers (Feb 28, 2021) [code available] Tags: Knowledge Distillation, Transfer Learning
PURSUhInT: In Search of Informative Hint Points Based on Layer Clustering for Knowledge Distillation (Feb 26, 2021) [no verified code] Tags: Clustering, Knowledge Distillation
Knowledge Distillation Circumvents Nonlinearity for Optical Convolutional Neural Networks (Feb 26, 2021) [no verified code] Tags: Computational Efficiency, Knowledge Distillation
Even your Teacher Needs Guidance: Ground-Truth Targets Dampen Regularization Imposed by Self-Distillation (Feb 25, 2021) [code available] Tags: Knowledge Distillation, Self-Knowledge Distillation
Localization Distillation for Dense Object Detection (Feb 24, 2021) [code available] Tags: Dense Object Detection, Knowledge Distillation
Enhancing Data-Free Adversarial Distillation with Activation Regularization and Virtual Interpolation (Feb 23, 2021) [no verified code] Tags: Knowledge Distillation
Multi-View Feature Representation for Dialogue Generation with Bidirectional Distillation (Feb 22, 2021) [no verified code] Tags: Dialogue Generation, General Knowledge
CheXseg: Combining Expert Annotations with DNN-generated Saliency Maps for X-ray Segmentation (Feb 21, 2021) [code available] Tags: Image Segmentation, Knowledge Distillation
Exploring Knowledge Distillation of a Deep Neural Network for Multi-Script Identification (Feb 20, 2021) [no verified code] Tags: Knowledge Distillation, Transfer Learning
End-to-End Automatic Speech Recognition with Deep Mutual Learning (Feb 16, 2021) [no verified code] Tags: Automatic Speech Recognition (ASR)
Hierarchical Transformer-based Large-Context End-to-end ASR with Large-Context Knowledge Distillation (Feb 16, 2021) [no verified code] Tags: Automatic Speech Recognition (ASR)
Improved Customer Transaction Classification using Semi-Supervised Knowledge Distillation (Feb 15, 2021) [no verified code] Tags: Classification
CAP-GAN: Towards Adversarial Robustness with Cycle-consistent Attentional Purification (Feb 15, 2021) [no verified code] Tags: Adversarial Attack, Adversarial Robustness
Leveraging Acoustic and Linguistic Embeddings from Pretrained Speech and Language Models for Intent Classification (Feb 15, 2021) [no verified code] Tags: Classification
Self Regulated Learning Mechanism for Data Efficient Knowledge Distillation (Feb 14, 2021) [no verified code] Tags: Knowledge Distillation, Transfer Learning
Learning Student-Friendly Teacher Networks for Knowledge Distillation (Feb 12, 2021) [no verified code] Tags: Knowledge Distillation, Transfer Learning
Semantically-Conditioned Negative Samples for Efficient Contrastive Learning (Feb 12, 2021) [no verified code] Tags: Contrastive Learning, Knowledge Distillation
NewsBERT: Distilling Pre-trained Language Model for Intelligent News Application (Feb 9, 2021) [no verified code] Tags: Articles, Knowledge Distillation
Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching (Feb 5, 2021) [code available] Tags: General Knowledge, Knowledge Distillation
ML-Doctor: Holistic Risk Assessment of Inference Attacks Against Machine Learning Models (Feb 4, 2021) [code available] Tags: Attribute, BIG-bench Machine Learning
Do Not Forget to Attend to Uncertainty while Mitigating Catastrophic Forgetting (Feb 3, 2021) [no verified code] Tags: Deep Learning, Incremental Learning
Rethinking Soft Labels for Knowledge Distillation: A Bias-Variance Tradeoff Perspective (Feb 1, 2021) [code available] Tags: Knowledge Distillation
Evolutionary Generative Adversarial Networks with Crossover Based Knowledge Distillation (Jan 27, 2021) [code available] Tags: Knowledge Distillation
ISP Distillation (Jan 25, 2021) [no verified code] Tags: Knowledge Distillation, Object Recognition
Network-Agnostic Knowledge Transfer for Medical Image Segmentation (Jan 23, 2021) [no verified code] Tags: Image Segmentation, Knowledge Distillation
Memory-Efficient Semi-Supervised Continual Learning: The World is its Own Replay Buffer (Jan 23, 2021) [code available] Tags: Continual Learning, Knowledge Distillation
Bridging the gap between Human Action Recognition and Online Action Detection (Jan 21, 2021) [no verified code] Tags: Action Detection, Action Recognition
Collaborative Teacher-Student Learning via Multiple Knowledge Transfer (Jan 21, 2021) [no verified code] Tags: Knowledge Distillation, Model Compression
Deep Epidemiological Modeling by Black-box Knowledge Distillation: An Accurate Deep Learning Model for COVID-19 (Jan 20, 2021) [no verified code] Tags: Diversity, Knowledge Distillation
Learning to Augment for Data-Scarce Domain BERT Knowledge Distillation (Jan 20, 2021) [no verified code] Tags: Knowledge Distillation
Knowledge Distillation Methods for Efficient Unsupervised Adaptation Across Multiple Domains (Jan 18, 2021) [no verified code] Tags: Domain Adaptation, Image Classification
Incremental Knowledge Based Question Answering (Jan 18, 2021) [no verified code] Tags: Incremental Learning, Knowledge Distillation
KDLSQ-BERT: A Quantized Bert Combining Knowledge Distillation with Learned Step Size Quantization (Jan 15, 2021) [no verified code] Tags: Knowledge Distillation, Language Modelling
Mining Data Impressions from Deep Models as Substitute for the Unavailable Training Data (Jan 15, 2021) [no verified code] Tags: Adversarial Robustness, Continual Learning
SEED: Self-supervised Distillation For Visual Representation (Jan 12, 2021) [code available] Tags: Knowledge Distillation, Self-Supervised Learning
Interpretable discovery of new semiconductors with machine learning (Jan 12, 2021) [no verified code] Tags: BIG-bench Machine Learning, Knowledge Distillation
Resolution-Based Distillation for Efficient Histology Image Classification (Jan 11, 2021) [no verified code] Tags: Classification, Computational Efficiency
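Most entries above build on some variant of the soft-label distillation objective popularized by Hinton et al. (2015). As a reference point for skimming the list, here is a minimal sketch of that baseline loss, assuming PyTorch; the function name, temperature T, and weight alpha are illustrative defaults, not any listed paper's method:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Classic soft-label KD loss (Hinton et al., 2015); a sketch, not any
    specific paper's method. T and alpha are illustrative defaults."""
    # KL divergence between temperature-softened student and teacher
    # distributions; the T*T factor rescales gradients so the soft term
    # stays on a comparable scale as T grows.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Illustrative call with random logits for a 10-class problem.
s = torch.randn(8, 10)           # student logits (batch of 8)
t = torch.randn(8, 10)           # teacher logits
y = torch.randint(0, 10, (8,))   # ground-truth labels
loss = distillation_loss(s, t, y)
```

Many of the papers listed here keep this objective and instead vary what is matched between teacher and student (intermediate features, attention maps, localization outputs) or where the teacher's knowledge comes from (data-free, self-, or mutual distillation).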