- Topic Modeling for Maternal Health Using Reddit (Apr 1, 2021). Tags: Knowledge Distillation.
- Decentralized and Model-Free Federated Learning: Consensus-Based Distillation in Function Space (Apr 1, 2021). Tags: Federated Learning, Knowledge Distillation.
- Unsupervised Domain Expansion for Visual Categorization (Apr 1, 2021). Tags: Domain Adaptation, Knowledge Distillation.
- Is Label Smoothing Truly Incompatible with Knowledge Distillation: An Empirical Study (Apr 1, 2021). Tags: Image Classification. Code available.
- Fixing the Teacher-Student Knowledge Discrepancy in Distillation (Mar 31, 2021). Tags: Image Classification.
- Knowledge Distillation By Sparse Representation Matching (Mar 31, 2021). Tags: Knowledge Distillation, Representation Learning.
- Industry Scale Semi-Supervised Learning for Natural Language Understanding (Mar 29, 2021). Tags: Intent Classification. Code available.
- Distilling Virtual Examples for Long-tailed Recognition (Mar 28, 2021). Tags: Knowledge Distillation, Long-tail Learning.
- KnowRU: Knowledge Reusing via Knowledge Distillation in Multi-agent Reinforcement Learning (Mar 27, 2021). Tags: Deep Reinforcement Learning, Knowledge Distillation. Code available.
- Weakly-Supervised Domain Adaptation of Deep Regression Trackers via Reinforced Knowledge Distillation (Mar 26, 2021). Tags: Domain Adaptation, Knowledge Distillation.
- Hands-on Guidance for Distilling Object Detectors (Mar 26, 2021). Tags: Knowledge Distillation, Object.
- Leaning Compact and Representative Features for Cross-Modality Person Re-Identification (Mar 26, 2021). Tags: Cross-Modality Person Re-identification, Knowledge Distillation.
- A Practical Survey on Faster and Lighter Transformers (Mar 26, 2021). Tags: Knowledge Distillation, Survey. Code available.
- Spirit Distillation: Precise Real-time Semantic Segmentation of Road Scenes with Insufficient Data (Mar 25, 2021). Tags: Autonomous Driving, Few-Shot Learning.
- The NLP Cookbook: Modern Recipes for Transformer based Deep Learning Architectures (Mar 23, 2021). Tags: Information Retrieval, Knowledge Distillation.
- Student Network Learning via Evolutionary Knowledge Distillation (Mar 23, 2021). Tags: Knowledge Distillation, Transfer Learning.
- Balanced softmax cross-entropy for incremental learning with and without memory (Mar 23, 2021). Tags: Class-Incremental Learning.
- Compacting Deep Neural Networks for Internet of Things: Methods and Applications (Mar 20, 2021). Tags: Diversity, Knowledge Distillation.
- Online Lifelong Generalized Zero-Shot Learning (Mar 19, 2021). Tags: Continual Learning, Generalized Zero-Shot Learning.
- Variational Knowledge Distillation for Disease Classification in Chest X-Rays (Mar 19, 2021). Tags: General Classification. Code available.
- Cost-effective Deployment of BERT Models in Serverless Environment (Mar 19, 2021). Tags: Knowledge Distillation, Semantic Textual Similarity.
- Similarity Transfer for Knowledge Distillation (Mar 18, 2021). Tags: Knowledge Distillation.
- Transformer-based ASR Incorporating Time-reduction Layer and Fine-tuning with Self-Knowledge Distillation (Mar 17, 2021). Tags: Automatic Speech Recognition (ASR).
- Leveraging Recent Advances in Deep Learning for Audio-Visual Emotion Recognition (Mar 16, 2021). Tags: Deep Learning, Emotion Recognition.
- Robustly Optimized and Distilled Training for Natural Language Understanding (Mar 16, 2021). Tags: Knowledge Distillation, Machine Reading Comprehension.
- Robust Model Compression Using Deep Hypotheses (Mar 13, 2021). Tags: Binary Classification, Knowledge Distillation.
- A New Training Framework for Deep Neural Network (Mar 12, 2021). Tags: Knowledge Distillation. Code available.
- Semantic-aware Knowledge Distillation for Few-Shot Class-Incremental Learning (Mar 6, 2021). Tags: Class-Incremental Learning.
- Deep Neural Network Models Compression (Mar 4, 2021). Tags: Knowledge Distillation, Quantization.
- Feature-Align Network with Knowledge Distillation for Efficient Denoising (Mar 2, 2021). Tags: Decoder, Denoising.
- Embedded Knowledge Distillation in Depth-Level Dynamic Neural Network (Mar 1, 2021). Tags: Dynamic Neural Networks, Knowledge Distillation.
- Alignment Knowledge Distillation for Online Streaming Attention-based Speech Recognition (Feb 28, 2021). Tags: Automatic Speech Recognition (ASR).
- PURSUhInT: In Search of Informative Hint Points Based on Layer Clustering for Knowledge Distillation (Feb 26, 2021). Tags: Clustering, Knowledge Distillation.
- Knowledge Distillation Circumvents Nonlinearity for Optical Convolutional Neural Networks (Feb 26, 2021). Tags: Computational Efficiency, Knowledge Distillation.
- Enhancing Data-Free Adversarial Distillation with Activation Regularization and Virtual Interpolation (Feb 23, 2021). Tags: Knowledge Distillation.
- Multi-View Feature Representation for Dialogue Generation with Bidirectional Distillation (Feb 22, 2021). Tags: Dialogue Generation, General Knowledge.
- Exploring Knowledge Distillation of a Deep Neural Network for Multi-Script Identification (Feb 20, 2021). Tags: Knowledge Distillation, Transfer Learning.
- Hierarchical Transformer-based Large-Context End-to-end ASR with Large-Context Knowledge Distillation (Feb 16, 2021). Tags: Automatic Speech Recognition (ASR).
- End-to-End Automatic Speech Recognition with Deep Mutual Learning (Feb 16, 2021). Tags: Automatic Speech Recognition (ASR).
- CAP-GAN: Towards Adversarial Robustness with Cycle-consistent Attentional Purification (Feb 15, 2021). Tags: Adversarial Attack, Adversarial Robustness.
- Leveraging Acoustic and Linguistic Embeddings from Pretrained Speech and Language Models for Intent Classification (Feb 15, 2021). Tags: General Classification.
- Improved Customer Transaction Classification using Semi-Supervised Knowledge Distillation (Feb 15, 2021). Tags: General Classification.
- Self Regulated Learning Mechanism for Data Efficient Knowledge Distillation (Feb 14, 2021). Tags: Knowledge Distillation, Transfer Learning.
- Semantically-Conditioned Negative Samples for Efficient Contrastive Learning (Feb 12, 2021). Tags: Contrastive Learning, Knowledge Distillation.
- Learning Student-Friendly Teacher Networks for Knowledge Distillation (Feb 12, 2021). Tags: Knowledge Distillation, Transfer Learning.
- NewsBERT: Distilling Pre-trained Language Model for Intelligent News Application (Feb 9, 2021). Tags: Articles, Knowledge Distillation.
- Do Not Forget to Attend to Uncertainty while Mitigating Catastrophic Forgetting (Feb 3, 2021). Tags: Deep Learning, Incremental Learning.
- Evolutionary Generative Adversarial Networks with Crossover Based Knowledge Distillation (Jan 27, 2021). Tags: Knowledge Distillation.
- ISP Distillation (Jan 25, 2021). Tags: Knowledge Distillation, Object Recognition. Code available.
- Network-Agnostic Knowledge Transfer for Medical Image Segmentation (Jan 23, 2021). Tags: Image Segmentation, Knowledge Distillation.