Fair Feature Distillation for Visual Recognition (May 27, 2021). Tags: Fairness, Knowledge Distillation.
How Does Distilled Data Complexity Impact the Quality and Confidence of Non-Autoregressive Machine Translation? (May 27, 2021). Tags: Diversity, Knowledge Distillation.
KnowSR: Knowledge Sharing among Homogeneous Agents in Multi-agent Reinforcement Learning (May 25, 2021). Tags: Deep Reinforcement Learning, Knowledge Distillation.
Real-time Monocular Depth Estimation with Sparse Supervision on Mobile (May 25, 2021). Tags: Autonomous Vehicles, Depth Estimation.
Experimenting with Knowledge Distillation techniques for performing Brain Tumor Segmentation (May 24, 2021). Tags: Brain Tumor Segmentation, Knowledge Distillation.
AirNet: Neural Network Transmission over the Air (May 24, 2021). Tags: Knowledge Distillation.
Revisiting Knowledge Distillation for Object Detection (May 22, 2021). Tags: Domain Adaptation, Knowledge Distillation.
Inplace knowledge distillation with teacher assistant for improved training of flexible deep neural networks (May 18, 2021). Tags: Image Classification.
Weakly Supervised Dense Video Captioning via Jointly Usage of Knowledge Distillation and Cross-modal Matching (May 18, 2021). Tags: Caption Generation, Cross-Modal Retrieval.
Class-Incremental Few-Shot Object Detection (May 17, 2021). Tags: Clustering, Few-Shot Object Detection.
Stacked Acoustic-and-Textual Encoding: Integrating the Pre-trained Models into Speech Translation Encoders (May 12, 2021). Tags: Automatic Speech Recognition (ASR).
KDExplainer: A Task-oriented Attention Model for Explaining Knowledge Distillation (May 10, 2021). Tags: Knowledge Distillation, Mixture-of-Experts.
Test-Time Adaptation Toward Personalized Speech Enhancement: Zero-Shot Learning with Knowledge Distillation (May 8, 2021). Tags: Denoising, Knowledge Distillation.
Regression Bugs Are In Your Model! Measuring, Reducing and Analyzing Regressions In NLP Model Updates (May 7, 2021). Tags: Knowledge Distillation.
Black-Box Dissector: Towards Erasing-based Hard-Label Model Stealing Attack (May 3, 2021). Tags: Knowledge Distillation, Self-Knowledge Distillation.
A Peek Into the Reasoning of Neural Networks: Interpreting with Structural Visual Concepts (May 1, 2021). Tags: Explainable Artificial Intelligence, Knowledge Distillation.
Knowledge Distillation for Swedish NER models: A Search for Performance and Efficiency (May 1, 2021). Tags: Knowledge Distillation, Model Compression.
Contrastive Conditioning for Assessing Disambiguation in MT: A Case Study of Distilled Bias (May 1, 2021). Tags: Knowledge Distillation, Machine Translation.
Semantic Relation Preserving Knowledge Distillation for Image-to-Image Translation (Apr 30, 2021). Tags: Image-to-Image Translation, Knowledge Distillation. [Code available]
Distilling EEG Representations via Capsules for Affective Computing (Apr 30, 2021). Tags: Electroencephalogram (EEG).
LIDAR and Position-Aided mmWave Beam Selection with Non-local CNNs and Curriculum Training (Apr 29, 2021). Tags: Knowledge Distillation, Position.
Spirit Distillation: A Model Compression Method with Multi-domain Knowledge Transfer (Apr 29, 2021). Tags: General Knowledge, Knowledge Distillation. [Code available]
Interpretable Embedding Procedure Knowledge Transfer via Stacked Principal Component Analysis and Graph Neural Network (Apr 28, 2021). Tags: Graph Neural Network, Knowledge Distillation.
Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification (Apr 27, 2021). Tags: Classification, General Classification. [Code available]
Extract then Distill: Efficient and Effective Task-Agnostic BERT Distillation (Apr 24, 2021). Tags: Knowledge Distillation.
Relational Subsets Knowledge Distillation for Long-tailed Retinal Diseases Recognition (Apr 22, 2021). Tags: Knowledge Distillation.
Orderly Dual-Teacher Knowledge Distillation for Lightweight Human Pose Estimation (Apr 21, 2021). Tags: Binarization, Knowledge Distillation.
Brittle Features May Help Anomaly Detection (Apr 21, 2021). Tags: Anomaly Detection, Knowledge Distillation.
Knowledge Distillation as Semiparametric Inference (Apr 20, 2021). Tags: Knowledge Distillation, Model Compression.
EduPal leaves no professor behind: Supporting faculty via a peer-powered recommender system (Apr 20, 2021). Tags: Chatbot, Knowledge Distillation. [Code available]
Compact CNN Structure Learning by Knowledge Distillation (Apr 19, 2021). Tags: Knowledge Distillation, Model Compression.
Continual Learning for Fake Audio Detection (Apr 15, 2021). Tags: Continual Learning, Knowledge Distillation.
Integration of Pre-trained Networks with Continuous Token Interface for End-to-End Spoken Language Understanding (Apr 15, 2021). Tags: Intent Classification.
Unsupervised Continual Learning Via Pseudo Labels (Apr 14, 2021). Tags: Clustering, Continual Learning.
The Curious Case of Hallucinations in Neural Machine Translation (Apr 14, 2021). Tags: Hallucination, Knowledge Distillation.
Sentence Embeddings by Ensemble Distillation (Apr 14, 2021). Tags: Knowledge Distillation, Semantic Textual Similarity. [Code available]
Annealing Knowledge Distillation (Apr 14, 2021). Tags: Image Classification.
Dealing with Missing Modalities in the Visual Question Answer-Difference Prediction Task through Knowledge Distillation (Apr 13, 2021). Tags: Knowledge Distillation, Triplet. [Code available]
Source and Target Bidirectional Knowledge Distillation for End-to-end Speech Translation (Apr 13, 2021). Tags: Automatic Speech Recognition (ASR).
RankDistil: Knowledge Distillation for Ranking (Apr 13, 2021). Tags: Document Ranking, Knowledge Distillation.
CXR Segmentation by AdaIN-based Domain Adaptation and Knowledge Distillation (Apr 13, 2021). Tags: Domain Adaptation, Knowledge Distillation.
Dual Discriminator Adversarial Distillation for Data-free Model Compression (Apr 12, 2021). Tags: Data-free Knowledge Distillation, Knowledge Distillation. [Code available]
Data-Free Knowledge Distillation with Soft Targeted Transfer Set Synthesis (Apr 10, 2021). Tags: Data-free Knowledge Distillation, Knowledge Distillation.
Towards Enabling Meta-Learning from Target Models (Apr 8, 2021). Tags: Few-Shot Learning, Inductive Bias.
GKD: Semi-supervised Graph Knowledge Distillation for Graph-Independent Inference (Apr 8, 2021). Tags: Disease Prediction, Graph Construction. [Code available]
Distilling and Transferring Knowledge via cGAN-generated Samples for Image Classification and Regression (Apr 7, 2021). Tags: General Classification, Image Classification. [Code available]
Compressing Visual-linguistic Model via Knowledge Distillation (Apr 5, 2021). Tags: Image Captioning, Knowledge Distillation. [Code available]
Knowledge Distillation For Wireless Edge Learning (Apr 3, 2021). Tags: Cloud Computing, Federated Learning.
Students are the Best Teacher: Exit-Ensemble Distillation with Multi-Exits (Apr 1, 2021). Tags: Classification, General Classification. [Code available]
Dialect Identification through Adversarial Learning and Knowledge Distillation on Romanian BERT (Apr 1, 2021). Tags: Automatic Speech Recognition (ASR). [Code available]