- Contrastive Model Inversion for Data-Free Knowledge Distillation (May 18, 2021) [code available]: Contrastive Learning, Data-Free Knowledge Distillation
- Class-Incremental Few-Shot Object Detection (May 17, 2021) [code available]: Clustering, Few-Shot Object Detection
- Undistillable: Making A Nasty Teacher That CANNOT teach students (May 16, 2021): Knowledge Distillation
- Graph-Free Knowledge Distillation for Graph Neural Networks (May 16, 2021) [code available]: Knowledge Distillation, Transfer Learning
- AgeFlow: Conditional Age Progression and Regression with Normalizing Flows (May 15, 2021) [code available]: Attribute, Knowledge Distillation
- Boosting Light-Weight Depth Estimation Via Knowledge Distillation (May 13, 2021) [code available]: Computational Efficiency, Depth Estimation
- When Human Pose Estimation Meets Robustness: Adversarial Algorithms and Benchmarks (May 13, 2021) [code available]: Knowledge Distillation, Pose Estimation
- MATE-KD: Masked Adversarial TExt, a Companion to Knowledge Distillation (May 12, 2021) [code available]: Adversarial Text, Data Augmentation
- Stacked Acoustic-and-Textual Encoding: Integrating the Pre-trained Models into Speech Translation Encoders (May 12, 2021) [code available]: Automatic Speech Recognition (ASR)
- KDExplainer: A Task-oriented Attention Model for Explaining Knowledge Distillation (May 10, 2021): Knowledge Distillation, Mixture-of-Experts
- Test-Time Adaptation Toward Personalized Speech Enhancement: Zero-Shot Learning with Knowledge Distillation (May 8, 2021): Denoising, Knowledge Distillation
- Regression Bugs Are In Your Model! Measuring, Reducing and Analyzing Regressions In NLP Model Updates (May 7, 2021): Knowledge Distillation
- Initialization and Regularization of Factorized Neural Layers (May 3, 2021): Knowledge Distillation, Model Compression
- Black-Box Dissector: Towards Erasing-based Hard-Label Model Stealing Attack (May 3, 2021) [code available]: Knowledge Distillation, Self-Knowledge Distillation
- Contrastive Conditioning for Assessing Disambiguation in MT: A Case Study of Distilled Bias (May 1, 2021): Knowledge Distillation, Machine Translation
- Knowledge Distillation for Swedish NER models: A Search for Performance and Efficiency (May 1, 2021) [code available]: Knowledge Distillation, Model Compression
- A Peek Into the Reasoning of Neural Networks: Interpreting with Structural Visual Concepts (May 1, 2021): Explainable Artificial Intelligence, Knowledge Distillation
- Distilling EEG Representations via Capsules for Affective Computing (Apr 30, 2021): Electroencephalogram (EEG)
- Semantic Relation Preserving Knowledge Distillation for Image-to-Image Translation (Apr 30, 2021): Image-to-Image Translation, Knowledge Distillation
- Spirit Distillation: A Model Compression Method with Multi-domain Knowledge Transfer (Apr 29, 2021): General Knowledge, Knowledge Distillation
- LIDAR and Position-Aided mmWave Beam Selection with Non-local CNNs and Curriculum Training (Apr 29, 2021): Knowledge Distillation, Position
- Interpretable Embedding Procedure Knowledge Transfer via Stacked Principal Component Analysis and Graph Neural Network (Apr 28, 2021) [code available]: Graph Neural Network, Knowledge Distillation
- Open-vocabulary Object Detection via Vision and Language Knowledge Distillation (Apr 28, 2021) [code available]: Image Classification
- Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification (Apr 27, 2021) [code available]: General Classification
- Extract then Distill: Efficient and Effective Task-Agnostic BERT Distillation (Apr 24, 2021): Knowledge Distillation
- Distilling Audio-Visual Knowledge by Compositional Contrastive Learning (Apr 22, 2021): Audio Tagging, Audio-Visual Learning
- Relational Subsets Knowledge Distillation for Long-tailed Retinal Diseases Recognition (Apr 22, 2021) [code available]: Knowledge Distillation
- Voice2Mesh: Cross-Modal 3D Face Model Generation from Voices (Apr 21, 2021): Face Generation, Face Model
- Brittle Features May Help Anomaly Detection (Apr 21, 2021) [code available]: Anomaly Detection, Knowledge Distillation
- Orderly Dual-Teacher Knowledge Distillation for Lightweight Human Pose Estimation (Apr 21, 2021): Binarization, Knowledge Distillation
- Balanced Knowledge Distillation for Long-tailed Learning (Apr 21, 2021): Knowledge Distillation
- EduPal leaves no professor behind: Supporting faculty via a peer-powered recommender system (Apr 20, 2021) [code available]: Chatbot, Knowledge Distillation
- Distill on the Go: Online knowledge distillation in self-supervised learning (Apr 20, 2021): Knowledge Distillation, Self-Supervised Learning
- Knowledge Distillation as Semiparametric Inference (Apr 20, 2021) [code available]: Knowledge Distillation, Model Compression
- Compact CNN Structure Learning by Knowledge Distillation (Apr 19, 2021) [code available]: Knowledge Distillation, Model Compression
- Distilling Knowledge via Knowledge Review (Apr 19, 2021): Instance Segmentation, Knowledge Distillation
- On Learning the Geodesic Path for Incremental Learning (Apr 17, 2021) [code available]: Incremental Learning, Knowledge Distillation
- Ego-Exo: Transferring Visual Representations from Third-person to First-person Videos (Apr 16, 2021) [code available]: Activity Recognition, Diversity
- Counter-Interference Adapter for Multilingual Machine Translation (Apr 16, 2021) [code available]: Knowledge Distillation, Machine Translation
- Continual Learning for Fake Audio Detection (Apr 15, 2021) [code available]: Continual Learning, Knowledge Distillation
- Integration of Pre-trained Networks with Continuous Token Interface for End-to-End Spoken Language Understanding (Apr 15, 2021): Intent Classification
- Unsupervised Continual Learning Via Pseudo Labels (Apr 14, 2021): Clustering, Continual Learning
- Annealing Knowledge Distillation (Apr 14, 2021): Image Classification
- Sentence Embeddings by Ensemble Distillation (Apr 14, 2021) [code available]: Knowledge Distillation, Semantic Textual Similarity
- The Curious Case of Hallucinations in Neural Machine Translation (Apr 14, 2021): Hallucination, Knowledge Distillation
- RankDistil: Knowledge Distillation for Ranking (Apr 13, 2021) [code available]: Document Ranking, Knowledge Distillation
- Incremental Multi-Target Domain Adaptation for Object Detection with Efficient Domain Transfer (Apr 13, 2021): Domain Adaptation, Incremental Learning
- CXR Segmentation by AdaIN-based Domain Adaptation and Knowledge Distillation (Apr 13, 2021) [code available]: Domain Adaptation, Knowledge Distillation
- Dealing with Missing Modalities in the Visual Question Answer-Difference Prediction Task through Knowledge Distillation (Apr 13, 2021) [code available]: Knowledge Distillation, Triplet
- Source and Target Bidirectional Knowledge Distillation for End-to-end Speech Translation (Apr 13, 2021): Automatic Speech Recognition (ASR)