Knowledge Inheritance for Pre-trained Language Models (May 28, 2021) [Domain Adaptation, Knowledge Distillation]
Selective Knowledge Distillation for Neural Machine Translation (May 27, 2021) [Knowledge Distillation, Machine Translation] (code available)
Honest-but-Curious Nets: Sensitive Attributes of Private Inputs Can Be Secretly Coded into the Classifiers' Outputs (May 25, 2021) [Attribute, Knowledge Distillation] (code available)
Backdoor Attacks on Self-Supervised Learning (May 21, 2021) [Backdoor Attack, Inductive Bias] (code available)
Intra-Document Cascading: Learning to Select Passages for Neural Document Ranking (May 20, 2021) [Document Ranking, Knowledge Distillation] (code available)
Data-Free Knowledge Distillation for Heterogeneous Federated Learning (May 20, 2021) [Data-free Knowledge Distillation, Federated Learning] (code available)
Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation (May 19, 2021) [Image Classification, Knowledge Distillation] (code available)
Contrastive Model Inversion for Data-Free Knowledge Distillation (May 18, 2021) [Contrastive Learning, Data-free Knowledge Distillation] (code available)
Graph-Free Knowledge Distillation for Graph Neural Networks (May 16, 2021) [Knowledge Distillation, Transfer Learning] (code available)
Undistillable: Making A Nasty Teacher That CANNOT teach students (May 16, 2021) [Knowledge Distillation] (code available)
AgeFlow: Conditional Age Progression and Regression with Normalizing Flows (May 15, 2021) [Attribute, Knowledge Distillation] (code available)
Boosting Light-Weight Depth Estimation Via Knowledge Distillation (May 13, 2021) [Computational Efficiency, Depth Estimation] (code available)
When Human Pose Estimation Meets Robustness: Adversarial Algorithms and Benchmarks (May 13, 2021) [Knowledge Distillation, Pose Estimation] (code available)
MATE-KD: Masked Adversarial TExt, a Companion to Knowledge Distillation (May 12, 2021) [Adversarial Text, Data Augmentation] (code available)
Initialization and Regularization of Factorized Neural Layers (May 3, 2021) [Knowledge Distillation, Model Compression] (code available)
Open-vocabulary Object Detection via Vision and Language Knowledge Distillation (Apr 28, 2021) [Image Classification] (code available)
Distilling Audio-Visual Knowledge by Compositional Contrastive Learning (Apr 22, 2021) [Audio Tagging, Audio-Visual Learning] (code available)
Balanced Knowledge Distillation for Long-tailed Learning (Apr 21, 2021) [Knowledge Distillation] (code available)
Voice2Mesh: Cross-Modal 3D Face Model Generation from Voices (Apr 21, 2021) [Face Generation, Face Model] (code available)
Distill on the Go: Online knowledge distillation in self-supervised learning (Apr 20, 2021) [Knowledge Distillation, Self-Supervised Learning] (code available)
Distilling Knowledge via Knowledge Review (Apr 19, 2021) [Instance Segmentation, Knowledge Distillation] (code available)
On Learning the Geodesic Path for Incremental Learning (Apr 17, 2021) [Incremental Learning, Knowledge Distillation] (code available)
Ego-Exo: Transferring Visual Representations from Third-person to First-person Videos (Apr 16, 2021) [Activity Recognition, Diversity] (code available)
Counter-Interference Adapter for Multilingual Machine Translation (Apr 16, 2021) [Knowledge Distillation, Machine Translation] (code available)
Incremental Multi-Target Domain Adaptation for Object Detection with Efficient Domain Transfer (Apr 13, 2021) [Domain Adaptation, Incremental Learning] (code available)
Class-Balanced Distillation for Long-Tailed Visual Recognition (Apr 12, 2021) [Image Classification, Knowledge Distillation] (code available)
Content-Aware GAN Compression (Apr 6, 2021) [Image Generation, Image Manipulation] (code available)
HAD-Net: A Hierarchical Adversarial Knowledge Distillation Network for Improved Enhanced Tumour Segmentation Without Post-Contrast Images (Mar 30, 2021) [Knowledge Distillation, Segmentation] (code available)
Complementary Relation Contrastive Distillation (Mar 29, 2021) [Knowledge Distillation, Relation] (code available)
Embedding Transfer with Label Relaxation for Improved Metric Learning (Mar 27, 2021) [Knowledge Distillation, Metric Learning] (code available)
Multimodal Knowledge Expansion (Mar 26, 2021) [Denoising, Knowledge Distillation] (code available)
Distilling Object Detectors via Decoupled Features (Mar 26, 2021) [Image Classification] (code available)
Distilling a Powerful Student Model via Online Knowledge Distillation (Mar 26, 2021) [Knowledge Distillation] (code available)
Pruning-then-Expanding Model for Domain Adaptation of Neural Machine Translation (Mar 25, 2021) [Domain Adaptation, Knowledge Distillation] (code available)
ROSITA: Refined BERT cOmpreSsion with InTegrAted techniques (Mar 21, 2021) [Knowledge Distillation] (code available)
Self-Supervised Adaptation for Video Super-Resolution (Mar 18, 2021) [Image Super-Resolution, Knowledge Distillation] (code available)
Human-Inspired Multi-Agent Navigation using Knowledge Distillation (Mar 18, 2021) [Collision Avoidance, Knowledge Distillation] (code available)
Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation (Mar 15, 2021) [Data Augmentation, Knowledge Distillation] (code available)
Beyond Self-Supervision: A Simple Yet Effective Network Distillation Alternative to Improve Backbones (Mar 10, 2021) [Knowledge Distillation, Object Detection] (code available)
Parser-Free Virtual Try-on via Distilling Appearance Flows (Mar 8, 2021) [Human Parsing, Knowledge Distillation] (code available)
Adaptive Multi-Teacher Multi-level Knowledge Distillation (Mar 6, 2021) [Knowledge Distillation] (code available)
Distributed Dynamic Map Fusion via Federated Learning for Intelligent Networked Vehicles (Mar 5, 2021) [Federated Learning, Knowledge Distillation] (code available)
Teachers Do More Than Teach: Compressing Image-to-Image Models (Mar 5, 2021) [Knowledge Distillation] (code available)
Extract the Knowledge of Graph Neural Networks and Go Beyond it: An Effective Knowledge Distillation Framework (Mar 4, 2021) [Knowledge Distillation, Node Classification] (code available)
General Instance Distillation for Object Detection (Mar 3, 2021) [Knowledge Distillation, Model Compression] (code available)
Exploring Complementary Strengths of Invariant and Equivariant Representations for Few-Shot Learning (Mar 1, 2021) [Few-Shot Image Classification, Few-Shot Learning] (code available)
Distilling Knowledge via Intermediate Classifiers (Feb 28, 2021) [Knowledge Distillation, Transfer Learning] (code available)
Training Generative Adversarial Networks in One Stage (Feb 28, 2021) [Data-free Knowledge Distillation, Image Generation] (code available)
Even your Teacher Needs Guidance: Ground-Truth Targets Dampen Regularization Imposed by Self-Distillation (Feb 25, 2021) [Knowledge Distillation, Self-Knowledge Distillation] (code available)
Localization Distillation for Dense Object Detection (Feb 24, 2021) [Dense Object Detection, Knowledge Distillation] (code available)
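Nearly every entry above builds on the classic distillation objective. As a hedged reference point only (this sketch is not taken from any paper listed here; the function name and the hyperparameters `T` and `alpha` are illustrative defaults), a minimal PyTorch version of the temperature-scaled loss from Hinton et al. (2015) looks like:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.9):
    """Weighted sum of a soft-target KL term (at temperature T) and the
    usual cross-entropy on ground-truth labels. T and alpha are illustrative."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradient magnitude is independent of T
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard

# usage: student/teacher logits of shape (batch, classes)
s = torch.randn(8, 10, requires_grad=True)
t = torch.randn(8, 10)
y = torch.randint(0, 10, (8,))
loss = distillation_loss(s, t, y)
```

Both terms are non-negative, so the combined loss is a non-negative scalar that can be backpropagated through the student alone.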