Technical Report for ICCV 2021 Challenge SSLAD-Track3B: Transformers Are Better Continual Learners. Jan 13, 2022. Tags: Continual Learning, Knowledge Distillation.
On Exploring Pose Estimation as an Auxiliary Learning Task for Visible-Infrared Person Re-identification. Jan 11, 2022. Tags: Auxiliary Learning, Knowledge Distillation.
MobileFaceSwap: A Lightweight Framework for Video Face Swapping. Jan 11, 2022. Tags: Face Swapping, Knowledge Distillation. Code available.
FedDTG: Federated Data-Free Knowledge Distillation via Three-Player Generative Adversarial Networks. Jan 10, 2022. Tags: Data-free Knowledge Distillation, Federated Learning. Code available.
Robust and Resource-Efficient Data-Free Knowledge Distillation by Generative Pseudo Replay. Jan 9, 2022. Tags: Data-free Knowledge Distillation, Image Classification.
Two-Pass End-to-End ASR Model Compression. Jan 8, 2022. Tags: Decoder, Knowledge Distillation. Code available.
Microdosing: Knowledge Distillation for GAN based Compression. Jan 7, 2022. Tags: Knowledge Distillation, Video Compression.
Which Student is Best? A Comprehensive Knowledge Distillation Exam for Task-Specific BERT Models. Jan 3, 2022. Tags: CPU, Data Augmentation.
Class-Incremental Continual Learning into the eXtended DER-verse. Jan 3, 2022. Tags: Continual Learning, Knowledge Distillation.
Class Similarity Weighted Knowledge Distillation for Continual Semantic Segmentation. Jan 1, 2022. Tags: Continual Learning, Continual Semantic Segmentation.
Multi-Objective Diverse Human Motion Prediction With Knowledge Distillation. Jan 1, 2022. Tags: Autonomous Driving, Diversity.
Learn From Others and Be Yourself in Heterogeneous Federated Learning. Jan 1, 2022. Tags: Continual Learning, Federated Learning.
Performance-Aware Mutual Knowledge Distillation for Improving Neural Architecture Search. Jan 1, 2022. Tags: Knowledge Distillation, Neural Architecture Search. Code available.
Improving Video Model Transfer With Dynamic Representation Learning. Jan 1, 2022. Tags: Action Classification, Knowledge Distillation.
Distillation Using Oracle Queries for Transformer-Based Human-Object Interaction Detection. Jan 1, 2022. Tags: Data Augmentation, Decoder.
Image Restoration using Feature-guidance. Jan 1, 2022. Tags: Image Restoration, Knowledge Distillation.
Role of Data Augmentation Strategies in Knowledge Distillation for Wearable Sensor Data. Jan 1, 2022. Tags: Data Augmentation, Knowledge Distillation.
Conditional Generative Data-free Knowledge Distillation. Dec 31, 2021. Tags: Conditional Image Generation, Data-free Knowledge Distillation. Code available.
Data-Free Knowledge Transfer: A Survey. Dec 31, 2021. Tags: Data-free Knowledge Distillation, Domain Adaptation.
Confidence-Aware Multi-Teacher Knowledge Distillation. Dec 30, 2021. Tags: Knowledge Distillation, Transfer Learning.
An Efficient Federated Distillation Learning System for Multi-task Time Series Classification. Dec 30, 2021. Tags: Knowledge Distillation, Time Series. Code available.
Automatic Mixed-Precision Quantization Search of BERT. Dec 30, 2021. Tags: Knowledge Distillation, Model Compression.
Online Adversarial Knowledge Distillation for Graph Neural Networks. Dec 28, 2021. Tags: Knowledge Distillation.
Distilling the Knowledge of Romanian BERTs Using Multiple Teachers. Dec 23, 2021. Tags: Dialect Identification, GPU. Code available.
Adaptive Beam Search to Enhance On-device Abstractive Summarization. Dec 22, 2021. Tags: Abstractive Text Summarization, Knowledge Distillation. Code available.
Self-Distillation Mixup Training for Non-autoregressive Neural Machine Translation. Dec 22, 2021. Tags: Knowledge Distillation, Machine Translation.
Multi-Modality Distillation via Learning the teacher's modality-level Gram Matrix. Dec 21, 2021. Tags: Knowledge Distillation.
Supervised Graph Contrastive Pretraining for Text Classification. Dec 21, 2021. Tags: Classification, Contrastive Learning.
Deep Graph-level Anomaly Detection by Glocal Knowledge Distillation. Dec 19, 2021. Tags: Anomaly Detection, Knowledge Distillation.
Controlling the Quality of Distillation in Response-Based Network Compression. Dec 19, 2021. Tags: Knowledge Distillation. Code available.
LegoDNN: Block-grained Scaling of Deep Neural Networks for Mobile Vision. Dec 18, 2021. Tags: Knowledge Distillation, Model Compression.
Distill and De-bias: Mitigating Bias in Face Verification using Knowledge Distillation. Dec 17, 2021. Tags: Attribute, Face Recognition.
Knowledge Distillation Improves Stability in Retranslation-based Simultaneous Translation. Dec 17, 2021. Tags: Knowledge Distillation, Translation.
Towards Disturbance-Free Visual Mobile Manipulation. Dec 17, 2021. Tags: Collision Avoidance, Deep Reinforcement Learning.
Pixel Distillation: A New Knowledge Distillation Scheme for Low-Resolution Image Recognition. Dec 17, 2021. Tags: Image Classification. Code available.
Data Efficient Language-supervised Zero-shot Recognition with Optimal Transport Distillation. Dec 17, 2021. Tags: Contrastive Learning, Knowledge Distillation. Code available.
Distillation of Human-Object Interaction Contexts for Action Recognition. Dec 17, 2021. Tags: Action Recognition, Graph Attention. Code available.
Weakly Supervised Semantic Segmentation via Alternative Self-Dual Teaching. Dec 17, 2021. Tags: Knowledge Distillation, Semantic Segmentation.
Amortized Noisy Channel Neural Machine Translation. Dec 16, 2021. Tags: Imitation Learning, Knowledge Distillation.
Learning Cross-Lingual IR from an English Retriever. Dec 15, 2021. Tags: Cross-Lingual Information Retrieval, Information Retrieval.
On the Use of External Data for Spoken Named Entity Recognition. Dec 14, 2021. Tags: Knowledge Distillation, Named Entity Recognition. Code available.
Towards a Unified Foundation Model: Jointly Pre-Training Transformers on Unpaired Images and Text. Dec 14, 2021. Tags: Image Classification. Code available.
A Deep Knowledge Distillation framework for EEG assisted enhancement of single-lead ECG based sleep staging. Dec 14, 2021. Tags: ECG based Sleep Staging, EEG.
Lifelong Unsupervised Domain Adaptive Person Re-identification with Coordinated Anti-forgetting and Adaptation. Dec 13, 2021. Tags: Domain Adaptive Person Re-Identification, Knowledge Distillation. Code available.
Improving Sequential Recommendations via Bidirectional Temporal Data Augmentation with Pre-training. Dec 13, 2021. Tags: Data Augmentation, Knowledge Distillation.
Up to 100× Faster Data-free Knowledge Distillation. Dec 12, 2021. Tags: Data-free Knowledge Distillation, Knowledge Distillation. Code available.
DistilCSE: Effective Knowledge Distillation For Contrastive Sentence Embeddings. Dec 10, 2021. Tags: Contrastive Learning, Knowledge Distillation. Code available.
Human Guided Exploitation of Interpretable Attention Patterns in Summarization and Topic Segmentation. Dec 10, 2021. Tags: Extractive Summarization, Knowledge Distillation. Code available.
Mask-invariant Face Recognition through Template-level Knowledge Distillation. Dec 10, 2021. Tags: Face Recognition, Knowledge Distillation. Code available.
Mutual Adversarial Training: Learning together is better than going alone. Dec 9, 2021. Tags: Knowledge Distillation. Code available.
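The common thread across the entries above is knowledge distillation, most often in its response-based form. As a point of reference only, the sketch below shows the classic softened-logit objective (Hinton et al., 2015) that many of these papers build on; it is a generic illustration, not the method of any listed paper, and all names and hyperparameters (T, alpha, the toy teacher/student networks) are illustrative assumptions.

```python
# Minimal sketch of response-based knowledge distillation (Hinton et al., 2015).
# Names, architectures, and hyperparameters here are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Blend hard-label cross-entropy with a softened teacher-matching KL term."""
    # Soft targets: KL divergence between temperature-scaled distributions.
    # The T*T factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy usage: a small student mimicking a larger, frozen teacher on random data.
teacher = nn.Sequential(nn.Linear(32, 128), nn.ReLU(), nn.Linear(128, 10)).eval()
student = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 10))
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

x = torch.randn(64, 32)
y = torch.randint(0, 10, (64,))
with torch.no_grad():
    t_logits = teacher(x)

optimizer.zero_grad()
loss = distillation_loss(student(x), t_logits, y)
loss.backward()
optimizer.step()
```

Variants in the list above replace or supplement this objective, for example by synthesizing the inputs x instead of using real data (data-free distillation), by using several teachers, or by distilling intermediate features rather than logits.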