- Continual Learning for Class- and Domain-Incremental Semantic Segmentation (Sep 16, 2022). Tags: Class-Incremental Learning
- CES-KD: Curriculum-based Expert Selection for Guided Knowledge Distillation (Sep 15, 2022). Tags: Knowledge Distillation
- Layerwise Bregman Representation Learning with Applications to Knowledge Distillation (Sep 15, 2022). Tags: Knowledge Distillation, Representation Learning
- PlaStIL: Plastic and Stable Memory-Free Class-Incremental Learning (Sep 14, 2022). Tags: Class-Incremental Learning
- TASKED: Transformer-based Adversarial Learning for Human Activity Recognition Using Wearable Sensors via Self-KnowledgE Distillation (Sep 14, 2022). Tags: Human Activity Recognition
- Multi-stage Distillation Framework for Cross-Lingual Semantic Similarity Matching (Sep 13, 2022). Tags: Contrastive Learning, Knowledge Distillation
- Online Continual Learning via the Meta-learning Update with Multi-scale Knowledge Distillation and Data Augmentation (Sep 12, 2022). Tags: Continual Learning, Data Augmentation. [Code available]
- Preserving Privacy in Federated Learning with Ensemble Cross-Domain Knowledge Distillation (Sep 10, 2022). Tags: Federated Learning, Image Classification
- Selecting Related Knowledge via Efficient Channel Attention for Online Continual Learning (Sep 9, 2022). Tags: Continual Learning, Knowledge Distillation
- In-situ Animal Behavior Classification Using Knowledge Distillation and Fixed-Point Quantization (Sep 9, 2022). Tags: Classification, Knowledge Distillation
- Exploring Target Representations for Masked Autoencoders (Sep 8, 2022). Tags: Image Classification, Instance Segmentation
- ViTKD: Practical Guidelines for ViT Feature Knowledge Distillation (Sep 6, 2022). Tags: Image Classification, Knowledge Distillation. [Code available]
- A Novel Self-Knowledge Distillation Approach with Siamese Representation Learning for Action Recognition (Sep 3, 2022). Tags: Action Recognition, Knowledge Distillation
- Knowledge Distillation for Sustainable Neural Machine Translation (Sep 1, 2022). Tags: Knowledge Distillation, Machine Translation
- Dynamics-Adaptive Continual Reinforcement Learning via Progressive Contextualization (Sep 1, 2022). Tags: Bayesian Inference, Knowledge Distillation
- FAKD: Feature Augmented Knowledge Distillation for Semantic Segmentation (Aug 30, 2022). Tags: Knowledge Distillation, Segmentation
- Dynamic Data-Free Knowledge Distillation by Easy-to-Hard Learning Strategy (Aug 29, 2022). Tags: Data-Free Knowledge Distillation, Knowledge Distillation. [Code available]
- Removing Rain Streaks via Task Transfer Learning (Aug 28, 2022). Tags: Knowledge Distillation, Rain Removal. [Code available]
- Goal-Conditioned Q-Learning as Knowledge Distillation (Aug 28, 2022). Tags: Knowledge Distillation, Q-Learning
- Unsupervised Spike Depth Estimation via Cross-modality Cross-domain Knowledge Transfer (Aug 26, 2022). Tags: Autonomous Driving, Depth Estimation. [Code available]
- Dense Depth Distillation with Out-of-Distribution Simulated Images (Aug 26, 2022). Tags: Data-Free Knowledge Distillation, Depth Estimation. [Code available]
- Debias the Black-box: A Fair Ranking Framework via Knowledge Distillation (Aug 24, 2022). Tags: Fairness, Information Retrieval
- Lifelong Learning for Neural-powered Mixed Integer Programming (Aug 24, 2022). Tags: Graph Attention, Knowledge Distillation
- FS-BAN: Born-Again Networks for Domain Generalization Few-Shot Classification (Aug 23, 2022). Tags: Domain Generalization, Knowledge Distillation
- Multi-View Attention Transfer for Efficient Speech Enhancement (Aug 22, 2022). Tags: Knowledge Distillation, Speech Enhancement. [Code available]
- Rethinking Knowledge Distillation via Cross-Entropy (Aug 22, 2022). Tags: Knowledge Distillation
- RAIN: RegulArization on Input and Network for Black-Box Domain Adaptation (Aug 22, 2022). Tags: Data Augmentation, Domain Adaptation
- Combining Compressions for Multiplicative Size Scaling on Natural Language Tasks (Aug 20, 2022). Tags: Knowledge Distillation, Neural Network Compression. [Code available]
- Effectiveness of Function Matching in Driving Scene Recognition (Aug 20, 2022). Tags: Autonomous Driving, Image Classification
- Quantifying the Knowledge in a DNN to Explain Knowledge Distillation for Classification (Aug 18, 2022). Tags: 3D Point Cloud Classification, Classification
- Leukocyte Classification Using Multimodal Architecture Enhanced by Knowledge Distillation (Aug 17, 2022). Tags: Classification, Knowledge Distillation
- Progressive Cross-modal Knowledge Distillation for Human Action Recognition (Aug 17, 2022). Tags: Action Recognition, Knowledge Distillation
- Unsupervised Domain Adaptation for Segmentation with Black-box Source Model (Aug 16, 2022). Tags: Domain Adaptation, Knowledge Distillation
- RAWtoBit: A Fully End-to-end Camera ISP Network (Aug 16, 2022). Tags: Image Compression, Knowledge Distillation
- Enhancing Heterogeneous Federated Learning with Knowledge Extraction and Multi-Model Fusion (Aug 16, 2022). Tags: Federated Learning, Knowledge Distillation
- A Knowledge Distillation-Based Backdoor Attack in Federated Learning (Aug 12, 2022). Tags: Backdoor Attack, Federated Learning. [Code available]
- Non-Autoregressive Sign Language Production via Knowledge Distillation (Aug 12, 2022). Tags: Knowledge Distillation, Sign Language Production
- BEiT v2: Masked Image Modeling with Vector-Quantized Visual Tokenizers (Aug 12, 2022). Tags: Image Classification
- Self-Knowledge Distillation via Dropout (Aug 11, 2022). Tags: Adversarial Robustness, Image Classification. [Code available]
- SKDCGN: Source-free Knowledge Distillation of Counterfactual Generative Networks Using cGANs (Aug 8, 2022). Tags: Counterfactual, Knowledge Distillation
- Label Semantic Knowledge Distillation for Unbiased Scene Graph Generation (Aug 7, 2022). Tags: Graph Generation, Knowledge Distillation. [Code available]
- Study of Encoder-Decoder Architectures for Code-Mix Search Query Translation (Aug 7, 2022). Tags: Data Augmentation, Decoder
- PGX: A Multi-level GNN Explanation Framework Based on Separate Knowledge Distillation Processes (Aug 5, 2022). Tags: Knowledge Distillation, Representation Learning
- Task-Balanced Distillation for Object Detection (Aug 5, 2022). Tags: Classification, Knowledge Distillation
- Deep Semi-Supervised and Self-Supervised Learning for Diabetic Retinopathy Detection (Aug 4, 2022). Tags: Diabetic Retinopathy Detection, Knowledge Distillation
- Pose Uncertainty Aware Movement Synchrony Estimation via Spatial-Temporal Graph Transformer (Aug 1, 2022). Tags: Activity Recognition, Contrastive Learning
- SDBERT: SparseDistilBERT, a Faster and Smaller BERT Model (Jul 28, 2022). Tags: Knowledge Distillation
- NICEST: Noisy Label Correction and Training for Robust Scene Graph Generation (Jul 27, 2022). Tags: Graph Generation, Knowledge Distillation
- Exploring Generalizable Distillation for Efficient Medical Image Segmentation (Jul 26, 2022). Tags: Decoder, Image Segmentation
- Few-Shot Object Detection by Knowledge Distillation Using Bag-of-Visual-Words Representations (Jul 25, 2022). Tags: Few-Shot Object Detection, Knowledge Distillation. [Code available]