SMOC-Net: Leveraging Camera Pose for Self-Supervised Monocular Object Pose Estimation (Jan 1, 2023). Tags: 6D Pose Estimation using RGB, Knowledge Distillation.
Distilling Cross-Temporal Contexts for Continuous Sign Language Recognition (Jan 1, 2023). Tags: Knowledge Distillation, Sign Language Recognition.
X3KD: Knowledge Distillation Across Modalities, Tasks and Stages for Multi-Camera 3D Object Detection (Jan 1, 2023). Tags: 3D Object Detection, Instance Segmentation.
Bilateral Memory Consolidation for Continual Learning (Jan 1, 2023). Tags: Continual Learning, Knowledge Distillation.
Incrementer: Transformer for Class-Incremental Semantic Segmentation With Knowledge Distillation Focusing on Old Class (Jan 1, 2023). Tags: Class-Incremental Semantic Segmentation, Decoder.
Active Exploration of Multimodal Complementarity for Few-Shot Action Recognition (Jan 1, 2023). Tags: Action Recognition, Few-Shot Action Recognition.
Revisiting Prototypical Network for Cross Domain Few-Shot Learning (Jan 1, 2023). Tags: Cross-Domain Few-Shot Learning.
Few-Shot Class-Incremental Learning via Class-Aware Bilateral Distillation (Jan 1, 2023) [Code]. Tags: Class-Incremental Learning.
CaPriDe Learning: Confidential and Private Decentralized Learning Based on Encryption-Friendly Distillation Loss (Jan 1, 2023) [Code]. Tags: Federated Learning, Knowledge Distillation.
You Do Not Need Additional Priors or Regularizers in Retinex-Based Low-Light Image Enhancement (Jan 1, 2023) [Code]. Tags: Contrastive Learning, Image Enhancement.
Open-Set Fine-Grained Retrieval via Prompting Vision-Language Evaluator (Jan 1, 2023). Tags: Knowledge Distillation, Retrieval.
MEDIC: Remove Model Backdoors via Importance Driven Cloning (Jan 1, 2023). Tags: Knowledge Distillation, Model.
Rethinking Feature-Based Knowledge Distillation for Face Recognition (Jan 1, 2023). Tags: Face Recognition, GPU.
DaFKD: Domain-Aware Federated Knowledge Distillation (Jan 1, 2023). Tags: Knowledge Distillation.
CLIPPING: Distilling CLIP-Based Models With a Student Base for Video-Language Retrieval (Jan 1, 2023). Tags: Knowledge Distillation, Language Modelling.
Distilling Focal Knowledge From Imperfect Expert for 3D Object Detection (Jan 1, 2023). Tags: 3D Geometry, 3D Object Detection.
Endpoints Weight Fusion for Class Incremental Semantic Segmentation (Jan 1, 2023) [Code]. Tags: Class-Incremental Learning.
Boosting Accuracy and Robustness of Student Models via Adaptive Adversarial Distillation (Jan 1, 2023). Tags: Adversarial Robustness, Knowledge Distillation.
FedICT: Federated Multi-task Distillation for Multi-access Edge Computing (Jan 1, 2023). Tags: Edge Computing, Federated Learning.
Discriminator-Cooperated Feature Map Distillation for GAN Compression (Dec 29, 2022) [Code]. Tags: Image Generation, Knowledge Distillation.
Resolving Task Confusion in Dynamic Expansion Architectures for Class Incremental Learning (Dec 29, 2022) [Code]. Tags: Class-Incremental Learning.
A Unified Object Counting Network with Object Occupation Prior (Dec 29, 2022) [Code]. Tags: Crowd Counting, Knowledge Distillation.
NeRN -- Learning Neural Representations for Neural Networks (Dec 27, 2022) [Code]. Tags: Knowledge Distillation.
Prototype-guided Cross-task Knowledge Distillation for Large-scale Models (Dec 26, 2022) [Code]. Tags: Knowledge Distillation.
BD-KD: Balancing the Divergences for Online Knowledge Distillation (Dec 25, 2022) [Code]. Tags: Knowledge Distillation, Model Compression.
CAMeMBERT: Cascading Assistant-Mediated Multilingual BERT (Dec 22, 2022). Tags: Knowledge Distillation.
UNIKD: UNcertainty-filtered Incremental Knowledge Distillation for Neural Implicit Representation (Dec 21, 2022). Tags: 3D Reconstruction, Incremental Learning.
RangeAugment: Efficient Online Augmentation with Range Learning (Dec 20, 2022) [Code]. Tags: Knowledge Distillation, Object Detection.
Fine-Grained Distillation for Long Document Retrieval (Dec 20, 2022). Tags: Knowledge Distillation, Retrieval.
Diffusion Glancing Transformer for Parallel Sequence to Sequence Learning (Dec 20, 2022). Tags: Knowledge Distillation, Machine Translation.
Adam: Dense Retrieval Distillation with Adaptive Dark Examples (Dec 20, 2022). Tags: Knowledge Distillation, Retrieval.
Multi-View Knowledge Distillation from Crowd Annotations for Out-of-Domain Generalization (Dec 19, 2022). Tags: Domain Generalization, Knowledge Distillation.
I2D2: Inductive Knowledge Distillation with NeuroLogic and Self-Imitation (Dec 19, 2022). Tags: Imitation Learning, Knowledge Distillation.
KNIFE: Distilling Reasoning Knowledge From Free-Text Rationales (Dec 19, 2022). Tags: Knowledge Distillation, Language Modelling.
Learning Object-level Point Augmentor for Semi-supervised 3D Object Detection (Dec 19, 2022). Tags: 3D Object Detection, Knowledge Distillation.
Continual Knowledge Distillation for Neural Machine Translation (Dec 18, 2022) [Code]. Tags: Knowledge Distillation, Machine Translation.
3D Point Cloud Pre-training with Knowledge Distillation from 2D Images (Dec 17, 2022) [Code]. Tags: Concept Alignment, Knowledge Distillation.
Teaching Small Language Models to Reason (Dec 16, 2022). Tags: GSM8K, Knowledge Distillation.
Swing Distillation: A Privacy-Preserving Knowledge Distillation Framework (Dec 16, 2022). Tags: Knowledge Distillation, Model Compression.
Autoencoders as Cross-Modal Teachers: Can Pretrained 2D Image Transformers Help 3D Representation Learning? (Dec 16, 2022). Tags: 3D Point Cloud Classification, Few-Shot 3D Point Cloud Classification.
Gradient-based Intra-attention Pruning on Pre-trained Language Models (Dec 15, 2022) [Code]. Tags: Knowledge Distillation.
Hybrid Paradigm-based Brain-Computer Interface for Robotic Arm Control (Dec 14, 2022) [Code]. Tags: Brain-Computer Interface, EEG.
Domain Adaptation for Dense Retrieval through Self-Supervision by Pseudo-Relevance Labeling (Dec 13, 2022). Tags: Domain Adaptation, Information Retrieval.
Siamese Sleep Transformer For Robust Sleep Stage Scoring With Self-knowledge Distillation and Selective Batch Sampling (Dec 12, 2022). Tags: Knowledge Distillation, Self-Knowledge Distillation.
Multimodal Matching-aware Co-attention Networks with Mutual Knowledge Distillation for Fake News Detection (Dec 12, 2022). Tags: Fake News Detection, Image-Text Matching.
Continuation KD: Improved Knowledge Distillation through the Lens of Continuation Optimization (Dec 12, 2022). Tags: Knowledge Distillation, Natural Language Understanding.
Towards Practical Plug-and-Play Diffusion Models (Dec 12, 2022). Tags: Depth Estimation, Image Generation.
Improving Generalization of Pre-trained Language Models via Stochastic Weight Averaging (Dec 12, 2022) [Code]. Tags: Knowledge Distillation, Question Answering.
Multi-adversarial Faster-RCNN with Paradigm Teacher for Unrestricted Object Detection (Dec 11, 2022). Tags: Domain Adaptation, Knowledge Distillation.
Teaching What You Should Teach: A Data-Based Distillation Method (Dec 11, 2022). Tags: Data Augmentation, Knowledge Distillation.
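Nearly every entry above builds on the standard knowledge-distillation objective: the student is trained on a mix of temperature-softened teacher probabilities and hard ground-truth labels. As background, here is a minimal pure-Python sketch of that loss; the function names, temperature T, and mixing weight alpha are illustrative choices, not taken from any paper listed above.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T produces softer distributions.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, label, T=2.0, alpha=0.5):
    """Hinton-style distillation loss (sketch).

    KL(teacher || student) at temperature T, scaled by T^2 so its
    gradient magnitude matches the cross-entropy term, mixed with
    hard-label cross-entropy. T and alpha are hypothetical values.
    """
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    kl = sum(pt * math.log(pt / ps)
             for pt, ps in zip(p_teacher, p_student)) * T * T
    ce = -math.log(softmax(student_logits)[label])
    return alpha * kl + (1 - alpha) * ce
```

When the student's logits match the teacher's exactly, the KL term vanishes and only the hard-label cross-entropy remains; variants in the list above (online, self-, federated, or cross-task distillation) mostly change where the teacher signal comes from, not this basic form.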