IOR: Inversed Objects Replay for Incremental Object Detection Jun 7, 2024 Knowledge Distillation Object
— Unverified 0To Distill or Not to Distill? On the Robustness of Robust Knowledge Distillation Jun 6, 2024 Automatic Speech Recognition Automatic Speech Recognition (ASR)
Code Code Available 0LenslessFace: An End-to-End Optimized Lensless System for Privacy-Preserving Face Verification Jun 6, 2024 Face Detection Face Verification
Code Code Available 1Step Out and Seek Around: On Warm-Start Training with Incremental Data Jun 6, 2024 Autonomous Driving Knowledge Distillation
— Unverified 0Mutual Information Guided Backdoor Mitigation for Pre-trained Encoders Jun 5, 2024 Knowledge Distillation Self-Supervised Learning
— Unverified 0Decision Boundary-aware Knowledge Consolidation Generates Better Instance-Incremental Learner Jun 5, 2024 class-incremental learning Class Incremental Learning
— Unverified 0Tiny models from tiny data: Textual and null-text inversion for few-shot distillation Jun 5, 2024 Few-Shot Image Classification image-classification
Code Code Available 0PLaD: Preference-based Large Language Model Distillation with Pseudo-Preference Pairs Jun 5, 2024 Knowledge Distillation Language Modeling
— Unverified 0Adversarial Moment-Matching Distillation of Large Language Models Jun 5, 2024 Imitation Learning Instruction Following
Code Code Available 0Multi-Task Multi-Scale Contrastive Knowledge Distillation for Efficient Medical Image Segmentation Jun 5, 2024 Contrastive Learning Image Segmentation
Code Code Available 1Optimal Transport Guided Correlation Assignment for Multimodal Entity Linking Jun 4, 2024 Entity Linking Knowledge Distillation
Code Code Available 0RKLD: Reverse KL-Divergence-based Knowledge Distillation for Unlearning Personal Information in Large Language Models Jun 4, 2024 Knowledge Distillation Language Modeling
— Unverified 0DL-KDD: Dual-Light Knowledge Distillation for Action Recognition in the Dark Jun 4, 2024 Action Recognition Knowledge Distillation
— Unverified 0Toward Efficient Deep Spiking Neuron Networks:A Survey On Compression Jun 3, 2024 Knowledge Distillation Quantization
— Unverified 0Decoupled Alignment for Robust Plug-and-Play Adaptation Jun 3, 2024 Knowledge Distillation
— Unverified 0Robust Knowledge Distillation Based on Feature Variance Against Backdoored Teacher Model Jun 1, 2024 Knowledge Distillation Model Compression
Code Code Available 0Learning Background Prompts to Discover Implicit Knowledge for Open Vocabulary Object Detection Jun 1, 2024 Knowledge Distillation Object
— Unverified 0Multi-label Class Incremental Emotion Decoding with Augmented Emotional Semantics Learning May 31, 2024 class-incremental learning Class Incremental Learning
— Unverified 0Vision-Language Meets the Skeleton: Progressively Distillation with Cross-Modal Knowledge for 3D Action Representation Learning May 31, 2024 Action Recognition Contrastive Learning
Code Code Available 0Adv-KD: Adversarial Knowledge Distillation for Faster Diffusion Sampling May 31, 2024 Denoising Image Generation
Code Code Available 0GKT: A Novel Guidance-Based Knowledge Transfer Framework For Efficient Cloud-edge Collaboration LLM Deployment May 30, 2024 GSM8K Knowledge Distillation
Code Code Available 0Distribution Aligned Semantics Adaption for Lifelong Person Re-Identification May 30, 2024 Knowledge Distillation Person Re-Identification
Code Code Available 0Scalable Detection of Salient Entities in News Articles May 30, 2024 Articles Knowledge Distillation
— Unverified 0Relation Modeling and Distillation for Learning with Noisy Labels May 30, 2024 Contrastive Learning Knowledge Distillation
— Unverified 0Improving the Training of Rectified Flows May 30, 2024 Image Generation Knowledge Distillation
Code Code Available 2Estimating Human Poses Across Datasets: A Unified Skeleton and Multi-Teacher Distillation Approach May 30, 2024 Activity Recognition Knowledge Distillation
— Unverified 0WebUOT-1M: Advancing Deep Underwater Object Tracking with A Million-Scale Benchmark May 30, 2024 Knowledge Distillation Object Tracking
— Unverified 0BLSP-KD: Bootstrapping Language-Speech Pre-training via Knowledge Distillation May 29, 2024 Instruction Following Knowledge Distillation
— Unverified 0Forward-Backward Knowledge Distillation for Continual Clustering May 29, 2024 Clustering Continual Learning
— Unverified 0Continual Collaborative Distillation for Recommender System May 29, 2024 Knowledge Distillation Recommendation Systems
Code Code Available 1Aligning in a Compact Space: Contrastive Knowledge Distillation between Heterogeneous Architectures May 28, 2024 Contrastive Learning Knowledge Distillation
— Unverified 0SLMRec: Distilling Large Language Models into Small for Sequential Recommendation May 28, 2024 Knowledge Distillation Language Modeling
Code Code Available 1P4: Towards private, personalized, and Peer-to-Peer learning May 27, 2024 Knowledge Distillation
— Unverified 0TIMA: Text-Image Mutual Awareness for Balancing Zero-Shot Adversarial Robustness and Generalization Ability May 27, 2024 Adversarial Robustness Knowledge Distillation
— Unverified 0LoReTrack: Efficient and Accurate Low-Resolution Transformer Tracking May 27, 2024 CPU Knowledge Distillation
Code Code Available 1UniCompress: Enhancing Multi-Data Medical Image Compression with Knowledge Distillation May 27, 2024 Image Compression Knowledge Distillation
— Unverified 0Noisy Data Meets Privacy: Training Local Models with Post-Processed Remote Queries May 25, 2024 Knowledge Distillation Model extraction
— Unverified 0Rethinking Early-Fusion Strategies for Improved Multispectral Object Detection May 25, 2024 Knowledge Distillation Multispectral Object Detection
Code Code Available 1A Classifier-Free Incremental Learning Framework for Scalable Medical Image Segmentation May 25, 2024 Contrastive Learning Image Segmentation
— Unverified 0Harnessing Increased Client Participation with Cohort-Parallel Federated Learning May 24, 2024 Federated Learning image-classification
— Unverified 0Leveraging knowledge distillation for partial multi-task learning from multiple remote sensing datasets May 24, 2024 Knowledge Distillation Multi-Task Learning
Code Code Available 03D Annotation-Free Learning by Distilling 2D Open-Vocabulary Segmentation Models for Autonomous Driving May 24, 2024 Autonomous Driving Knowledge Distillation
Code Code Available 1Pre-Trained Vision-Language Models as Partial Annotators May 23, 2024 Contrastive Learning image-classification
— Unverified 0Recurrent Early Exits for Federated Learning with Heterogeneous Clients May 23, 2024 Federated Learning Knowledge Distillation
Code Code Available 1JiuZhang3.0: Efficiently Improving Mathematical Reasoning by Training Small Data Synthesis Models May 23, 2024 Knowledge Distillation Math
Code Code Available 1Awesome Multi-modal Object Tracking May 23, 2024 Autonomous Driving Knowledge Distillation
Code Code Available 5Efficient Multitask Dense Predictor via Binarization May 23, 2024 Binarization Knowledge Distillation
Code Code Available 0AdaGMLP: AdaBoosting GNN-to-MLP Knowledge Distillation May 23, 2024 Knowledge Distillation
Code Code Available 0Data-Free Federated Class Incremental Learning with Diffusion-Based Generative Memory May 22, 2024 class-incremental learning Class Incremental Learning
— Unverified 0Joint Optimization of Streaming and Non-Streaming Automatic Speech Recognition with Multi-Decoder and Knowledge Distillation May 22, 2024 Automatic Speech Recognition Automatic Speech Recognition (ASR)
— Unverified 0