CLIP-CID: Efficient CLIP Distillation via Cluster-Instance Discrimination (Aug 18, 2024). Tags: Knowledge Distillation, Transfer Learning
[Unverified] MedMAP: Promoting Incomplete Multi-modal Brain Tumor Segmentation with Alignment (Aug 18, 2024). Tags: Brain Tumor Segmentation, Domain Adaptation
[Unverified] V2X-VLM: End-to-End V2X Cooperative Autonomous Driving Through Large Vision-Language Models (Aug 17, 2024). Tags: Autonomous Driving, Contrastive Learning
[Unverified] Multi Teacher Privileged Knowledge Distillation for Multimodal Expression Recognition (Aug 16, 2024). Tags: Emotion Recognition, Knowledge Distillation
[Code Available] MIDAS: Multi-level Intent, Domain, And Slot Knowledge Distillation for Multi-turn NLU (Aug 15, 2024). Tags: Domain Classification, Intent Detection
[Code Available] Towards Real-time Video Compressive Sensing on Mobile Devices (Aug 14, 2024). Tags: Compressive Sensing, Knowledge Distillation
[Code Available] One Step Diffusion-based Super-Resolution with Time-Aware Distillation (Aug 14, 2024). Tags: Image Super-Resolution, Knowledge Distillation
[Code Available] FedQUIT: On-Device Federated Unlearning via a Quasi-Competent Virtual Teacher (Aug 14, 2024). Tags: Federated Learning, Knowledge Distillation
[Unverified] Knowledge Distillation with Refined Logits (Aug 14, 2024). Tags: Knowledge Distillation, Model Compression
[Code Available] Using Advanced LLMs to Enhance Smaller LLMs: An Interpretable Knowledge Distillation Approach (Aug 13, 2024). Tags: Knowledge Distillation
[Unverified] Optimizing Vision Transformers with Data-Free Knowledge Transfer (Aug 12, 2024). Tags: Knowledge Distillation, Object Detection
[Unverified] Low-Dimensional Federated Knowledge Graph Embedding via Knowledge Distillation (Aug 11, 2024). Tags: Graph Embedding, Knowledge Distillation
[Unverified] LaDiMo: Layer-wise Distillation Inspired MoEfier (Aug 8, 2024). Tags: Knowledge Distillation, Mixture-of-Experts
[Unverified] ComKD-CLIP: Comprehensive Knowledge Distillation for Contrastive Language-Image Pre-traning Model (Aug 8, 2024). Tags: Contrastive Learning, Knowledge Distillation
[Unverified] Distillation Learning Guided by Image Reconstruction for One-Shot Medical Image Segmentation (Aug 7, 2024). Tags: Data Augmentation, Image Reconstruction
[Code Available] Real-time Event Recognition of Long-distance Distributed Vibration Sensing with Knowledge Distillation and Hardware Acceleration (Aug 7, 2024). Tags: GPU, Intrusion Detection
[Code Available] Dual-Modeling Decouple Distillation for Unsupervised Anomaly Detection (Aug 7, 2024). Tags: Anomaly Detection, Anomaly Localization
[Unverified] EEGMobile: Enhancing Speed and Accuracy in EEG-Based Gaze Prediction with Advanced Mobile Architectures (Aug 6, 2024). Tags: Brain Computer Interface, EEG
[Unverified] Leveraging Entity Information for Cross-Modality Correlation Learning: The Entity-Guided Multimodal Summarization (Aug 6, 2024). Tags: Knowledge Distillation, Language Modeling
[Code Available] Inference Optimizations for Large Language Models: Effects, Challenges, and Practical Considerations (Aug 6, 2024). Tags: Knowledge Distillation, Navigate
[Unverified] Comb, Prune, Distill: Towards Unified Pruning for Vision Model Compression (Aug 6, 2024). Tags: Image Classification
[Code Available] VizECGNet: Visual ECG Image Network for Cardiovascular Diseases Classification with Multi-Modal Training and Knowledge Distillation (Aug 6, 2024). Tags: ECG Classification, Knowledge Distillation
[Unverified] Low-Cost Self-Ensembles Based on Multi-Branch Transformation and Grouped Convolution (Aug 5, 2024). Tags: Classification, Diversity
[Code Available] An approach to optimize inference of the DIART speaker diarization pipeline (Aug 5, 2024). Tags: Inference Optimization, Knowledge Distillation
[Unverified] Unsupervised Domain Adaption Harnessing Vision-Language Pre-training (Aug 5, 2024). Tags: Domain Adaptation, Knowledge Distillation
[Code Available] Do You Remember... the Future? Weak-to-Strong generalization in 3D Object Detection (Aug 3, 2024). Tags: 3D Object Detection, Knowledge Distillation
[Code Available] Exploiting the Semantic Knowledge of Pre-trained Text-Encoders for Continual Learning (Aug 2, 2024). Tags: Continual Learning, Knowledge Distillation
[Code Available] DistillGrasp: Integrating Features Correlation with Knowledge Distillation for Depth Completion of Transparent Objects (Aug 1, 2024). Tags: Depth Completion, Feature Correlation
[Unverified] Sentence-wise Speech Summarization: Task, Datasets, and End-to-End Modeling with LM Knowledge Distillation (Aug 1, 2024). Tags: Automatic Speech Recognition (ASR)
[Unverified] StyleRF-VolVis: Style Transfer of Neural Radiance Fields for Expressive Volume Visualization (Jul 31, 2024). Tags: Knowledge Distillation, NeRF
[Unverified] Gemma 2: Improving Open Language Models at a Practical Size (Jul 31, 2024). Tags: Knowledge Distillation
[Unverified] Lifelong Person Search (Jul 31, 2024). Tags: Knowledge Distillation, Person Search
[Unverified] Dynamic Object Queries for Transformer-based Incremental Object Detection (Jul 31, 2024). Tags: Knowledge Distillation, Object
[Unverified] VIPeR: Visual Incremental Place Recognition with Adaptive Mining and Continual Learning (Jul 31, 2024). Tags: Continual Learning, Knowledge Distillation
[Unverified] Learning Effective Representations for Retrieval Using Self-Distillation with Adaptive Relevance Margins (Jul 31, 2024). Tags: Knowledge Distillation, Language Modeling
[Unverified] Pruning Large Language Models with Semi-Structural Adaptive Sparse Training (Jul 30, 2024). Tags: GPU, Knowledge Distillation
[Code Available] SalNAS: Efficient Saliency-prediction Neural Architecture Search with self-knowledge distillation (Jul 29, 2024). Tags: Decoder, Knowledge Distillation
[Code Available] ActivityCLIP: Enhancing Group Activity Recognition by Mining Complementary Information from Text to Supplement Image Modality (Jul 29, 2024). Tags: Activity Recognition, Group Activity Recognition
[Unverified] Overcoming Uncertain Incompleteness for Robust Multimodal Sequential Diagnosis Prediction via Curriculum Data Erasing Guided Knowledge Distillation (Jul 28, 2024). Tags: Knowledge Distillation, Sequential Diagnosis
[Code Available] Mixture of Modular Experts: Distilling Knowledge from a Multilingual Teacher into Specialized Modular Language Models (Jul 28, 2024). Tags: Knowledge Distillation, Mixture-of-Experts
[Code Available] LLAVADI: What Matters For Multimodal Large Language Models Distillation (Jul 28, 2024). Tags: Knowledge Distillation
[Unverified] Logic Distillation: Learning from Code Function by Function for Planning and Decision-making (Jul 28, 2024). Tags: Decision Making, Knowledge Distillation
[Unverified] Sewer Image Super-Resolution with Depth Priors and Its Lightweight Network (Jul 27, 2024). Tags: Computational Efficiency, Image Super-Resolution
[Unverified] Modality-Balanced Learning for Multimedia Recommendation (Jul 26, 2024). Tags: Collaborative Filtering, Counterfactual
[Code Available] Boosting Cross-Domain Point Classification via Distilling Relational Priors from 2D Transformers (Jul 26, 2024). Tags: Domain Adaptation, Domain Generalization
[Code Available] Towards A Generalizable Pathology Foundation Model via Unified Knowledge Distillation (Jul 26, 2024). Tags: Knowledge Distillation, Question Answering
[Code Available] FedUD: Exploiting Unaligned Data for Cross-Platform Federated Click-Through Rate Prediction (Jul 26, 2024). Tags: Click-Through Rate Prediction, Federated Learning
[Unverified] Leveraging Foundation Models via Knowledge Distillation in Multi-Object Tracking: Distilling DINOv2 Features to FairMOT (Jul 25, 2024). Tags: Knowledge Distillation, Multi-Object Tracking
[Code Available] How to Train the Teacher Model for Effective Knowledge Distillation (Jul 25, 2024). Tags: Knowledge Distillation
[Code Available] NC-NCD: Novel Class Discovery for Node Classification (Jul 25, 2024). Tags: Classification, Knowledge Distillation