- Overcoming Uncertain Incompleteness for Robust Multimodal Sequential Diagnosis Prediction via Curriculum Data Erasing Guided Knowledge Distillation (Jul 28, 2024) [Code available] Tags: Knowledge Distillation, Sequential Diagnosis
- Sewer Image Super-Resolution with Depth Priors and Its Lightweight Network (Jul 27, 2024) [Unverified] Tags: Computational Efficiency, Image Super-Resolution
- Boosting Cross-Domain Point Classification via Distilling Relational Priors from 2D Transformers (Jul 26, 2024) [Code available] Tags: Domain Adaptation, Domain Generalization
- FedUD: Exploiting Unaligned Data for Cross-Platform Federated Click-Through Rate Prediction (Jul 26, 2024) [Unverified] Tags: Click-Through Rate Prediction, Federated Learning
- Leveraging Foundation Models via Knowledge Distillation in Multi-Object Tracking: Distilling DINOv2 Features to FairMOT (Jul 25, 2024) [Code available] Tags: Knowledge Distillation, Multi-Object Tracking
- Peak-Controlled Logits Poisoning Attack in Federated Distillation (Jul 25, 2024) [Unverified] Tags: Knowledge Distillation, Transfer Learning
- Separating Novel Features for Logical Anomaly Detection: A Straightforward yet Effective Approach (Jul 25, 2024) [Unverified] Tags: Anomaly Detection, Knowledge Distillation
- How to Train the Teacher Model for Effective Knowledge Distillation (Jul 25, 2024) [Code available] Tags: Knowledge Distillation
- NC-NCD: Novel Class Discovery for Node Classification (Jul 25, 2024) [Code available] Tags: Classification, Knowledge Distillation
- CoMoTo: Unpaired Cross-Modal Lesion Distillation Improves Breast Lesion Detection in Tomosynthesis (Jul 24, 2024) [Code available] Tags: Knowledge Distillation, Lesion Detection
- DDK: Distilling Domain Knowledge for Efficient Large Language Models (Jul 23, 2024) [Unverified] Tags: Knowledge Distillation
- Disentangling spatio-temporal knowledge for weakly supervised object detection and segmentation in surgical video (Jul 22, 2024) [Code available] Tags: Disentanglement, Knowledge Distillation
- Generalizing Teacher Networks for Effective Knowledge Distillation Across Student Architectures (Jul 22, 2024) [Code available] Tags: Knowledge Distillation, Model Compression
- Synthetic Image Learning: Preserving Performance and Preventing Membership Inference Attacks (Jul 22, 2024) [Unverified] Tags: Knowledge Distillation
- Comprehensive Study on Performance Evaluation and Optimization of Model Compression: Bridging Traditional Deep Learning and Large Language Models (Jul 22, 2024) [Unverified] Tags: Deep Learning, Image Classification
- SeqMIA: Sequential-Metric Based Membership Inference Attack (Jul 21, 2024) [Code available] Tags: Inference Attack, Knowledge Distillation
- Distilling Vision-Language Foundation Models: A Data-Free Approach via Prompt Diversification (Jul 21, 2024) [Unverified] Tags: Data-free Knowledge Distillation, Image Generation
- Teach Harder, Learn Poorer: Rethinking Hard Sample Distillation for GNN-to-MLP Knowledge Distillation (Jul 20, 2024) [Code available] Tags: Knowledge Distillation
- Continual Panoptic Perception: Towards Multi-modal Incremental Interpretation of Remote Sensing Images (Jul 19, 2024) [Code available] Tags: Caption Generation, Continual Learning
- DFMSD: Dual Feature Masking Stage-wise Knowledge Distillation for Object Detection (Jul 18, 2024) [Unverified] Tags: Knowledge Distillation, Object
- Open Vocabulary 3D Scene Understanding via Geometry Guided Self-Distillation (Jul 18, 2024) [Unverified] Tags: Knowledge Distillation, Representation Learning
- QuIIL at T3 challenge: Towards Automation in Life-Saving Intervention Procedures from First-Person View (Jul 18, 2024) [Code available] Tags: Action Anticipation, Action Recognition
- Make a Strong Teacher with Label Assistance: A Novel Knowledge Distillation Approach for Semantic Segmentation (Jul 18, 2024) [Code available] Tags: Knowledge Distillation, Semantic Segmentation
- Continual Distillation Learning: Knowledge Distillation in Prompt-based Continual Learning (Jul 18, 2024) [Unverified] Tags: Continual Learning, Knowledge Distillation
- Discovery of novel antimicrobial peptides with notable antibacterial potency by a LLM-based foundation model (Jul 17, 2024) [Unverified] Tags: Knowledge Distillation, Scientific Discovery
- Learning Modality-agnostic Representation for Semantic Segmentation from Any Modalities (Jul 16, 2024) [Unverified] Tags: Knowledge Distillation, Semantic Segmentation
- MapDistill: Boosting Efficient Camera-based HD Map Construction via Camera-LiDAR Fusion Model Distillation (Jul 16, 2024) [Unverified] Tags: Autonomous Driving, Knowledge Distillation
- Leave No Knowledge Behind During Knowledge Distillation: Towards Practical and Effective Knowledge Distillation for Code-Switching ASR Using Realistic Data (Jul 15, 2024) [Unverified] Tags: Automatic Speech Recognition (ASR)
- Don't Throw Away Data: Better Sequence Knowledge Distillation (Jul 15, 2024) [Unverified] Tags: Diversity, Knowledge Distillation
- Multi-Granularity Semantic Revision for Large Language Model Distillation (Jul 14, 2024) [Unverified] Tags: Knowledge Distillation, Language Modeling
- Enhancing Weakly-Supervised Histopathology Image Segmentation with Knowledge Distillation on MIL-Based Pseudo-Labels (Jul 14, 2024) [Code available] Tags: Image Segmentation, Knowledge Distillation
- Background Adaptation with Residual Modeling for Exemplar-Free Class-Incremental Semantic Segmentation (Jul 13, 2024) [Unverified] Tags: Class-Incremental Semantic Segmentation, Exemplar-Free
- Minimizing PLM-Based Few-Shot Intent Detectors (Jul 13, 2024) [Code available] Tags: Data Augmentation, Knowledge Distillation
- Uplifting Range-View-based 3D Semantic Segmentation in Real-Time with Multi-Sensor Fusion (Jul 12, 2024) [Unverified] Tags: 3D Semantic Segmentation, Autonomous Driving
- From Easy to Hard: Learning Curricular Shape-aware Features for Robust Panoptic Scene Graph Generation (Jul 12, 2024) [Unverified] Tags: Graph Generation, Knowledge Distillation
- A Survey on Symbolic Knowledge Distillation of Large Language Models (Jul 12, 2024) [Unverified] Tags: Knowledge Distillation, Survey
- 3M-Health: Multimodal Multi-Teacher Knowledge Distillation for Mental Health Detection (Jul 12, 2024) [Code available] Tags: Knowledge Distillation, Social Media Mental Health Detection
- SlideGCD: Slide-based Graph Collaborative Training with Knowledge Distillation for Whole Slide Image Classification (Jul 12, 2024) [Code available] Tags: Graph Construction, Graph Learning
- Knowledge distillation to effectively attain both region-of-interest and global semantics from an image where multiple objects appear (Jul 11, 2024) [Code available] Tags: Knowledge Distillation, Object Detection
- Adaptive Deep Iris Feature Extractor at Arbitrary Resolutions (Jul 11, 2024) [Unverified] Tags: Iris Recognition, Knowledge Distillation
- A Guide To Effectively Leveraging LLMs for Low-Resource Text Summarization: Data Augmentation and Semi-supervised Approaches (Jul 10, 2024) [Unverified] Tags: Abstractive Text Summarization, Data Augmentation
- LokiLM: Technical Report (Jul 10, 2024) [Unverified] Tags: Knowledge Distillation, Language Modeling
- HDKD: Hybrid Data-Efficient Knowledge Distillation Network for Medical Image Classification (Jul 10, 2024) [Code available] Tags: Computational Efficiency, Image Classification
- Less is More: Efficient Brain-Inspired Learning for Autonomous Driving Trajectory Prediction (Jul 9, 2024) [Unverified] Tags: Autonomous Driving, Decision Making
- Enhancing Low-Resource NMT with a Multilingual Encoder and Knowledge Distillation: A Case Study (Jul 9, 2024) [Code available] Tags: Knowledge Distillation, Language Modeling
- Reprogramming Distillation for Medical Foundation Models (Jul 9, 2024) [Code available] Tags: Knowledge Distillation, Lightweight Deployment
- DεpS: Delayed ε-Shrinking for Faster Once-For-All Training (Jul 8, 2024) [Unverified] Tags: All, GPU
- Federated Knowledge Transfer Fine-tuning Large Server Model with Resource-Constrained IoT Clients (Jul 7, 2024) [Unverified] Tags: Federated Learning, Knowledge Distillation
- Topological Persistence Guided Knowledge Distillation for Wearable Sensor Data (Jul 7, 2024) [Unverified] Tags: Activity Recognition, Deep Learning
- Leveraging Topological Guidance for Improved Knowledge Distillation (Jul 7, 2024) [Code available] Tags: Image Classification