On the Surprising Efficacy of Distillation as an Alternative to Pre-Training Small Models | Apr 4, 2024 | Code Available | Contrastive Learning, Knowledge Distillation
Improve Knowledge Distillation via Label Revision and Data Selection | Apr 3, 2024 | Unverified | Knowledge Distillation, Model Compression
Foundation Models for Structural Health Monitoring | Apr 3, 2024 | Code Available | Anomaly Detection, Knowledge Distillation
Adaptive Affinity-Based Generalization For MRI Imaging Segmentation Across Resource-Limited Settings | Apr 3, 2024 | Unverified | Data Integration, Knowledge Distillation
Knowledge Distillation with Multi-granularity Mixture of Priors for Image Super-Resolution | Apr 3, 2024 | Unverified | Image Super-Resolution, Knowledge Distillation
Federated Distillation: A Survey | Apr 2, 2024 | Unverified | Federated Learning, Knowledge Distillation
Towards Scalable & Efficient Interaction-Aware Planning in Autonomous Vehicles using Knowledge Distillation | Apr 2, 2024 | Unverified | Autonomous Vehicles, Decision Making
Task Integration Distillation for Object Detectors | Apr 2, 2024 | Unverified | Knowledge Distillation, Object
Class-Incremental Few-Shot Event Detection | Apr 2, 2024 | Unverified | Event Detection, Few-Shot Learning
LLM-RadJudge: Achieving Radiologist-Level Evaluation for X-Ray Report Generation | Apr 1, 2024 | Unverified | Knowledge Distillation
SUGAR: Pre-training 3D Visual Representations for Robotics | Apr 1, 2024 | Unverified | 3D Instance Segmentation, 3D Object Recognition
A Comprehensive Review of Knowledge Distillation in Computer Vision | Apr 1, 2024 | Unverified | Deep Learning, Knowledge Distillation
Weak-to-Strong 3D Object Detection with X-Ray Distillation | Mar 31, 2024 | Code Available | 3D Object Detection, Autonomous Driving
DMSSN: Distilled Mixed Spectral-Spatial Network for Hyperspectral Salient Object Detection | Mar 31, 2024 | Code Available | Dimensionality Reduction, Knowledge Distillation
De-confounded Data-free Knowledge Distillation for Handling Distribution Shifts | Mar 28, 2024 | Unverified | Causal Inference, Data-free Knowledge Distillation
GOLD: Generalized Knowledge Distillation via Out-of-Distribution-Guided Language Data Generation | Mar 28, 2024 | Unverified | Data-free Knowledge Distillation, Knowledge Distillation
CRKD: Enhanced Camera-Radar Object Detection with Cross-modality Knowledge Distillation | Mar 28, 2024 | Unverified | 3D Object Detection, Autonomous Driving
I2CKD: Intra- and Inter-Class Knowledge Distillation for Semantic Segmentation | Mar 27, 2024 | Unverified | Knowledge Distillation, Segmentation
Enhancing Metaphor Detection through Soft Labels and Target Word Prediction | Mar 27, 2024 | Unverified | Knowledge Distillation, Prompt Learning
Is Modularity Transferable? A Case Study through the Lens of Knowledge Distillation | Mar 27, 2024 | Code Available | Domain Adaptation, Knowledge Distillation
Oh! We Freeze: Improving Quantized Knowledge Distillation via Signal Propagation Analysis for Large Language Models | Mar 26, 2024 | Unverified | Knowledge Distillation, Quantization
Order of Compression: A Systematic and Optimal Sequence to Combinationally Compress CNN | Mar 26, 2024 | Unverified | Knowledge Distillation, Model Compression
From Two-Stream to One-Stream: Efficient RGB-T Tracking via Mutual Prompt Learning and Knowledge Distillation | Mar 25, 2024 | Unverified | Knowledge Distillation, Object Tracking
Configurable Holography: Towards Display and Scene Adaptation | Mar 24, 2024 | Unverified | Depth Estimation, Knowledge Distillation
Learning to Project for Cross-Task Knowledge Distillation | Mar 21, 2024 | Unverified | Depth Estimation, Knowledge Distillation
Fed-RAC: Resource-Aware Clustering for Tackling Heterogeneity of Participants in Federated Learning | Mar 20, 2024 | Code Available | Clustering, Federated Learning
Facilitating Pornographic Text Detection for Open-Domain Dialogue Systems via Knowledge Distillation of Large Language Models | Mar 20, 2024 | Code Available | Chatbot, Knowledge Distillation
TransformMix: Learning Transformation and Mixing Strategies from Data | Mar 19, 2024 | Unverified | Data Augmentation, Knowledge Distillation
HVDistill: Transferring Knowledge from Images to Point Clouds via Unsupervised Hybrid-View Distillation | Mar 18, 2024 | Code Available | Knowledge Distillation, NER
Scheduled Knowledge Acquisition on Lightweight Vector Symbolic Architectures for Brain-Computer Interfaces | Mar 18, 2024 | Unverified | Feature Engineering, Knowledge Distillation
TTT-KD: Test-Time Training for 3D Semantic Segmentation through Knowledge Distillation from Foundation Models | Mar 18, 2024 | Unverified | 3D Semantic Segmentation, Knowledge Distillation
KnFu: Effective Knowledge Fusion | Mar 18, 2024 | Unverified | Federated Learning, Knowledge Distillation
Multiple Teachers-Meticulous Student: A Domain Adaptive Meta-Knowledge Distillation Model for Medical Image Classification | Mar 17, 2024 | Code Available | Image Classification
FlyKD: Graph Knowledge Distillation on the Fly with Curriculum Learning | Mar 16, 2024 | Unverified | Knowledge Distillation
LookALike: Human Mimicry based collaborative decision making | Mar 16, 2024 | Unverified | Decision Making, Knowledge Distillation
Group-Mix SAM: Lightweight Solution for Industrial Assembly Line Applications | Mar 15, 2024 | Unverified | Knowledge Distillation
Select and Distill: Selective Dual-Teacher Knowledge Transfer for Continual Learning on Vision-Language Models | Mar 14, 2024 | Unverified | Continual Learning, Knowledge Distillation
Adapting OC20-trained EquiformerV2 Models for High-Entropy Materials | Mar 14, 2024 | Unverified | Knowledge Distillation
Open-Vocabulary Object Detection with Meta Prompt Representation and Instance Contrastive Optimization | Mar 14, 2024 | Unverified | Contrastive Learning, Knowledge Distillation
MT-PATCHER: Selective and Extendable Knowledge Distillation from Large Language Models for Machine Translation | Mar 14, 2024 | Code Available | Knowledge Distillation, Machine Translation
An Efficient End-to-End Approach to Noise Invariant Speech Features via Multi-Task Learning | Mar 13, 2024 | Code Available | Denoising, Knowledge Distillation
CoroNetGAN: Controlled Pruning of GANs via Hypernetworks | Mar 13, 2024 | Unverified | Knowledge Distillation
LIX: Implicitly Infusing Spatial Geometric Prior Knowledge into Visual Semantic Segmentation for Autonomous Driving | Mar 13, 2024 | Unverified | Autonomous Driving, Knowledge Distillation
Training Self-localization Models for Unseen Unfamiliar Places via Teacher-to-Student Data-Free Knowledge Transfer | Mar 13, 2024 | Unverified | Continual Learning, Image Retrieval
Distilling Named Entity Recognition Models for Endangered Species from Large Language Models | Mar 13, 2024 | Unverified | In-Context Learning, Knowledge Distillation
Low-Energy On-Device Personalization for MCUs | Mar 12, 2024 | Code Available | Knowledge Distillation, Transfer Learning
Distilling the Knowledge in Data Pruning | Mar 12, 2024 | Unverified | Knowledge Distillation
MEND: Meta dEmonstratioN Distillation for Efficient and Effective In-Context Learning | Mar 11, 2024 | Code Available | Decoder, In-Context Learning
One Category One Prompt: Dataset Distillation using Diffusion Models | Mar 11, 2024 | Unverified | Dataset Distillation, Knowledge Distillation
AuG-KD: Anchor-Based Mixup Generation for Out-of-Domain Knowledge Distillation | Mar 11, 2024 | Code Available | Data-free Knowledge Distillation, Knowledge Distillation