Title | Date | Tags | Code status
Okay, Let's Do This! Modeling Event Coreference with Generated Rationales and Knowledge Distillation | Apr 4, 2024 | Clustering, Coreference Resolution | Code available
On the Surprising Efficacy of Distillation as an Alternative to Pre-Training Small Models | Apr 4, 2024 | Contrastive Learning, Knowledge Distillation | Code available
Can Small Language Models Help Large Language Models Reason Better?: LM-Guided Chain-of-Thought | Apr 4, 2024 | Extractive Question-Answering, Knowledge Distillation | Unverified
Improve Knowledge Distillation via Label Revision and Data Selection | Apr 3, 2024 | Knowledge Distillation, Model Compression | Unverified
Knowledge Distillation with Multi-granularity Mixture of Priors for Image Super-Resolution | Apr 3, 2024 | Image Super-Resolution, Knowledge Distillation | Unverified
Rethinking Kullback-Leibler Divergence in Knowledge Distillation for Large Language Models | Apr 3, 2024 | Diversity, Knowledge Distillation | Code available
Adaptive Affinity-Based Generalization For MRI Imaging Segmentation Across Resource-Limited Settings | Apr 3, 2024 | Data Integration, Knowledge Distillation | Unverified
Foundation Models for Structural Health Monitoring | Apr 3, 2024 | Anomaly Detection, Knowledge Distillation | Code available
Rethinking Pruning for Vision-Language Models: Strategies for Effective Sparsity and Performance Restoration | Apr 3, 2024 | Knowledge Distillation | Code available
Federated Distillation: A Survey | Apr 2, 2024 | Federated Learning, Knowledge Distillation | Unverified
Task Integration Distillation for Object Detectors | Apr 2, 2024 | Knowledge Distillation, Object | Unverified
Class-Incremental Few-Shot Event Detection | Apr 2, 2024 | Event Detection, Few-Shot Learning | Unverified
TSCM: A Teacher-Student Model for Vision Place Recognition Using Cross-Metric Knowledge Distillation | Apr 2, 2024 | Knowledge Distillation, Visual Place Recognition | Code available
Towards Scalable & Efficient Interaction-Aware Planning in Autonomous Vehicles using Knowledge Distillation | Apr 2, 2024 | Autonomous Vehicles, Decision Making | Unverified
Pre-trained Vision and Language Transformers Are Few-Shot Incremental Learners | Apr 2, 2024 | Class-Incremental Learning | Code available
A Comprehensive Review of Knowledge Distillation in Computer Vision | Apr 1, 2024 | Deep Learning, Knowledge Distillation | Unverified
LLM-RadJudge: Achieving Radiologist-Level Evaluation for X-Ray Report Generation | Apr 1, 2024 | Knowledge Distillation | Unverified
PDF: A Probability-Driven Framework for Open World 3D Point Cloud Semantic Segmentation | Apr 1, 2024 | Decoder, Knowledge Distillation | Code available
SUGAR: Pre-training 3D Visual Representations for Robotics | Apr 1, 2024 | 3D Instance Segmentation, 3D Object Recognition | Unverified
Weak-to-Strong 3D Object Detection with X-Ray Distillation | Mar 31, 2024 | 3D Object Detection, Autonomous Driving | Code available
DMSSN: Distilled Mixed Spectral-Spatial Network for Hyperspectral Salient Object Detection | Mar 31, 2024 | Dimensionality Reduction, Knowledge Distillation | Code available
Orchestrate Latent Expertise: Advancing Online Continual Learning with Multi-Level Supervision and Reverse Self-Distillation | Mar 30, 2024 | Continual Learning, Knowledge Distillation | Code available
ECLIPSE: Efficient Continual Learning in Panoptic Segmentation with Visual Prompt Tuning | Mar 29, 2024 | Continual Learning, Continual Panoptic Segmentation | Code available
GOLD: Generalized Knowledge Distillation via Out-of-Distribution-Guided Language Data Generation | Mar 28, 2024 | Data-free Knowledge Distillation, Knowledge Distillation | Unverified
De-confounded Data-free Knowledge Distillation for Handling Distribution Shifts | Mar 28, 2024 | Causal Inference, Data-free Knowledge Distillation | Unverified
CRKD: Enhanced Camera-Radar Object Detection with Cross-modality Knowledge Distillation | Mar 28, 2024 | 3D Object Detection, Autonomous Driving | Unverified
I2CKD: Intra- and Inter-Class Knowledge Distillation for Semantic Segmentation | Mar 27, 2024 | Knowledge Distillation, Segmentation | Unverified
Enhancing Metaphor Detection through Soft Labels and Target Word Prediction | Mar 27, 2024 | Knowledge Distillation, Prompt Learning | Unverified
Is Modularity Transferable? A Case Study through the Lens of Knowledge Distillation | Mar 27, 2024 | Domain Adaptation, Knowledge Distillation | Code available
Oh! We Freeze: Improving Quantized Knowledge Distillation via Signal Propagation Analysis for Large Language Models | Mar 26, 2024 | Knowledge Distillation, Quantization | Unverified
KDMCSE: Knowledge Distillation Multimodal Sentence Embeddings with Adaptive Angular Margin Contrastive Learning | Mar 26, 2024 | Contrastive Learning, Knowledge Distillation | Code available
Order of Compression: A Systematic and Optimal Sequence to Combinationally Compress CNN | Mar 26, 2024 | Knowledge Distillation, Model Compression | Unverified
From Two-Stream to One-Stream: Efficient RGB-T Tracking via Mutual Prompt Learning and Knowledge Distillation | Mar 25, 2024 | Knowledge Distillation, Object Tracking | Unverified
ToXCL: A Unified Framework for Toxic Speech Detection and Explanation | Mar 25, 2024 | Decoder, Knowledge Distillation | Code available
Configurable Holography: Towards Display and Scene Adaptation | Mar 24, 2024 | Depth Estimation, Knowledge Distillation | Unverified
iDAT: inverse Distillation Adapter-Tuning | Mar 23, 2024 | Image Classification | Code available
Learning to Project for Cross-Task Knowledge Distillation | Mar 21, 2024 | Depth Estimation, Knowledge Distillation | Unverified
MMIDR: Teaching Large Language Model to Interpret Multimodal Misinformation via Knowledge Distillation | Mar 21, 2024 | Data Augmentation, Decision Making | Code available
Fed-RAC: Resource-Aware Clustering for Tackling Heterogeneity of Participants in Federated Learning | Mar 20, 2024 | Clustering, Federated Learning | Code available
Facilitating Pornographic Text Detection for Open-Domain Dialogue Systems via Knowledge Distillation of Large Language Models | Mar 20, 2024 | Chatbot, Knowledge Distillation | Code available
Instruction Multi-Constraint Molecular Generation Using a Teacher-Student Large Language Model | Mar 20, 2024 | Drug Discovery, Knowledge Distillation | Code available
Scale Decoupled Distillation | Mar 20, 2024 | Knowledge Distillation | Code available
TransformMix: Learning Transformation and Mixing Strategies from Data | Mar 19, 2024 | Data Augmentation, Knowledge Distillation | Unverified
Scheduled Knowledge Acquisition on Lightweight Vector Symbolic Architectures for Brain-Computer Interfaces | Mar 18, 2024 | Feature Engineering, Knowledge Distillation | Unverified
KnFu: Effective Knowledge Fusion | Mar 18, 2024 | Federated Learning, Knowledge Distillation | Unverified
HVDistill: Transferring Knowledge from Images to Point Clouds via Unsupervised Hybrid-View Distillation | Mar 18, 2024 | Knowledge Distillation, NER | Code available
TTT-KD: Test-Time Training for 3D Semantic Segmentation through Knowledge Distillation from Foundation Models | Mar 18, 2024 | 3D Semantic Segmentation, Knowledge Distillation | Unverified
Multiple Teachers-Meticulous Student: A Domain Adaptive Meta-Knowledge Distillation Model for Medical Image Classification | Mar 17, 2024 | Image Classification | Code available
Self-Supervised Quantization-Aware Knowledge Distillation | Mar 17, 2024 | Knowledge Distillation, Quantization | Code available
FlyKD: Graph Knowledge Distillation on the Fly with Curriculum Learning | Mar 16, 2024 | Knowledge Distillation | Unverified