Towards Low-Latency Event Stream-based Visual Object Tracking: A Slow-Fast Approach May 19, 2025 Knowledge Distillation Object Tracking
Code Code Available 0Uniformity First: Uniformity-aware Test-time Adaptation of Vision-language Models against Image Corruption May 19, 2025 Knowledge Distillation Test-time Adaptation
Code Code Available 0LAMeTA: Intent-Aware Agentic Network Optimization via a Large AI Model-Empowered Two-Stage Approach May 18, 2025 Deep Reinforcement Learning Knowledge Distillation
— Unverified 0Always Clear Depth: Robust Monocular Depth Estimation under Adverse Weather May 18, 2025 Autonomous Driving Depth Estimation
Code Code Available 1SSR: Enhancing Depth Perception in Vision-Language Models via Rationale-Guided Spatial Reasoning May 18, 2025 Knowledge Distillation Spatial Reasoning
— Unverified 0On Membership Inference Attacks in Knowledge Distillation May 17, 2025 Knowledge Distillation Privacy Preserving
Code Code Available 0Denoising Mutual Knowledge Distillation in Bi-Directional Multiple Instance Learning May 17, 2025 Denoising image-classification
— Unverified 0FiGKD: Fine-Grained Knowledge Distillation via High-Frequency Detail Transfer May 17, 2025 Fine-Grained Visual Recognition Knowledge Distillation
— Unverified 0Semantically-Aware Game Image Quality Assessment May 16, 2025 Feature Importance Image Quality Assessment
— Unverified 0Bidirectional Distillation: A Mixed-Play Framework for Multi-Agent Generalizable Behaviors May 16, 2025 Knowledge Distillation Multi-agent Reinforcement Learning
— Unverified 0Distilled Circuits: A Mechanistic Study of Internal Restructuring in Knowledge Distillation May 16, 2025 Knowledge Distillation
Code Code Available 0Advancing Multiple Instance Learning with Continual Learning for Whole Slide Imaging May 15, 2025 Continual Learning Diagnostic
— Unverified 0DCSNet: A Lightweight Knowledge Distillation-Based Model with Explainable AI for Lung Cancer Diagnosis from Histopathological Images May 14, 2025 Diagnostic Knowledge Distillation
— Unverified 0MoKD: Multi-Task Optimization for Knowledge Distillation May 13, 2025 image-classification Image Classification
— Unverified 0Low-Complexity Inference in Continual Learning via Compressed Knowledge Transfer May 13, 2025 class-incremental learning Class Incremental Learning
— Unverified 0Foundation Models Knowledge Distillation For Battery Capacity Degradation Forecast May 13, 2025 Knowledge Distillation Time Series
Code Code Available 1Fusing Bidirectional Chains of Thought and Reward Mechanisms A Method for Enhancing Question-Answering Capabilities of Large Language Models for Chinese Intangible Cultural Heritage May 13, 2025 Knowledge Distillation Large Language Model
— Unverified 0An Extra RMSNorm is All You Need for Fine Tuning to 1.58 Bits May 12, 2025 All Knowledge Distillation
— Unverified 0Topology-Guided Knowledge Distillation for Efficient Point Cloud Processing May 12, 2025 3D Object Recognition Autonomous Driving
Code Code Available 0Channel Fingerprint Construction for Massive MIMO: A Deep Conditional Generative Approach May 12, 2025 Denoising Knowledge Distillation
— Unverified 0KDH-MLTC: Knowledge Distillation for Healthcare Multi-Label Text Classification May 12, 2025 Classification Hyperparameter Optimization
— Unverified 0Ranking-aware Continual Learning for LiDAR Place Recognition May 12, 2025 Autonomous Driving Continual Learning
— Unverified 0Structural Entropy Guided Agent for Detecting and Repairing Knowledge Deficiencies in LLMs May 12, 2025 AI Agent Knowledge Distillation
Code Code Available 2Simple Semi-supervised Knowledge Distillation from Vision-Language Models via Dual-Head Optimization May 12, 2025 Few-Shot Image Classification Knowledge Distillation
Code Code Available 0Knowledge Distillation for Enhancing Walmart E-commerce Search Relevance Using Large Language Models May 11, 2025 Knowledge Distillation
— Unverified 0Human in the Latent Loop (HILL): Interactively Guiding Model Training Through Human Intuition May 9, 2025 Knowledge Distillation
— Unverified 0Robust & Precise Knowledge Distillation-based Novel Context-Aware Predictor for Disease Detection in Brain and Gastrointestinal May 9, 2025 Disease Prediction Knowledge Distillation
— Unverified 0Federated Deconfounding and Debiasing Learning for Out-of-Distribution Generalization May 8, 2025 Attribute Benchmarking
— Unverified 0Biomed-DPT: Dual Modality Prompt Tuning for Biomedical Vision-Language Models May 8, 2025 Clinical Knowledge Diagnostic
Code Code Available 0ABKD: Pursuing a Proper Allocation of the Probability Mass in Knowledge Distillation via α-β-Divergence May 7, 2025 Knowledge Distillation
Code Code Available 1Theoretical Guarantees for LT-TTD: A Unified Transformer-based Architecture for Two-Level Ranking Systems May 7, 2025 Computational Efficiency Knowledge Distillation
— Unverified 0Action Spotting and Precise Event Detection in Sports: Datasets, Methods, and Challenges May 6, 2025 Action Localization Action Spotting
— Unverified 0SepALM: Audio Language Models Are Error Correctors for Robust Speech Separation May 6, 2025 Automatic Speech Recognition Automatic Speech Recognition (ASR)
— Unverified 0Knowledge Distillation for Speech Denoising by Latent Representation Alignment with Cosine Distance May 6, 2025 Denoising Knowledge Distillation
— Unverified 0Image Recognition with Online Lightweight Vision Transformer: A Survey May 6, 2025 Knowledge Distillation Survey
Code Code Available 0Artificial Behavior Intelligence: Technology, Challenges, and Future Directions May 6, 2025 Autonomous Driving Emotion Recognition
— Unverified 0End-to-end fully-binarized network design: from Generic Learned Thermometer to Block Pruning May 5, 2025 Knowledge Distillation Quantization
— Unverified 0AKD : Adversarial Knowledge Distillation For Large Language Models Alignment on Coding tasks May 5, 2025 Code Completion Code Generation
— Unverified 0Optimizing LLMs for Resource-Constrained Environments: A Survey of Model Compression Techniques May 5, 2025 Knowledge Distillation Mixture-of-Experts
— Unverified 0FedSDAF: Leveraging Source Domain Awareness for Enhanced Federated Domain Generalization May 5, 2025 Domain Generalization Knowledge Distillation
Code Code Available 0Efficient Multivariate Time Series Forecasting via Calibrated Language Models with Privileged Knowledge Distillation May 4, 2025 Knowledge Distillation Multivariate Time Series Forecasting
Code Code Available 2Segment Any RGB-Thermal Model with Language-aided Distillation May 4, 2025 Instance Segmentation Knowledge Distillation
— Unverified 0High-Fidelity Pseudo-label Generation by Large Language Models for Training Robust Radiology Report Classifiers May 3, 2025 Diagnostic Knowledge Distillation
— Unverified 0Toward Data-centric Directed Graph Learning: An Entropy-driven Approach May 2, 2025 Graph Learning Knowledge Distillation
— Unverified 0Llama-Nemotron: Efficient Reasoning Models May 2, 2025 Knowledge Distillation Neural Architecture Search
— Unverified 0Uncertainty-Aware Multi-Expert Knowledge Distillation for Imbalanced Disease Grading May 1, 2025 Knowledge Distillation Transfer Learning
— Unverified 0Enhancing New-item Fairness in Dynamic Recommender Systems Apr 30, 2025 Fairness Knowledge Distillation
Code Code Available 0CAE-DFKD: Bridging the Transferability Gap in Data-Free Knowledge Distillation Apr 30, 2025 Data-free Knowledge Distillation Knowledge Distillation
— Unverified 0How to Backdoor the Knowledge Distillation Apr 30, 2025 Knowledge Distillation
— Unverified 0Head-Tail-Aware KL Divergence in Knowledge Distillation for Spiking Neural Networks Apr 29, 2025 Knowledge Distillation Transfer Learning
— Unverified 0