- Training Domain Draft Models for Speculative Decoding: Best Practices and Insights | Mar 10, 2025 | Knowledge Distillation
- Distilling Knowledge into Quantum Vision Transformers for Biomedical Image Classification | Mar 10, 2025 | Image Classification
- PTMs-TSCIL Pre-Trained Models Based Class-Incremental Learning | Mar 10, 2025 | Class-Incremental Learning
- Task-Specific Knowledge Distillation from the Vision Foundation Model for Enhanced Medical Image Segmentation | Mar 10, 2025 | Image Segmentation, Knowledge Distillation
- ADROIT: A Self-Supervised Framework for Learning Robust Representations for Active Learning | Mar 10, 2025 | Active Learning, Knowledge Distillation
- CoT-Drive: Efficient Motion Forecasting for Autonomous Driving with LLMs and Chain-of-Thought Prompting | Mar 10, 2025 | Autonomous Driving, Knowledge Distillation
- Small Vision-Language Models: A Survey on Compact Architectures and Techniques | Mar 9, 2025 | Computational Efficiency, Knowledge Distillation
- Causality Enhanced Origin-Destination Flow Prediction in Data-Scarce Cities | Mar 9, 2025 | Graph Attention, Knowledge Distillation
- HFedCKD: Toward Robust Heterogeneous Federated Learning via Data-free Knowledge Distillation and Two-way Contrast | Mar 9, 2025 | Data-free Knowledge Distillation, Federated Learning
- Asymmetric Decision-Making in Online Knowledge Distillation: Unifying Consensus and Divergence | Mar 9, 2025 | Decision Making, Knowledge Distillation
- Improving SAM for Camouflaged Object Detection via Dual Stream Adapters | Mar 8, 2025 | Knowledge Distillation, Object Detection
- ACAM-KD: Adaptive and Cooperative Attention Masking for Knowledge Distillation | Mar 8, 2025 | Autonomous Driving, Feature Selection
- Semantic Shift Estimation via Dual-Projection and Classifier Reconstruction for Exemplar-Free Class-Incremental Learning | Mar 7, 2025 | Class-Incremental Learning
- Spatial Distillation based Distribution Alignment (SDDA) for Cross-Headset EEG Classification | Mar 7, 2025 | Brain-Computer Interface, Domain Adaptation | code available
- No Forgetting Learning: Memory-free Continual Learning | Mar 6, 2025 | Continual Learning, Knowledge Distillation | code available
- Lightweight Embedded FPGA Deployment of Learned Image Compression with Knowledge Distillation and Hybrid Quantization | Mar 5, 2025 | Image Compression, Knowledge Distillation
- Self-Supervised Z-Slice Augmentation for 3D Bio-Imaging via Knowledge Distillation | Mar 5, 2025 | Generative Adversarial Network, Knowledge Distillation
- Temporal Separation with Entropy Regularization for Knowledge Distillation in Spiking Neural Networks | Mar 5, 2025 | Computational Efficiency, Knowledge Distillation | code available
- Rapid Bone Scintigraphy Enhancement via Semantic Prior Distillation from Segment Anything Model | Mar 4, 2025 | Image Restoration, Knowledge Distillation
- Mamba base PKD for efficient knowledge compression | Mar 3, 2025 | Image Classification
- DILEMMA: Joint LLM Quantization and Distributed LLM Inference Over Edge Computing Systems | Mar 3, 2025 | Edge Computing, Knowledge Distillation
- VRM: Knowledge Distillation via Virtual Relation Matching | Feb 28, 2025 | Knowledge Distillation, Relation
- Investigating and Enhancing Vision-Audio Capability in Omnimodal Large Language Models | Feb 27, 2025 | Knowledge Distillation, Self-Knowledge Distillation
- SEKI: Self-Evolution and Knowledge Inspiration based Neural Architecture Search via Large Language Models | Feb 27, 2025 | GPU, Knowledge Distillation
- Granite Embedding Models | Feb 27, 2025 | Information Retrieval, Knowledge Distillation
- XCOMPS: A Multilingual Benchmark of Conceptual Minimal Pairs | Feb 27, 2025 | Knowledge Distillation
- Lightweight Contrastive Distilled Hashing for Online Cross-modal Retrieval | Feb 27, 2025 | Cross-Modal Retrieval, Knowledge Distillation
- Beyond the Tip of Efficiency: Uncovering the Submerged Threats of Jailbreak Attacks in Small Language Models | Feb 27, 2025 | Knowledge Distillation, Model Compression
- Winning Big with Small Models: Knowledge Distillation vs. Self-Training for Reducing Hallucination in QA Agents | Feb 26, 2025 | Hallucination, Knowledge Distillation
- AfroXLMR-Comet: Multilingual Knowledge Distillation with Attention Matching for Low-Resource languages | Feb 25, 2025 | Knowledge Distillation, Language Modeling
- Advantage-Guided Distillation for Preference Alignment in Small Language Models | Feb 25, 2025 | Knowledge Distillation
- From underwater to aerial: a novel multi-scale knowledge distillation approach for coral reef monitoring | Feb 25, 2025 | Knowledge Distillation | code available
- Knowledge Distillation with Training Wheels | Feb 24, 2025 | Knowledge Distillation, Language Modeling | code available
- CLIMB-3D: Continual Learning for Imbalanced 3D Instance Segmentation | Feb 24, 2025 | 3D Instance Segmentation, Continual Learning
- PQDAST: Depth-Aware Arbitrary Style Transfer for Games via Perceptual Quality-Guided Distillation | Feb 24, 2025 | Knowledge Distillation, Style Transfer | code available
- A Transformer-in-Transformer Network Utilizing Knowledge Distillation for Image Recognition | Feb 24, 2025 | Image Classification
- Implicit Word Reordering with Knowledge Distillation for Cross-Lingual Dependency Parsing | Feb 24, 2025 | Cross-Lingual Transfer, Dependency Parsing
- CoT2Align: Cross-Chain of Thought Distillation via Optimal Transport Alignment for Language Models with Different Tokenizers | Feb 24, 2025 | Knowledge Distillation
- Improving the Transferability of Adversarial Examples by Inverse Knowledge Distillation | Feb 24, 2025 | Adversarial Attack, Diversity
- EDocNet: Efficient Datasheet Layout Analysis Based on Focus and Global Knowledge Distillation | Feb 23, 2025 | Document Layout Analysis, Knowledge Distillation
- Scaling Sparse and Dense Retrieval in Decoder-Only LLMs | Feb 21, 2025 | Decoder, Knowledge Distillation
- PPC-GPT: Federated Task-Specific Compression of Large Language Models via Pruning and Chain-of-Thought Distillation | Feb 21, 2025 | Knowledge Distillation, Privacy Preserving | code available
- A Knowledge Distillation-Based Approach to Enhance Transparency of Classifier Models | Feb 21, 2025 | Decision Making, Knowledge Distillation
- TimeDistill: Efficient Long-Term Time Series Forecasting with MLP via Cross-Architecture Distillation | Feb 20, 2025 | Data Augmentation, Knowledge Distillation | code available
- Self-supervised Monocular Depth Estimation Robust to Reflective Surface Leveraged by Triplet Mining | Feb 20, 2025 | Depth Estimation, Knowledge Distillation
- Efficient AI in Practice: Training and Deployment of Efficient LLMs for Industry Applications | Feb 20, 2025 | Knowledge Distillation, Model Compression
- Modifying Final Splits of Classification Tree for Fine-tuning Subpopulation Target in Policy Making | Feb 20, 2025 | Knowledge Distillation
- Vision Foundation Models in Medical Image Analysis: Advances and Challenges | Feb 20, 2025 | Domain Adaptation, Federated Learning
- Designing Parameter and Compute Efficient Diffusion Transformers using Distillation | Feb 20, 2025 | Knowledge Distillation, NVIDIA Jetson Orin Nano
- Dynamic Activation with Knowledge Distillation for Energy-Efficient Spiking NN Ensembles | Feb 19, 2025 | Disentanglement, Ensemble Learning