XCOMPS: A Multilingual Benchmark of Conceptual Minimal Pairs (Feb 27, 2025) | Tags: Knowledge Distillation | Code unverified
Beyond the Tip of Efficiency: Uncovering the Submerged Threats of Jailbreak Attacks in Small Language Models (Feb 27, 2025) | Tags: Knowledge Distillation, Model Compression | Code unverified
SEKI: Self-Evolution and Knowledge Inspiration based Neural Architecture Search via Large Language Models (Feb 27, 2025) | Tags: GPU, Knowledge Distillation | Code unverified
Lightweight Contrastive Distilled Hashing for Online Cross-modal Retrieval (Feb 27, 2025) | Tags: Cross-Modal Retrieval, Knowledge Distillation | Code unverified
Granite Embedding Models (Feb 27, 2025) | Tags: Information Retrieval, Knowledge Distillation | Code unverified
Winning Big with Small Models: Knowledge Distillation vs. Self-Training for Reducing Hallucination in QA Agents (Feb 26, 2025) | Tags: Hallucination, Knowledge Distillation | Code unverified
AfroXLMR-Comet: Multilingual Knowledge Distillation with Attention Matching for Low-Resource Languages (Feb 25, 2025) | Tags: Knowledge Distillation, Language Modeling | Code unverified
From underwater to aerial: a novel multi-scale knowledge distillation approach for coral reef monitoring (Feb 25, 2025) | Tags: Knowledge Distillation | Code available
Improving the Transferability of Adversarial Examples by Inverse Knowledge Distillation (Feb 24, 2025) | Tags: Adversarial Attack, Diversity | Code unverified
Implicit Word Reordering with Knowledge Distillation for Cross-Lingual Dependency Parsing (Feb 24, 2025) | Tags: Cross-Lingual Transfer, Dependency Parsing | Code unverified
CoT2Align: Cross-Chain of Thought Distillation via Optimal Transport Alignment for Language Models with Different Tokenizers (Feb 24, 2025) | Tags: Knowledge Distillation | Code unverified
A Transformer-in-Transformer Network Utilizing Knowledge Distillation for Image Recognition (Feb 24, 2025) | Tags: Image Classification | Code unverified
CLIMB-3D: Continual Learning for Imbalanced 3D Instance Segmentation (Feb 24, 2025) | Tags: 3D Instance Segmentation, Continual Learning | Code available
PQDAST: Depth-Aware Arbitrary Style Transfer for Games via Perceptual Quality-Guided Distillation (Feb 24, 2025) | Tags: Knowledge Distillation, Style Transfer | Code unverified
Knowledge Distillation with Training Wheels (Feb 24, 2025) | Tags: Knowledge Distillation, Language Modeling | Code unverified
EDocNet: Efficient Datasheet Layout Analysis Based on Focus and Global Knowledge Distillation (Feb 23, 2025) | Tags: Document Layout Analysis, Knowledge Distillation | Code unverified
A Knowledge Distillation-Based Approach to Enhance Transparency of Classifier Models (Feb 21, 2025) | Tags: Decision Making, Knowledge Distillation | Code available
PPC-GPT: Federated Task-Specific Compression of Large Language Models via Pruning and Chain-of-Thought Distillation (Feb 21, 2025) | Tags: Knowledge Distillation, Privacy Preserving | Code unverified
Self-supervised Monocular Depth Estimation Robust to Reflective Surface Leveraged by Triplet Mining (Feb 20, 2025) | Tags: Depth Estimation, Knowledge Distillation | Code unverified
Designing Parameter and Compute Efficient Diffusion Transformers using Distillation (Feb 20, 2025) | Tags: Knowledge Distillation, NVIDIA Jetson Orin Nano | Code unverified
Efficient AI in Practice: Training and Deployment of Efficient LLMs for Industry Applications (Feb 20, 2025) | Tags: Knowledge Distillation, Model Compression | Code unverified
TimeDistill: Efficient Long-Term Time Series Forecasting with MLP via Cross-Architecture Distillation (Feb 20, 2025) | Tags: Data Augmentation, Knowledge Distillation | Code unverified
Vision Foundation Models in Medical Image Analysis: Advances and Challenges (Feb 20, 2025) | Tags: Domain Adaptation, Federated Learning | Code unverified
Modifying Final Splits of Classification Tree for Fine-tuning Subpopulation Target in Policy Making (Feb 20, 2025) | Tags: Knowledge Distillation | Code unverified
Dynamic Activation with Knowledge Distillation for Energy-Efficient Spiking NN Ensembles (Feb 19, 2025) | Tags: Disentanglement, Ensemble Learning | Code unverified
Towards Vector Optimization on Low-Dimensional Vector Symbolic Architecture (Feb 19, 2025) | Tags: Knowledge Distillation | Code unverified
Capturing Rich Behavior Representations: A Dynamic Action Semantic-Aware Graph Transformer for Video Captioning (Feb 19, 2025) | Tags: Knowledge Distillation, Object | Code unverified
MambaLiteSR: Image Super-Resolution with Low-Rank Mamba using Knowledge Distillation (Feb 19, 2025) | Tags: Image Super-Resolution, Knowledge Distillation | Code unverified
Integrating Arithmetic Learning Improves Mathematical Reasoning in Smaller Models (Feb 18, 2025) | Tags: Data Augmentation, GSM8K | Code unverified
Enhancing Semi-supervised Learning with Zero-shot Pseudolabels (Feb 18, 2025) | Tags: Knowledge Distillation | Code unverified
NaturalReasoning: Reasoning in the Wild with 2.8M Challenging Questions (Feb 18, 2025) | Tags: Knowledge Distillation, Math | Code unverified
Every Expert Matters: Towards Effective Knowledge Distillation for Mixture-of-Experts Language Models (Feb 18, 2025) | Tags: Knowledge Distillation, Mixture-of-Experts | Code unverified
Does Training with Synthetic Data Truly Protect Privacy? (Feb 18, 2025) | Tags: Data-free Knowledge Distillation, Dataset Distillation | Code available
Leave No One Behind: Enhancing Diversity While Maintaining Accuracy in Social Recommendation (Feb 17, 2025) | Tags: Diversity, Knowledge Distillation | Code available
Warmup-Distill: Bridge the Distribution Mismatch between Teacher and Student before Knowledge Distillation (Feb 17, 2025) | Tags: Knowledge Distillation, Math | Code available
Smoothing Out Hallucinations: Mitigating LLM Hallucination with Smoothed Knowledge Distillation (Feb 16, 2025) | Tags: Hallucination, Knowledge Distillation | Code unverified
Leveraging Conditional Mutual Information to Improve Large Language Model Fine-Tuning For Classification (Feb 16, 2025) | Tags: Classification, Image Classification | Code unverified
LLM-driven Knowledge Distillation for Dynamic Text-Attributed Graphs (Feb 15, 2025) | Tags: Edge Classification, Knowledge Distillation | Code unverified
CLoCKDistill: Consistent Location-and-Context-aware Knowledge Distillation for DETRs (Feb 15, 2025) | Tags: Denoising, Knowledge Distillation | Code unverified
AIDE: Agentically Improve Visual Language Model with Domain Experts (Feb 13, 2025) | Tags: Knowledge Distillation, Language Modeling | Code unverified
LLM Pretraining with Continuous Concepts (Feb 12, 2025) | Tags: Knowledge Distillation, Language Modeling | Code unverified
Life-Code: Central Dogma Modeling with Multi-Omics Sequence Unification (Feb 11, 2025) | Tags: Knowledge Distillation | Code unverified
OpenGrok: Enhancing SNS Data Processing with Distilled Knowledge and Mask-like Mechanisms (Feb 11, 2025) | Tags: Knowledge Distillation, MMLU | Code available
Vision-Language Models for Edge Networks: A Comprehensive Survey (Feb 11, 2025) | Tags: Autonomous Vehicles, Image Captioning | Code unverified
Optimizing Knowledge Distillation in Transformers: Enabling Multi-Head Attention without Alignment Barriers (Feb 11, 2025) | Tags: Image Classification | Code unverified
Progressive Collaborative and Semantic Knowledge Fusion for Generative Recommendation (Feb 10, 2025) | Tags: Knowledge Distillation | Code unverified
Rationalization Models for Text-to-SQL (Feb 10, 2025) | Tags: Knowledge Distillation, Language Modeling | Code unverified
Right Time to Learn: Promoting Generalization via Bio-inspired Spacing Effect in Knowledge Distillation (Feb 10, 2025) | Tags: Knowledge Distillation | Code available
DROP: Poison Dilution via Knowledge Distillation for Federated Learning (Feb 10, 2025) | Tags: Data Poisoning, Federated Learning | Code available
Contrastive Representation Distillation via Multi-Scale Feature Decoupling (Feb 9, 2025) | Tags: Knowledge Distillation, Transfer Learning | Code unverified
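Nearly every entry above carries the Knowledge Distillation tag. As a point of reference, below is a minimal PyTorch sketch of the classic soft-target distillation objective (Hinton et al., 2015) that these papers extend in various directions; the temperature T and mixing weight alpha are illustrative choices, not values taken from any listed paper.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft-target term: KL divergence between temperature-scaled
    # teacher and student distributions. Scaling by T*T keeps gradient
    # magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy against ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    # Blend the two terms; alpha trades off imitation vs. supervision.
    return alpha * soft + (1.0 - alpha) * hard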