- AlignCap: Aligning Speech Emotion Captioning to Human Preferences (Oct 24, 2024). Tags: Knowledge Distillation, Language Modeling.
- Knowledge Distillation Using Frontier Open-source LLMs: Generalizability and the Role of Synthetic Data (Oct 24, 2024). Tags: Knowledge Distillation, Natural Language Understanding.
- SIKeD: Self-guided Iterative Knowledge Distillation for mathematical reasoning (Oct 24, 2024). Tags: Knowledge Distillation, Mathematical Reasoning.
- High-dimensional Analysis of Knowledge Distillation: Weak-to-Strong Generalization and Scaling Laws (Oct 24, 2024). Tags: Knowledge Distillation, Regression. [Code available]
- Towards Active Participant-Centric Vertical Federated Learning: Some Representations May Be All You Need (Oct 23, 2024). Tags: Federated Learning.
- ELAICHI: Enhancing Low-resource TTS by Addressing Infrequent and Low-frequency Character Bigrams (Oct 23, 2024). Tags: Automatic Speech Recognition (ASR).
- Towards Effective Data-Free Knowledge Distillation via Diverse Diffusion Augmentation (Oct 23, 2024). Tags: Data-free Knowledge Distillation, Diversity.
- SafetyAnalyst: Interpretable, Transparent, and Steerable Safety Moderation for AI Behavior (Oct 22, 2024). Tags: Knowledge Distillation. [Code available]
- AttriPrompter: Auto-Prompting with Attribute Semantics for Zero-shot Nuclei Detection via Visual-Language Pre-trained Models (Oct 22, 2024). Tags: Attribute, Knowledge Distillation.
- CK4Gen: A Knowledge Distillation Framework for Generating High-Utility Synthetic Survival Datasets in Healthcare (Oct 22, 2024). Tags: Data Augmentation, Knowledge Distillation. [Code available]
- MiniPLM: Knowledge Distillation for Pre-Training Language Models (Oct 22, 2024). Tags: Diversity, Knowledge Distillation.
- Model Mimic Attack: Knowledge Distillation for Provably Transferable Adversarial Examples (Oct 21, 2024). Tags: Knowledge Distillation. [Code available]
- Pre-training Distillation for Large Language Models: A Design Space Exploration (Oct 21, 2024). Tags: Knowledge Distillation.
- GSSF: Generalized Structural Sparse Function for Deep Cross-modal Metric Learning (Oct 20, 2024). Tags: Image Retrieval, Image-text Retrieval.
- LLaVA-Ultra: Large Chinese Language and Vision Assistant for Ultrasound (Oct 19, 2024). Tags: Instruction Following, Knowledge Distillation. [Code available]
- Improving Pronunciation and Accent Conversion through Knowledge Distillation And Synthetic Ground-Truth from Native TTS (Oct 19, 2024). Tags: Knowledge Distillation.
- Interpreting Microbiome Relative Abundance Data Using Symbolic Regression (Oct 18, 2024). Tags: Diagnostic, Knowledge Distillation.
- DiSCo: LLM Knowledge Distillation for Efficient Sparse Retrieval in Conversational Search (Oct 18, 2024). Tags: Conversational Information Access, Conversational Search. [Code available]
- Preview-based Category Contrastive Learning for Knowledge Distillation (Oct 18, 2024). Tags: Contrastive Learning, Knowledge Distillation. [Code available]
- Unlearning Backdoor Attacks for LLMs with Weak-to-Strong Knowledge Distillation (Oct 18, 2024). Tags: Backdoor Attack, Knowledge Distillation.
- CAKD: A Correlation-Aware Knowledge Distillation Framework Based on Decoupling Kullback-Leibler Divergence (Oct 17, 2024). Tags: Binary Classification, Knowledge Distillation. [Code available]
- FTSmartAudit: A Knowledge Distillation-Enhanced Framework for Automated Smart Contract Auditing Using Fine-Tuned LLMs (Oct 17, 2024). Tags: Dataset Generation, Knowledge Distillation.
- An Active Learning Framework for Inclusive Generation by Large Language Models (Oct 17, 2024). Tags: Active Learning, Clustering.
- Towards Satellite Non-IID Imagery: A Spectral Clustering-Assisted Federated Learning Approach (Oct 17, 2024). Tags: Earth Observation, Federated Learning.
- Proactive Detection and Calibration of Seasonal Advertisements with Multimodal Large Language Models (Oct 16, 2024). Tags: Knowledge Distillation.
- TransAgent: Transfer Vision-Language Foundation Models with Heterogeneous Agent Collaboration (Oct 16, 2024). Tags: Knowledge Distillation, Transfer Learning.
- Optimizing YOLOv5s Object Detection through Knowledge Distillation algorithm (Oct 16, 2024). Tags: Knowledge Distillation, Object. [Code available]
- SAM-Guided Masked Token Prediction for 3D Scene Understanding (Oct 16, 2024). Tags: 3D Object Detection, Knowledge Distillation.
- TAS: Distilling Arbitrary Teacher and Student via a Hybrid Assistant (Oct 16, 2024). Tags: Knowledge Distillation, Transfer Learning.
- MoE-Pruner: Pruning Mixture-of-Experts Large Language Model using the Hints from Its Router (Oct 15, 2024). Tags: Knowledge Distillation, Language Modeling.
- Learning from Imperfect Data: Towards Efficient Knowledge Distillation of Autoregressive Language Models for Text-to-SQL (Oct 15, 2024). Tags: Knowledge Distillation, Text to SQL.
- Breaking Modality Gap in RGBT Tracking: Coupled Knowledge Distillation (Oct 15, 2024). Tags: Knowledge Distillation, RGB-T Tracking.
- Speculative Knowledge Distillation: Bridging the Teacher-Student Gap Through Interleaved Sampling (Oct 15, 2024). Tags: Instruction Following, Knowledge Distillation. [Code available]
- Temperature-Centric Investigation of Speculative Decoding with Knowledge Distillation (Oct 14, 2024). Tags: Knowledge Distillation.
- REHRSeg: Unleashing the Power of Self-Supervised Super-Resolution for Resource-Efficient 3D MRI Segmentation (Oct 14, 2024). Tags: Knowledge Distillation, Medical Image Analysis. [Code available]
- ROSAR: An Adversarial Re-Training Framework for Robust Side-Scan Sonar Object Detection (Oct 14, 2024). Tags: Knowledge Distillation, Object Detection. [Code available]
- Large Model for Small Data: Foundation Model for Cross-Modal RF Human Activity Recognition (Oct 13, 2024). Tags: Activity Recognition, Few-Shot Learning. [Code available]
- Distilling Invariant Representations with Dual Augmentation (Oct 12, 2024). Tags: Knowledge Distillation.
- Declarative Knowledge Distillation from Large Language Models for Visual Question Answering Datasets (Oct 12, 2024). Tags: Knowledge Distillation, Question Answering.
- Mentor-KD: Making Small Language Models Better Multi-step Reasoners (Oct 11, 2024). Tags: Knowledge Distillation. [Code available]
- Transforming In-Vehicle Network Intrusion Detection: VAE-based Knowledge Distillation Meets Explainable AI (Oct 11, 2024). Tags: Autonomous Vehicles, Intrusion Detection. [Code available]
- GAI-Enabled Explainable Personalized Federated Semi-Supervised Learning (Oct 11, 2024). Tags: Federated Learning, Knowledge Distillation.
- Simultaneous Reward Distillation and Preference Learning: Get You a Language Model Who Can Do Both (Oct 11, 2024). Tags: Knowledge Distillation, Language Modeling.
- What is Left After Distillation? How Knowledge Transfer Impacts Fairness and Bias (Oct 10, 2024). Tags: Age/Unbiased, Fairness.
- Relational Diffusion Distillation for Efficient Image Generation (Oct 10, 2024). Tags: Image Generation, Knowledge Distillation.
- SNN-PAR: Energy Efficient Pedestrian Attribute Recognition via Spiking Neural Networks (Oct 10, 2024). Tags: Attribute, Knowledge Distillation. [Code available]
- A Lightweight Target-Driven Network of Stereo Matching for Inland Waterways (Oct 10, 2024). Tags: Autonomous Navigation, Knowledge Distillation.
- Unlocking Real-Time Fluorescence Lifetime Imaging: Multi-Pixel Parallelism for FPGA-Accelerated Processing (Oct 9, 2024). Tags: Knowledge Distillation, Scheduling. [Code available]
- S2HPruner: Soft-to-Hard Distillation Bridges the Discretization Gap in Pruning (Oct 9, 2024). Tags: Knowledge Distillation.
- Structure-Centric Robust Monocular Depth Estimation via Knowledge Distillation (Oct 9, 2024). Tags: Depth Estimation, Knowledge Distillation.