Improving Pronunciation and Accent Conversion through Knowledge Distillation And Synthetic Ground-Truth from Native TTS | Oct 19, 2024 | Knowledge Distillation
DiSCo: LLM Knowledge Distillation for Efficient Sparse Retrieval in Conversational Search | Oct 18, 2024 | Conversational Information Access, Conversational Search
[Code Available] Interpreting Microbiome Relative Abundance Data Using Symbolic Regression | Oct 18, 2024 | Diagnostic, Knowledge Distillation
[Code Available] Unlearning Backdoor Attacks for LLMs with Weak-to-Strong Knowledge Distillation | Oct 18, 2024 | Backdoor Attack, Knowledge Distillation
[Code Available] Preview-based Category Contrastive Learning for Knowledge Distillation | Oct 18, 2024 | Contrastive Learning, Knowledge Distillation
An Active Learning Framework for Inclusive Generation by Large Language Models | Oct 17, 2024 | Active Learning, Clustering
Towards Satellite Non-IID Imagery: A Spectral Clustering-Assisted Federated Learning Approach | Oct 17, 2024 | Earth Observation, Federated Learning
FTSmartAudit: A Knowledge Distillation-Enhanced Framework for Automated Smart Contract Auditing Using Fine-Tuned LLMs | Oct 17, 2024 | Dataset Generation, Knowledge Distillation
CAKD: A Correlation-Aware Knowledge Distillation Framework Based on Decoupling Kullback-Leibler Divergence | Oct 17, 2024 | Binary Classification, Knowledge Distillation
Optimizing YOLOv5s Object Detection through Knowledge Distillation algorithm | Oct 16, 2024 | Knowledge Distillation, Object
TAS: Distilling Arbitrary Teacher and Student via a Hybrid Assistant | Oct 16, 2024 | Knowledge Distillation, Transfer Learning
SAM-Guided Masked Token Prediction for 3D Scene Understanding | Oct 16, 2024 | 3D Object Detection, Knowledge Distillation
Proactive Detection and Calibration of Seasonal Advertisements with Multimodal Large Language Models | Oct 16, 2024 | Knowledge Distillation
Learning from Imperfect Data: Towards Efficient Knowledge Distillation of Autoregressive Language Models for Text-to-SQL | Oct 15, 2024 | Knowledge Distillation, Text to SQL
Speculative Knowledge Distillation: Bridging the Teacher-Student Gap Through Interleaved Sampling | Oct 15, 2024 | Instruction Following, Knowledge Distillation
MoE-Pruner: Pruning Mixture-of-Experts Large Language Model using the Hints from Its Router | Oct 15, 2024 | Knowledge Distillation, Language Modeling
Temperature-Centric Investigation of Speculative Decoding with Knowledge Distillation | Oct 14, 2024 | Knowledge Distillation
[Code Available] REHRSeg: Unleashing the Power of Self-Supervised Super-Resolution for Resource-Efficient 3D MRI Segmentation | Oct 14, 2024 | Knowledge Distillation, Medical Image Analysis
[Code Available] ROSAR: An Adversarial Re-Training Framework for Robust Side-Scan Sonar Object Detection | Oct 14, 2024 | Knowledge Distillation, Object Detection
[Code Available] Large Model for Small Data: Foundation Model for Cross-Modal RF Human Activity Recognition | Oct 13, 2024 | Activity Recognition, Few-Shot Learning
Declarative Knowledge Distillation from Large Language Models for Visual Question Answering Datasets | Oct 12, 2024 | Knowledge Distillation, Question Answering
[Code Available] Distilling Invariant Representations with Dual Augmentation | Oct 12, 2024 | Knowledge Distillation
Simultaneous Reward Distillation and Preference Learning: Get You a Language Model Who Can Do Both | Oct 11, 2024 | Knowledge Distillation, Language Modeling
GAI-Enabled Explainable Personalized Federated Semi-Supervised Learning | Oct 11, 2024 | Federated Learning, Knowledge Distillation
Transforming In-Vehicle Network Intrusion Detection: VAE-based Knowledge Distillation Meets Explainable AI | Oct 11, 2024 | Autonomous Vehicles, Intrusion Detection
A Lightweight Target-Driven Network of Stereo Matching for Inland Waterways | Oct 10, 2024 | Autonomous Navigation, Knowledge Distillation
[Code Available] Relational Diffusion Distillation for Efficient Image Generation | Oct 10, 2024 | Image Generation, Knowledge Distillation
[Code Available] What is Left After Distillation? How Knowledge Transfer Impacts Fairness and Bias | Oct 10, 2024 | Age/Unbiased Fairness
SNN-PAR: Energy Efficient Pedestrian Attribute Recognition via Spiking Neural Networks | Oct 10, 2024 | Attribute, Knowledge Distillation
Unlocking Real-Time Fluorescence Lifetime Imaging: Multi-Pixel Parallelism for FPGA-Accelerated Processing | Oct 9, 2024 | Knowledge Distillation, Scheduling
Efficient and Robust Knowledge Distillation from A Stronger Teacher Based on Correlation Matching | Oct 9, 2024 | Knowledge Distillation, Neural Network Compression
S2HPruner: Soft-to-Hard Distillation Bridges the Discretization Gap in Pruning | Oct 9, 2024 | Knowledge Distillation
Structure-Centric Robust Monocular Depth Estimation via Knowledge Distillation | Oct 9, 2024 | Depth Estimation, Knowledge Distillation
KnowledgeSG: Privacy-Preserving Synthetic Text Generation with Knowledge Distillation from Server | Oct 8, 2024 | Federated Learning, Knowledge Distillation
[Code Available] ReasoningRank: Teaching Student Models to Rank through Reasoning-Based Knowledge Distillation | Oct 7, 2024 | Decision Making, Information Retrieval
Progressive distillation induces an implicit curriculum | Oct 7, 2024 | Knowledge Distillation
DAdEE: Unsupervised Domain Adaptation in Early Exit PLMs | Oct 6, 2024 | Domain Adaptation, Knowledge Distillation
[Code Available] CAPEEN: Image Captioning with Early Exits and Knowledge Distillation | Oct 6, 2024 | Descriptive, Image Captioning
[Code Available] DiDOTS: Knowledge Distillation from Large-Language-Models for Dementia Obfuscation in Transcribed Speech | Oct 5, 2024 | Hallucination, Knowledge Distillation
Accelerating Diffusion Models with One-to-Many Knowledge Distillation | Oct 5, 2024 | Image Generation, Knowledge Distillation
Gap Preserving Distillation by Building Bidirectional Mappings with A Dynamic Teacher | Oct 5, 2024 | Knowledge Distillation
Self-Supervised Keypoint Detection with Distilled Depth Keypoint Representation | Oct 4, 2024 | Keypoint Detection, Knowledge Distillation
DocKD: Knowledge Distillation from LLMs for Open-World Document Understanding Models | Oct 4, 2024 | Document Understanding, Knowledge Distillation
Dataset Distillation via Knowledge Distillation: Towards Efficient Self-Supervised Pre-Training of Deep Networks | Oct 3, 2024 | Dataset Distillation, Knowledge Distillation
[Code Available] BLEND: Behavior-guided Neural Population Dynamics Modeling via Privileged Knowledge Distillation | Oct 2, 2024 | Knowledge Distillation, Time Series Analysis
[Code Available] Foldable SuperNets: Scalable Merging of Transformers with Different Initializations and Tasks | Oct 2, 2024 | Knowledge Distillation
[Code Available] PHI-S: Distribution Balancing for Label-Free Multi-Teacher Distillation | Oct 2, 2024 | Knowledge Distillation
"No Matter What You Do": Purifying GNN Models via Backdoor Unlearning | Oct 2, 2024 | Backdoor Attack, Backdoor Defense
[Code Available] AMR-Evol: Adaptive Modular Response Evolution Elicits Better Knowledge Distillation for Large Language Models in Code Generation | Oct 1, 2024 | Code Generation, HumanEval
[Code Available] Self-Updatable Large Language Models with Parameter Integration | Oct 1, 2024 | Continual Learning, Conversational Recommendation