Entries are listed as: title | date | tags. [Code] marks entries with code available.

Single-stage TTS with Masked Audio Token Modeling and Semantic Knowledge Distillation | Sep 17, 2024 | Knowledge Distillation, Speech Synthesis
Unleashing the Potential of Mamba: Boosting a LiDAR 3D Sparse Detector by Using Cross-Model Knowledge Distillation | Sep 17, 2024 | 3D Object Detection, Autonomous Driving
Time-Series Forecasting, Knowledge Distillation, and Refinement within a Multimodal PDE Foundation Model | Sep 17, 2024 | Knowledge Distillation, Operator Learning
[Code] Human Insights Driven Latent Space for Different Driving Perspectives: A Unified Encoder for Efficient Multi-Task Inference | Sep 16, 2024 | Autonomous Driving, Knowledge Distillation
Frequency-Guided Masking for Enhanced Vision Self-Supervised Learning | Sep 16, 2024 | Few-Shot Learning, Image Classification
[Code] Integrated Multi-Level Knowledge Distillation for Enhanced Speaker Verification | Sep 14, 2024 | Knowledge Distillation, Speaker Verification
Joint Semantic Knowledge Distillation and Masked Acoustic Modeling for Full-band Speech Restoration with Improved Intelligibility | Sep 14, 2024 | Knowledge Distillation, Language Modeling
AWF: Adaptive Weight Fusion for Enhanced Class Incremental Semantic Segmentation | Sep 13, 2024 | Class-Incremental Semantic Segmentation, Knowledge Distillation
DiReDi: Distillation and Reverse Distillation for AIoT Applications | Sep 12, 2024 | Knowledge Distillation, Management
Learn from Balance: Rectifying Knowledge Transfer for Long-Tailed Scenarios | Sep 12, 2024 | Knowledge Distillation, Transfer Learning
Enhancing CTC-Based Visual Speech Recognition | Sep 11, 2024 | Automatic Speech Recognition (ASR)
DS-ViT: Dual-Stream Vision Transformer for Cross-Task Distillation in Alzheimer's Early Diagnosis | Sep 11, 2024 | Classification, Knowledge Distillation
Privacy-Preserving Federated Learning with Consistency via Knowledge Distillation Using Conditional Generator | Sep 11, 2024 | Diversity, Federated Learning
A Continual and Incremental Learning Approach for TinyML On-device Training Using Dataset Distillation and Model Size Adaption | Sep 11, 2024 | Anomaly Detection, Computational Efficiency
How Redundant Is the Transformer Stack in Speech Representation Models? | Sep 10, 2024 | Knowledge Distillation, Speaker Identification
Knowledge Distillation via Query Selection for Detection Transformer | Sep 10, 2024 | Knowledge Distillation, Object Detection
Applied Federated Model Personalisation in the Industrial Domain: A Comparative Study | Sep 10, 2024 | Active Learning, Federated Learning
Distilling Generative-Discriminative Representations for Very Low-Resolution Face Recognition | Sep 10, 2024 | Face Recognition, Knowledge Distillation
Look One and More: Distilling Hybrid Order Relational Knowledge for Cross-Resolution Image Recognition | Sep 9, 2024 | Face Recognition, Image Classification
Joint Input and Output Coordination for Class-Incremental Learning | Sep 9, 2024 | Class-Incremental Learning
FedBrain-Distill: Communication-Efficient Federated Brain Tumor Classification Using Ensemble Knowledge Distillation on Non-IID Data | Sep 9, 2024 | Brain Tumor Classification, Federated Learning
[Code] Complex Emotion Recognition System Using Basic Emotions via Facial Expression, EEG, and ECG Signals: A Review | Sep 9, 2024 | Electroencephalogram (EEG)
LoCa: Logit Calibration for Knowledge Distillation | Sep 7, 2024 | Image Classification
SCARF: Scalable Continual Learning Framework for Memory-efficient Multiple Neural Radiance Fields | Sep 6, 2024 | Continual Learning, Knowledge Distillation
Data-free Distillation with Degradation-prompt Diffusion for Multi-weather Image Restoration | Sep 5, 2024 | Image Restoration, Knowledge Distillation
Experimentation in Content Moderation using RWKV | Sep 5, 2024 | CPU, Knowledge Distillation
Low-Resolution Object Recognition with Cross-Resolution Relational Contrastive Distillation | Sep 4, 2024 | Face Recognition, Knowledge Distillation
Sorbet: A Neuromorphic Hardware-Compatible Transformer-Based Spiking Language Model | Sep 4, 2024 | Knowledge Distillation, Language Modeling
Non-target Divergence Hypothesis: Toward Understanding Domain Gaps in Cross-Modal Knowledge Distillation | Sep 4, 2024 | Knowledge Distillation
Efficient Image Compression Using Advanced State Space Models | Sep 4, 2024 | Computational Efficiency, Image Compression
Collaborative Learning for Enhanced Unsupervised Domain Adaptation | Sep 4, 2024 | Domain Adaptation, Knowledge Distillation
Adaptive Explicit Knowledge Transfer for Knowledge Distillation | Sep 3, 2024 | Knowledge Distillation, Transfer Learning
Improving Apple Object Detection with Occlusion-Enhanced Distillation | Sep 3, 2024 | Knowledge Distillation, Object
Low-Resolution Face Recognition via Adaptable Instance-Relation Distillation | Sep 3, 2024 | Face Recognition, Knowledge Distillation
Efficient Point Cloud Classification via Offline Distillation Framework and Negative-Weight Self-Distillation Technique | Sep 3, 2024 | Data Augmentation, Knowledge Distillation
Compressing VAE-Based Out-of-Distribution Detectors for Embedded Deployment | Sep 2, 2024 | CPU, GPU
HiTSR: A Hierarchical Transformer for Reference-based Super-Resolution | Aug 30, 2024 | Image Super-Resolution, Knowledge Distillation
[Code] How Knowledge Distillation Mitigates the Synthetic Gap in Fair Face Recognition | Aug 30, 2024 | Face Recognition, Fairness
[Code] MedDet: Generative Adversarial Distillation for Efficient Cervical Disc Herniation Detection | Aug 30, 2024 | Knowledge Distillation, Model Compression
[Code] MST-KD: Multiple Specialized Teachers Knowledge Distillation for Fair Face Recognition | Aug 29, 2024 | Face Recognition, Knowledge Distillation
[Code] Smaller, Weaker, Yet Better: Training LLM Reasoners via Compute-Optimal Sampling | Aug 29, 2024 | Diversity, Knowledge Distillation
VLM-KD: Knowledge Distillation from VLM for Long-Tail Visual Recognition | Aug 29, 2024 | Knowledge Distillation, Language Modeling
Boosting Lossless Speculative Decoding via Feature Sampling and Partial Alignment Distillation | Aug 28, 2024 | Knowledge Distillation, Language Modeling
Online Pre-training with Long-form Videos | Aug 28, 2024 | Action Recognition, Contrastive Learning
ModalityMirror: Improving Audio Classification in Modality Heterogeneity Federated Learning with Multimodal Distillation | Aug 28, 2024 | Audio Classification, Federated Learning
Bridging the Gap: Unpacking the Hidden Challenges in Knowledge Distillation for Online Ranking Systems | Aug 26, 2024 | Knowledge Distillation, Recommendation Systems
Let Video Teaches You More: Video-to-Image Knowledge Distillation using DEtection TRansformer for Medical Video Lesion Detection | Aug 26, 2024 | Knowledge Distillation, Lesion Detection
TSAK: Two-Stage Semantic-Aware Knowledge Distillation for Efficient Wearable Modality and Model Optimization in Manufacturing Lines | Aug 26, 2024 | Activity Recognition, Human Activity Recognition
On-Device Language Models: A Comprehensive Review | Aug 26, 2024 | Knowledge Distillation, Quantization
[Code] Bring the Power of Diffusion Model to Defect Detection | Aug 25, 2024 | Defect Detection, Denoising