Fast Real-time Personalized Speech Enhancement: End-to-End Enhancement Network (E3Net) and Knowledge Distillation (Apr 2, 2022). Tags: Automatic Speech Recognition, Automatic Speech Recognition (ASR)
Fast Sampling Through The Reuse Of Attention Maps In Diffusion Models (Dec 13, 2023). Tags: Image Generation, Knowledge Distillation
FastSR-NeRF: Improving NeRF Efficiency on Consumer Devices with A Simple Super-Resolution Pipeline (Dec 15, 2023). Tags: GPU, Knowledge Distillation
Fast Streaming Transducer ASR Prototyping via Knowledge Distillation with Whisper (Sep 20, 2024). Tags: Automatic Speech Recognition, Automatic Speech Recognition (ASR)
Fast Video Salient Object Detection via Spatiotemporal Knowledge Distillation (Oct 20, 2020). Tags: Knowledge Distillation, Object
Feature Adversarial Distillation for Point Cloud Classification (Jun 25, 2023). Tags: Classification, FAD
Feature Affinity Assisted Knowledge Distillation and Quantization of Deep Neural Networks on Label-Free Data (Feb 10, 2023). Tags: Knowledge Distillation, Quantization
Feature Alignment and Representation Transfer in Knowledge Distillation for Large Language Models (Apr 18, 2025). Tags: image-classification, Image Classification
Feature Alignment-Based Knowledge Distillation for Efficient Compression of Large Language Models (Dec 27, 2024). Tags: Knowledge Distillation, Model Compression
Feature-Align Network with Knowledge Distillation for Efficient Denoising (Mar 2, 2021). Tags: Decoder, Denoising
Feature-domain Adaptive Contrastive Distillation for Efficient Single Image Super-Resolution (Nov 29, 2022). Tags: Image Super-Resolution, Knowledge Distillation
Feature-based One-For-All: A Universal Framework for Heterogeneous Knowledge Distillation (Jan 15, 2025). Tags: All, Knowledge Distillation
Feature Correlation-guided Knowledge Transfer for Federated Self-supervised Learning (Nov 14, 2022). Tags: Feature Correlation, Federated Learning
Feature Distillation is the Better Choice for Model-Heterogeneous Federated Learning (Jul 14, 2025). Tags: Federated Learning, Knowledge Distillation
Feature Fusion and Knowledge-Distilled Multi-Modal Multi-Target Detection (May 31, 2025). Tags: Domain Adaptation, Knowledge Distillation
Feature Interaction Fusion Self-Distillation Network For CTR Prediction (Nov 12, 2024). Tags: Click-Through Rate Prediction, Knowledge Distillation
Feature Kernel Distillation (Sep 29, 2021). Tags: image-classification, Image Classification
Feature-map-level Online Adversarial Knowledge Distillation (Feb 5, 2020). Tags: Knowledge Distillation
Feature-Rich Audio Model Inversion for Data-Free Knowledge Distillation Towards General Sound Classification (Mar 14, 2023). Tags: Data-free Knowledge Distillation, Knowledge Distillation
Feature Structure Distillation for BERT Transferring (Nov 16, 2021). Tags: Knowledge Distillation
FedAL: Black-Box Federated Knowledge Distillation Enabled by Adversarial Learning (Nov 28, 2023). Tags: Knowledge Distillation, Transfer Learning
FedD2S: Personalized Data-Free Federated Knowledge Distillation (Feb 16, 2024). Tags: Data-free Knowledge Distillation, Fairness
FedDKD: Federated Learning with Decentralized Knowledge Distillation (May 2, 2022). Tags: Federated Learning, Knowledge Distillation
FedDTG: Federated Data-Free Knowledge Distillation via Three-Player Generative Adversarial Networks (Jan 10, 2022). Tags: Data-free Knowledge Distillation, Federated Learning
FedED: Federated Learning via Ensemble Distillation for Medical Relation Extraction (Nov 1, 2020). Tags: Federated Learning, Knowledge Distillation
FedEFM: Federated Endovascular Foundation Model with Unseen Data (Jan 28, 2025). Tags: Federated Learning, Knowledge Distillation
Federated Action Recognition on Heterogeneous Embedded Devices (Jul 18, 2021). Tags: Action Recognition, Federated Learning
Federated Bayesian Neural Regression: A Scalable Global Federated Gaussian Process (Jun 13, 2022). Tags: Federated Learning, Knowledge Distillation
Federated Deconfounding and Debiasing Learning for Out-of-Distribution Generalization (May 8, 2025). Tags: Attribute, Benchmarking
Federated Distillation: A Survey (Apr 2, 2024). Tags: Federated Learning, Knowledge Distillation
Federated Ensemble Model-based Reinforcement Learning in Edge Computing (Sep 12, 2021). Tags: Autonomous Driving, continuous-control
Federated Fine-Tuning of LLMs: Framework Comparison and Research Directions (Jan 8, 2025). Tags: Federated Learning, Knowledge Distillation
Federated Graph Learning with Graphless Clients (Nov 13, 2024). Tags: Graph Learning, Knowledge Distillation
Federated Knowledge Transfer Fine-tuning Large Server Model with Resource-Constrained IoT Clients (Jul 7, 2024). Tags: Federated Learning, Knowledge Distillation
Federated Learning for Data and Model Heterogeneity in Medical Imaging (Jul 31, 2023). Tags: Federated Learning, Knowledge Distillation
Federated Learning on Non-iid Data via Local and Global Distillation (Jun 26, 2023). Tags: Federated Learning, Knowledge Distillation
Federated Learning with Privacy-Preserving Ensemble Attention Distillation (Oct 16, 2022). Tags: Federated Learning, image-classification
Federated One-Shot Learning with Data Privacy and Objective-Hiding (Apr 29, 2025). Tags: Federated Learning, Information Retrieval
Federated Semi-Supervised Domain Adaptation via Knowledge Transfer (Jul 21, 2022). Tags: Domain Adaptation, Federated Learning
Federated Unlearning with Knowledge Distillation (Jan 24, 2022). Tags: Federated Learning, Knowledge Distillation
FedKD: Communication Efficient Federated Learning via Knowledge Distillation (Aug 30, 2021). Tags: Federated Learning, Knowledge Distillation
Exploiting Label Skewness for Spiking Neural Networks in Federated Learning (Dec 23, 2024). Tags: Federated Learning, Knowledge Distillation
FedQUIT: On-Device Federated Unlearning via a Quasi-Competent Virtual Teacher (Aug 14, 2024). Tags: Federated Learning, Knowledge Distillation
FedRAD: Federated Robust Adaptive Distillation (Dec 2, 2021). Tags: Federated Learning, Knowledge Distillation
FedSDD: Scalable and Diversity-enhanced Distillation for Model Aggregation in Federated Learning (Dec 28, 2023). Tags: Diversity, Federated Learning
FedSKD: Aggregation-free Model-heterogeneous Federated Learning using Multi-dimensional Similarity Knowledge Distillation (Mar 23, 2025). Tags: Federated Learning, Knowledge Distillation
FedSPLIT: One-Shot Federated Recommendation System Based on Non-negative Joint Matrix Factorization and Knowledge Distillation (May 4, 2022). Tags: Collaborative Filtering, Federated Learning
FedTAD: Topology-aware Data-free Knowledge Distillation for Subgraph Federated Learning (Apr 22, 2024). Tags: Data-free Knowledge Distillation, Federated Learning
FedUD: Exploiting Unaligned Data for Cross-Platform Federated Click-Through Rate Prediction (Jul 26, 2024). Tags: Click-Through Rate Prediction, Federated Learning
FEED: Feature-level Ensemble Effect for Knowledge Distillation (May 1, 2019). Tags: Knowledge Distillation, Transfer Learning