Feature Kernel Distillation Sep 29, 2021 Image Classification
Feature-map-level Online Adversarial Knowledge Distillation Feb 5, 2020 Knowledge Distillation
Feature-Rich Audio Model Inversion for Data-Free Knowledge Distillation Towards General Sound Classification Mar 14, 2023 Data-free Knowledge Distillation Knowledge Distillation
Feature Structure Distillation for BERT Transferring Nov 16, 2021 Knowledge Distillation
FedAL: Black-Box Federated Knowledge Distillation Enabled by Adversarial Learning Nov 28, 2023 Knowledge Distillation Transfer Learning
FedD2S: Personalized Data-Free Federated Knowledge Distillation Feb 16, 2024 Data-free Knowledge Distillation Fairness
FedDKD: Federated Learning with Decentralized Knowledge Distillation May 2, 2022 Federated Learning Knowledge Distillation
FedDTG: Federated Data-Free Knowledge Distillation via Three-Player Generative Adversarial Networks Jan 10, 2022 Data-free Knowledge Distillation Federated Learning
FedED: Federated Learning via Ensemble Distillation for Medical Relation Extraction Nov 1, 2020 Federated Learning Knowledge Distillation
FedEFM: Federated Endovascular Foundation Model with Unseen Data Jan 28, 2025 Federated Learning Knowledge Distillation
Federated Action Recognition on Heterogeneous Embedded Devices Jul 18, 2021 Action Recognition Federated Learning
Federated Bayesian Neural Regression: A Scalable Global Federated Gaussian Process Jun 13, 2022 Federated Learning Knowledge Distillation
Federated Deconfounding and Debiasing Learning for Out-of-Distribution Generalization May 8, 2025 Attribute Benchmarking
Federated Distillation: A Survey Apr 2, 2024 Federated Learning Knowledge Distillation
Federated Ensemble Model-based Reinforcement Learning in Edge Computing Sep 12, 2021 Autonomous Driving Continuous Control
Federated Fine-Tuning of LLMs: Framework Comparison and Research Directions Jan 8, 2025 Federated Learning Knowledge Distillation
Federated Graph Learning with Graphless Clients Nov 13, 2024 Graph Learning Knowledge Distillation
Federated Knowledge Transfer Fine-tuning Large Server Model with Resource-Constrained IoT Clients Jul 7, 2024 Federated Learning Knowledge Distillation
Federated Learning for Data and Model Heterogeneity in Medical Imaging Jul 31, 2023 Federated Learning Knowledge Distillation
Federated Learning on Non-iid Data via Local and Global Distillation Jun 26, 2023 Federated Learning Knowledge Distillation
Federated Learning with Privacy-Preserving Ensemble Attention Distillation Oct 16, 2022 Federated Learning Image Classification
Federated One-Shot Learning with Data Privacy and Objective-Hiding Apr 29, 2025 Federated Learning Information Retrieval
Federated Semi-Supervised Domain Adaptation via Knowledge Transfer Jul 21, 2022 Domain Adaptation Federated Learning
Federated Unlearning with Knowledge Distillation Jan 24, 2022 Federated Learning Knowledge Distillation
FedKD: Communication Efficient Federated Learning via Knowledge Distillation Aug 30, 2021 Federated Learning Knowledge Distillation
Exploiting Label Skewness for Spiking Neural Networks in Federated Learning Dec 23, 2024 Federated Learning Knowledge Distillation
FedQUIT: On-Device Federated Unlearning via a Quasi-Competent Virtual Teacher Aug 14, 2024 Federated Learning Knowledge Distillation
FedRAD: Federated Robust Adaptive Distillation Dec 2, 2021 Federated Learning Knowledge Distillation
FedSDD: Scalable and Diversity-enhanced Distillation for Model Aggregation in Federated Learning Dec 28, 2023 Diversity Federated Learning
FedSKD: Aggregation-free Model-heterogeneous Federated Learning using Multi-dimensional Similarity Knowledge Distillation Mar 23, 2025 Federated Learning Knowledge Distillation
FedSPLIT: One-Shot Federated Recommendation System Based on Non-negative Joint Matrix Factorization and Knowledge Distillation May 4, 2022 Collaborative Filtering Federated Learning
FedTAD: Topology-aware Data-free Knowledge Distillation for Subgraph Federated Learning Apr 22, 2024 Data-free Knowledge Distillation Federated Learning
FedUD: Exploiting Unaligned Data for Cross-Platform Federated Click-Through Rate Prediction Jul 26, 2024 Click-Through Rate Prediction Federated Learning
FEED: Feature-level Ensemble Effect for Knowledge Distillation May 1, 2019 Knowledge Distillation Transfer Learning
FEED: Feature-level Ensemble for Knowledge Distillation Sep 24, 2019 Knowledge Distillation
Few-shot 3D LiDAR Semantic Segmentation for Autonomous Driving Feb 17, 2023 Autonomous Driving Few-Shot Learning
Few-shot Face Image Translation via GAN Prior Distillation Jan 28, 2023 Knowledge Distillation Translation
Few-shot learning of neural networks from scratch by pseudo example optimization Feb 8, 2018 Few-Shot Learning Knowledge Distillation
Few-Shot Object Detection by Knowledge Distillation Using Bag-of-Visual-Words Representations Jul 25, 2022 Few-Shot Object Detection Knowledge Distillation
FGAD: Self-boosted Knowledge Distillation for An Effective Federated Graph Anomaly Detection Framework Feb 20, 2024 Anomaly Detection Federated Learning
A methodology for training homomorphic encryption friendly neural networks Nov 5, 2021 Knowledge Distillation Privacy Preserving
FiGKD: Fine-Grained Knowledge Distillation via High-Frequency Detail Transfer May 17, 2025 Fine-Grained Visual Recognition Knowledge Distillation
Fine-Grained Distillation for Long Document Retrieval Dec 20, 2022 Knowledge Distillation Retrieval
Fine-grained Image Retrieval via Dual-Vision Adaptation Jun 19, 2025 Image Retrieval Knowledge Distillation
Fine-tune Before Structured Pruning: Towards Compact and Accurate Self-Supervised Models for Speaker Diarization May 30, 2025 GPU Knowledge Distillation
Fine-tuning a Multiple Instance Learning Feature Extractor with Masked Context Modelling and Knowledge Distillation Mar 8, 2024 Image Generation Knowledge Distillation
Boosting Pruned Networks with Linear Over-parameterization Apr 25, 2022 Knowledge Distillation
Fixing the Teacher-Student Knowledge Discrepancy in Distillation Mar 31, 2021 Image Classification
FLAR: A Unified Prototype Framework for Few-Sample Lifelong Active Recognition Jan 1, 2021 Knowledge Distillation Lifelong Learning
FlyKD: Graph Knowledge Distillation on the Fly with Curriculum Learning Mar 16, 2024 Knowledge Distillation