- GOVERN: Gradient Orientation Vote Ensemble for Multi-Teacher Reinforced Distillation (May 6, 2024). Tags: Knowledge Distillation, Question Answering
- Enhancing CTC-Based Visual Speech Recognition (Sep 11, 2024). Tags: Automatic Speech Recognition (ASR)
- Compressing Recurrent Neural Networks for FPGA-accelerated Implementation in Fluorescence Lifetime Imaging (Oct 1, 2024). Tags: Computational Efficiency, Knowledge Distillation
- Feature Adversarial Distillation for Point Cloud Classification (Jun 25, 2023). Tags: Classification, FAD
- Feature Affinity Assisted Knowledge Distillation and Quantization of Deep Neural Networks on Label-Free Data (Feb 10, 2023). Tags: Knowledge Distillation, Quantization
- Feature Alignment and Representation Transfer in Knowledge Distillation for Large Language Models (Apr 18, 2025). Tags: Image Classification
- Feature Alignment-Based Knowledge Distillation for Efficient Compression of Large Language Models (Dec 27, 2024). Tags: Knowledge Distillation, Model Compression
- Feature-Align Network with Knowledge Distillation for Efficient Denoising (Mar 2, 2021). Tags: Decoder, Denoising
- Feature-domain Adaptive Contrastive Distillation for Efficient Single Image Super-Resolution (Nov 29, 2022). Tags: Image Super-Resolution, Knowledge Distillation
- Feature-based One-For-All: A Universal Framework for Heterogeneous Knowledge Distillation (Jan 15, 2025). Tags: Knowledge Distillation
- Feature Correlation-guided Knowledge Transfer for Federated Self-supervised Learning (Nov 14, 2022). Tags: Feature Correlation, Federated Learning
- Feature Distillation is the Better Choice for Model-Heterogeneous Federated Learning (Jul 14, 2025). Tags: Federated Learning, Knowledge Distillation
- Enhancing Content Representation for AR Image Quality Assessment Using Knowledge Distillation (Dec 8, 2024). Tags: Image Quality Assessment, Knowledge Distillation
- Correlation-Decoupled Knowledge Distillation for Multimodal Sentiment Analysis with Incomplete Modalities (Apr 25, 2024). Tags: Disentanglement, Knowledge Distillation
- Feature Interaction Fusion Self-Distillation Network For CTR Prediction (Nov 12, 2024). Tags: Click-Through Rate Prediction, Knowledge Distillation
- Feature Kernel Distillation (Sep 29, 2021). Tags: Image Classification
- Enhancing Chinese Multi-Label Text Classification Performance with Response-based Knowledge Distillation (Nov 1, 2022). Tags: Knowledge Distillation, Multi-Label Text Classification
- A Technical Study into Small Reasoning Language Models (Jun 16, 2025). Tags: Code Generation, Computational Efficiency
- Cost-effective Deployment of BERT Models in Serverless Environment (Mar 19, 2021). Tags: Knowledge Distillation, Semantic Textual Similarity
- Adapting OC20-trained EquiformerV2 Models for High-Entropy Materials (Mar 14, 2024). Tags: Knowledge Distillation
- Feature-Rich Audio Model Inversion for Data-Free Knowledge Distillation Towards General Sound Classification (Mar 14, 2023). Tags: Data-free Knowledge Distillation, Knowledge Distillation
- Feature Structure Distillation for BERT Transferring (Nov 16, 2021). Tags: Knowledge Distillation
- Enhancing Adversarial Training with Prior Knowledge Distillation for Robust Image Compression (Mar 11, 2024). Tags: Backdoor Attack, Image Compression
- Compressing Image-to-Image Translation GANs Using Local Density Structures on Their Learned Manifold (Dec 22, 2023). Tags: Density Estimation, Image-to-Image Translation
- Compressing GANs using Knowledge Distillation (Feb 1, 2019). Tags: Knowledge Distillation, Super-Resolution
- FedAL: Black-Box Federated Knowledge Distillation Enabled by Adversarial Learning (Nov 28, 2023). Tags: Knowledge Distillation, Transfer Learning
- CoT-Drive: Efficient Motion Forecasting for Autonomous Driving with LLMs and Chain-of-Thought Prompting (Mar 10, 2025). Tags: Autonomous Driving, Knowledge Distillation
- Enhancing Action Recognition from Low-Quality Skeleton Data via Part-Level Knowledge Distillation (Apr 28, 2024). Tags: Action Recognition, General Knowledge
- A Generalized and Robust Method Towards Practical Gaze Estimation on Smart Phone (Oct 16, 2019). Tags: Gaze Estimation, Knowledge Distillation
- GOLD: Generalized Knowledge Distillation via Out-of-Distribution-Guided Language Data Generation (Mar 28, 2024). Tags: Data-free Knowledge Distillation, Knowledge Distillation
- Enhancing Accuracy and Parameter-Efficiency of Neural Representations for Network Parameterization (Jun 29, 2024). Tags: Knowledge Distillation
- Enhancing Abstractiveness of Summarization Models through Calibrated Distillation (Oct 20, 2023). Tags: Abstractive Text Summarization, Informativeness
- FedDKD: Federated Learning with Decentralized Knowledge Distillation (May 2, 2022). Tags: Federated Learning, Knowledge Distillation
- FedDTG: Federated Data-Free Knowledge Distillation via Three-Player Generative Adversarial Networks (Jan 10, 2022). Tags: Data-free Knowledge Distillation, Federated Learning
- CourseGPT-zh: an Educational Large Language Model Based on Knowledge Distillation Incorporating Prompt Optimization (May 8, 2024). Tags: Diversity, Knowledge Distillation
- FedED: Federated Learning via Ensemble Distillation for Medical Relation Extraction (Nov 1, 2020). Tags: Federated Learning, Knowledge Distillation
- FedEFM: Federated Endovascular Foundation Model with Unseen Data (Jan 28, 2025). Tags: Federated Learning, Knowledge Distillation
- Federated Action Recognition on Heterogeneous Embedded Devices (Jul 18, 2021). Tags: Action Recognition, Federated Learning
- Federated Bayesian Neural Regression: A Scalable Global Federated Gaussian Process (Jun 13, 2022). Tags: Federated Learning, Knowledge Distillation
- Federated Deconfounding and Debiasing Learning for Out-of-Distribution Generalization (May 8, 2025). Tags: Attribute, Benchmarking
- Compressing Deep Image Super-resolution Models (Dec 31, 2023). Tags: Image Super-Resolution, Knowledge Distillation
- Gradient Adversarial Training of Neural Networks (Jun 21, 2018). Tags: BIG-bench Machine Learning, Binary Classification
- Enhanced Sparsification via Stimulative Training (Mar 11, 2024). Tags: Knowledge Distillation, Model Compression
- Federated Fine-Tuning of LLMs: Framework Comparison and Research Directions (Jan 8, 2025). Tags: Federated Learning, Knowledge Distillation
- Federated Graph Learning with Graphless Clients (Nov 13, 2024). Tags: Graph Learning, Knowledge Distillation
- CREFT: Sequential Multi-Agent LLM for Character Relation Extraction (May 30, 2025). Tags: Knowledge Distillation, Language Modeling
- Enhanced Multimodal Representation Learning with Cross-modal KD (Jun 13, 2023). Tags: Contrastive Learning, Emotion Classification
- Federated Knowledge Transfer Fine-tuning Large Server Model with Resource-Constrained IoT Clients (Jul 7, 2024). Tags: Federated Learning, Knowledge Distillation
- Federated Learning for Data and Model Heterogeneity in Medical Imaging (Jul 31, 2023). Tags: Federated Learning, Knowledge Distillation
- Compressed Meta-Optical Encoder for Image Classification (Apr 23, 2024). Tags: Classification, Image Classification