Cross-Architecture Knowledge Distillation | Jul 12, 2022 | Knowledge Distillation
Enhancing Chinese Multi-Label Text Classification Performance with Response-based Knowledge Distillation | Nov 1, 2022 | Knowledge Distillation, Multi Label Text Classification
A Technical Study into Small Reasoning Language Models | Jun 16, 2025 | Code Generation, Computational Efficiency
Federated Learning with Privacy-Preserving Ensemble Attention Distillation | Oct 16, 2022 | Federated Learning, image-classification
Hard Gate Knowledge Distillation -- Leverage Calibration for Robust and Reliable Language Model | Oct 22, 2022 | Knowledge Distillation, Language Modeling
Federated Semi-Supervised Domain Adaptation via Knowledge Transfer | Jul 21, 2022 | Domain Adaptation, Federated Learning
Enhancing Adversarial Training with Prior Knowledge Distillation for Robust Image Compression | Mar 11, 2024 | Backdoor Attack, Image Compression
Compressing Image-to-Image Translation GANs Using Local Density Structures on Their Learned Manifold | Dec 22, 2023 | Density Estimation, Image-to-Image Translation
Compressing GANs using Knowledge Distillation | Feb 1, 2019 | Knowledge Distillation, Super-Resolution
Adaptive Affinity-Based Generalization For MRI Imaging Segmentation Across Resource-Limited Settings | Apr 3, 2024 | Data Integration, Knowledge Distillation
Enhancing Action Recognition from Low-Quality Skeleton Data via Part-Level Knowledge Distillation | Apr 28, 2024 | Action Recognition, General Knowledge
FedKD: Communication Efficient Federated Learning via Knowledge Distillation | Aug 30, 2021 | Federated Learning, Knowledge Distillation
A Generalized and Robust Method Towards Practical Gaze Estimation on Smart Phone | Oct 16, 2019 | Gaze Estimation, Knowledge Distillation
Handling Long-tailed Feature Distribution in AdderNets | Dec 1, 2021 | Knowledge Distillation
Enhancing Accuracy and Parameter-Efficiency of Neural Representations for Network Parameterization | Jun 29, 2024 | Knowledge Distillation
Enhancing Abstractiveness of Summarization Models through Calibrated Distillation | Oct 20, 2023 | Abstractive Text Summarization, Informativeness
Compressing Deep Image Super-resolution Models | Dec 31, 2023 | Image Super-Resolution, Knowledge Distillation
FedRAD: Federated Robust Adaptive Distillation | Dec 2, 2021 | Federated Learning, Knowledge Distillation
Hands-on Guidance for Distilling Object Detectors | Mar 26, 2021 | Knowledge Distillation, Object
FedSDD: Scalable and Diversity-enhanced Distillation for Model Aggregation in Federated Learning | Dec 28, 2023 | Diversity, Federated Learning
Cross-Level Multi-Instance Distillation for Self-Supervised Fine-Grained Visual Categorization | Jan 16, 2024 | Fine-Grained Visual Categorization, Knowledge Distillation
FedSKD: Aggregation-free Model-heterogeneous Federated Learning using Multi-dimensional Similarity Knowledge Distillation | Mar 23, 2025 | Federated Learning, Knowledge Distillation
HARD: Hard Augmentations for Robust Distillation | May 24, 2023 | Data Augmentation, Domain Generalization
FedSPLIT: One-Shot Federated Recommendation System Based on Non-negative Joint Matrix Factorization and Knowledge Distillation | May 4, 2022 | Collaborative Filtering, Federated Learning
FedTAD: Topology-aware Data-free Knowledge Distillation for Subgraph Federated Learning | Apr 22, 2024 | Data-free Knowledge Distillation, Federated Learning
FedUD: Exploiting Unaligned Data for Cross-Platform Federated Click-Through Rate Prediction | Jul 26, 2024 | Click-Through Rate Prediction, Federated Learning
Enhanced Sparsification via Stimulative Training | Mar 11, 2024 | Knowledge Distillation, Model Compression
Enhanced Multimodal Representation Learning with Cross-modal KD | Jun 13, 2023 | Contrastive Learning, Emotion Classification
FEED: Feature-level Ensemble Effect for Knowledge Distillation | May 1, 2019 | Knowledge Distillation, Transfer Learning
FEED: Feature-level Ensemble for Knowledge Distillation | Sep 24, 2019 | Knowledge Distillation
Compressed Meta-Optical Encoder for Image Classification | Apr 23, 2024 | Classification, image-classification
Energy-efficient Knowledge Distillation for Spiking Neural Networks | Jun 14, 2021 | Knowledge Distillation, Model Compression
Comprehensive Survey of Model Compression and Speed up for Vision Transformers | Apr 16, 2024 | Computational Efficiency, Edge-computing
After-Stroke Arm Paresis Detection using Kinematic Data | Nov 3, 2023 | Action Classification, Knowledge Distillation
End-to-End Speech-Translation with Knowledge Distillation: FBK@IWSLT2020 | Jun 4, 2020 | Data Augmentation, Knowledge Distillation
Cross Modal Distillation for Flood Extent Mapping | Feb 16, 2023 | Knowledge Distillation
End-to-End Speech Translation with Knowledge Distillation | Apr 17, 2019 | Knowledge Distillation, speech-recognition
Few-shot learning of neural networks from scratch by pseudo example optimization | Feb 8, 2018 | Few-Shot Learning, Knowledge Distillation
Comprehensive Study on Performance Evaluation and Optimization of Model Compression: Bridging Traditional Deep Learning and Large Language Models | Jul 22, 2024 | Deep Learning, image-classification
End-to-End Simultaneous Speech Translation with Pretraining and Distillation: Huawei Noah's System for AutoSimTranS 2022 | Jul 1, 2022 | Decoder, Knowledge Distillation
FGAD: Self-boosted Knowledge Distillation for An Effective Federated Graph Anomaly Detection Framework | Feb 20, 2024 | Anomaly Detection, Federated Learning
A methodology for training homomorphic encryption friendly neural networks | Nov 5, 2021 | Knowledge Distillation, Privacy Preserving
End-to-end fully-binarized network design: from Generic Learned Thermometer to Block Pruning | May 5, 2025 | Knowledge Distillation, Quantization
Fine-Grained Distillation for Long Document Retrieval | Dec 20, 2022 | Knowledge Distillation, Retrieval
Fine-grained Image Retrieval via Dual-Vision Adaptation | Jun 19, 2025 | Image Retrieval, Knowledge Distillation
Cross-modal knowledge distillation for action recognition | Oct 10, 2019 | Action Recognition, Knowledge Distillation
Fine-tune Before Structured Pruning: Towards Compact and Accurate Self-Supervised Models for Speaker Diarization | May 30, 2025 | GPU, Knowledge Distillation
Fine-tuning a Multiple Instance Learning Feature Extractor with Masked Context Modelling and Knowledge Distillation | Mar 8, 2024 | Image Generation, Knowledge Distillation
Comprehensive Pathological Image Segmentation via Teacher Aggregation for Tumor Microenvironment Analysis | Jan 6, 2025 | Decision Making, Diversity
Edge Bias in Federated Learning and its Solution by Buffered Knowledge Distillation | Oct 20, 2020 | Federated Learning, Knowledge Distillation