- AgentDistill: Training-Free Agent Distillation with Generalizable MCP Boxes (Jun 17, 2025) [Knowledge Distillation, Transfer Learning]
- FedQUIT: On-Device Federated Unlearning via a Quasi-Competent Virtual Teacher (Aug 14, 2024) [Federated Learning, Knowledge Distillation]
- Attention-Guided Answer Distillation for Machine Reading Comprehension (Aug 23, 2018) [Knowledge Distillation, Machine Reading Comprehension]
- FedRAD: Federated Robust Adaptive Distillation (Dec 2, 2021) [Federated Learning, Knowledge Distillation]
- Ensemble Knowledge Distillation for CTR Prediction (Nov 8, 2020) [Click-Through Rate Prediction, Knowledge Distillation]
- A Generative Framework for Personalized Learning and Estimation: Theory, Algorithms, and Privacy (Jul 5, 2022) [Federated Learning, Knowledge Distillation]
- Enhancing Systematic Decompositional Natural Language Inference Using Informal Logic (Feb 22, 2024) [Formal Logic, Knowledge Distillation]
- Conditional Autoregressors are Interpretable Classifiers (Mar 31, 2022) [Classification, Image Classification]
- FedSDD: Scalable and Diversity-enhanced Distillation for Model Aggregation in Federated Learning (Dec 28, 2023) [Diversity, Federated Learning]
- FedSPLIT: One-Shot Federated Recommendation System Based on Non-negative Joint Matrix Factorization and Knowledge Distillation (May 4, 2022) [Collaborative Filtering, Federated Learning]
- FGAD: Self-boosted Knowledge Distillation for An Effective Federated Graph Anomaly Detection Framework (Feb 20, 2024) [Anomaly Detection, Federated Learning]
- Ensemble Distillation for Neural Machine Translation (Feb 6, 2017) [Knowledge Distillation, Machine Translation]
- Conditional Generative Data-free Knowledge Distillation (Dec 31, 2021) [Conditional Image Generation, Data-free Knowledge Distillation]
- Enhancing SLM via ChatGPT and Dataset Augmentation (Sep 19, 2024) [Knowledge Distillation, Natural Language Inference]
- Enhancing Single-Slice Segmentation with 3D-to-2D Unpaired Scan Distillation (Jun 18, 2024) [Computed Tomography (CT), Knowledge Distillation]
- Ensemble knowledge distillation of self-supervised speech models (Feb 24, 2023) [Automatic Speech Recognition (ASR)]
- Condensed Sample-Guided Model Inversion for Knowledge Distillation (Aug 25, 2024) [Knowledge Distillation]
- Enhancing Semi-supervised Learning with Zero-shot Pseudolabels (Feb 18, 2025) [Knowledge Distillation]
- Confidence Based Bidirectional Global Context Aware Training Framework for Neural Machine Translation (Feb 28, 2022) [Decoder, Knowledge Distillation]
- ConceptDistil: Model-Agnostic Distillation of Concept Explanations (May 7, 2022) [Explainable Models, Knowledge Distillation]
- Ensembling of Distilled Models from Multi-task Teachers for Constrained Resource Language Pairs (Nov 26, 2021) [Knowledge Distillation, Translation]
- EnSiam: Self-Supervised Learning With Ensemble Representations (May 22, 2023) [Contrastive Learning, Knowledge Distillation]
- Entire-Space Variational Information Exploitation for Post-Click Conversion Rate Prediction (Dec 17, 2024) [Knowledge Distillation, Recommendation Systems]
- EPIK: Eliminating multi-model Pipelines with Knowledge-distillation (Nov 27, 2022) [Knowledge Distillation, Transliteration]
- EPSD: Early Pruning with Self-Distillation for Efficient Model Compression (Jan 31, 2024) [Knowledge Distillation, Model Compression]
- Learning Effective Representations for Retrieval Using Self-Distillation with Adaptive Relevance Margins (Jul 31, 2024) [Knowledge Distillation, Language Modeling]
- Federated Learning with Privacy-Preserving Ensemble Attention Distillation (Oct 16, 2022) [Federated Learning, Image Classification]
- ERNIE-Search: Bridging Cross-Encoder with Dual-Encoder via Self On-the-fly Distillation for Dense Passage Retrieval (May 18, 2022) [Knowledge Distillation, Open-Domain Question Answering]
- Conformer with dual-mode chunked attention for joint online and offline ASR (Jun 22, 2022) [Knowledge Distillation]
- Error Exponent in Agnostic PAC Learning (May 1, 2024) [Binary Classification, Knowledge Distillation]
- Enhancing Scalability in Recommender Systems through Lottery Ticket Hypothesis and Knowledge Distillation-based Neural Network Pruning (Jan 19, 2024) [GPU, Knowledge Distillation]
- Enhancing Romanian Offensive Language Detection through Knowledge Distillation, Multi-Task Learning, and Data Augmentation (Sep 30, 2024) [Data Augmentation, Knowledge Distillation]
- Enhancing Review Comprehension with Domain-Specific Commonsense (Apr 6, 2020) [Aspect Extraction, Knowledge Distillation]
- ESPnet-ST IWSLT 2021 Offline Speech Translation System (Jul 1, 2021) [Decoder, Knowledge Distillation]
- Enhancing Once-For-All: A Study on Parallel Blocks, Skip Connections and Early Exits (Feb 3, 2023) [Knowledge Distillation]
- ConaCLIP: Exploring Distillation of Fully-Connected Knowledge Interaction Graph for Lightweight Text-Image Retrieval (May 28, 2023) [Image Retrieval, Knowledge Distillation]
- A General Multiple Data Augmentation Based Framework for Training Deep Neural Networks (May 29, 2022) [Data Augmentation, Image Classification]
- Evaluation-oriented Knowledge Distillation for Deep Face Recognition (Jun 6, 2022) [Face Recognition, Knowledge Distillation]
- Federated One-Shot Learning with Data Privacy and Objective-Hiding (Apr 29, 2025) [Federated Learning, Information Retrieval]
- A Transformer-in-Transformer Network Utilizing Knowledge Distillation for Image Recognition (Feb 24, 2025) [Image Classification]
- Federated Learning for Data and Model Heterogeneity in Medical Imaging (Jul 31, 2023) [Federated Learning, Knowledge Distillation]
- Enhancing Modality-Agnostic Representations via Meta-Learning for Brain Tumor Segmentation (Feb 8, 2023) [Brain Tumor Segmentation, Image Generation]
- Enhancing Mapless Trajectory Prediction through Knowledge Distillation (Jun 25, 2023) [Autonomous Driving, Knowledge Distillation]
- Compression of end-to-end non-autoregressive image-to-speech system for low-resourced devices (Nov 30, 2023) [Knowledge Distillation]
- Federated Learning on Non-iid Data via Local and Global Distillation (Jun 26, 2023) [Federated Learning, Knowledge Distillation]
- EVOKE: Emotion Enabled Virtual Avatar Mapping Using Optimized Knowledge Distillation (Jan 13, 2024) [Emotion Recognition, Knowledge Distillation]
- Compression of Deep Learning Models for Text: A Survey (Aug 12, 2020) [Deep Learning, Information Retrieval]
- Generalized Supervised Contrastive Learning (Jun 1, 2022) [Contrastive Learning, Knowledge Distillation]
- Compression of Acoustic Event Detection Models With Quantized Distillation (Jul 1, 2019) [Event Detection, Knowledge Distillation]
- Federated Knowledge Transfer Fine-tuning Large Server Model with Resource-Constrained IoT Clients (Jul 7, 2024) [Federated Learning, Knowledge Distillation]