De-confounded Data-free Knowledge Distillation for Handling Distribution Shifts (Mar 28, 2024). Tags: Causal Inference, Data-free Knowledge Distillation
Confidence-aware Self-Semantic Distillation on Knowledge Graph Embedding (Jun 7, 2022). Tags: Graph Embedding, Knowledge Distillation
GVP: Generative Volumetric Primitives (Mar 31, 2023). Tags: Image Generation, Knowledge Distillation
Improving Multi-Task Deep Neural Networks via Knowledge Distillation for Natural Language Understanding (Apr 20, 2019). Tags: Ensemble Learning, Knowledge Distillation
Guiding Teacher Forcing with Seer Forcing for Neural Machine Translation (Jun 12, 2021). Tags: Decoder, Knowledge Distillation
Bilateral Memory Consolidation for Continual Learning (Jan 1, 2023). Tags: Continual Learning, Knowledge Distillation
Guiding CTC Posterior Spike Timings for Improved Posterior Fusion and Knowledge Distillation (Apr 17, 2019). Tags: Automatic Speech Recognition, Automatic Speech Recognition (ASR)
Improving Neural ODEs via Knowledge Distillation (Mar 10, 2022). Tags: Knowledge Distillation
Guided Deep Metric Learning (Jun 4, 2022). Tags: Few-Shot Learning, Knowledge Distillation
GTCOM Neural Machine Translation Systems for WMT19 (Aug 1, 2019). Tags: Knowledge Distillation, Language Modeling
Decision Boundary-aware Knowledge Consolidation Generates Better Instance-Incremental Learner (Jun 5, 2024). Tags: Class-Incremental Learning, Class Incremental Learning
Improving Pronunciation and Accent Conversion through Knowledge Distillation And Synthetic Ground-Truth from Native TTS (Oct 19, 2024). Tags: Knowledge Distillation
Growing Deep Neural Network Considering with Similarity between Neurons (Aug 23, 2024). Tags: Decision Making, Knowledge Distillation
Decentralized and Model-Free Federated Learning: Consensus-Based Distillation in Function Space (Apr 1, 2021). Tags: Federated Learning, Knowledge Distillation
Debias the Black-box: A Fair Ranking Framework via Knowledge Distillation (Aug 24, 2022). Tags: Fairness, Information Retrieval
Improving Route Choice Models by Incorporating Contextual Factors via Knowledge Distillation (Mar 27, 2019). Tags: Knowledge Distillation, Management
Always Strengthen Your Strengths: A Drift-Aware Incremental Learning Framework for CTR Prediction (Apr 17, 2023). Tags: Click-Through Rate Prediction, Diversity
Adaptively Integrated Knowledge Distillation and Prediction Uncertainty for Continual Learning (Jan 18, 2023). Tags: Continual Learning, Knowledge Distillation
A Closer Look at Knowledge Distillation with Features, Logits, and Gradients (Mar 18, 2022). Tags: Incremental Learning, Knowledge Distillation
Sentence-wise Speech Summarization: Task, Datasets, and End-to-End Modeling with LM Knowledge Distillation (Aug 1, 2024). Tags: Automatic Speech Recognition, Automatic Speech Recognition (ASR)
AdvFunMatch: When Consistent Teaching Meets Adversarial Robustness (May 24, 2023). Tags: Adversarial Robustness, Knowledge Distillation
Group-Mix SAM: Lightweight Solution for Industrial Assembly Line Applications (Mar 15, 2024). Tags: Knowledge Distillation
Debiased Distillation by Transplanting the Last Layer (Feb 22, 2023). Tags: Attribute, Knowledge Distillation
Improving the Interpretability of Deep Neural Networks with Knowledge Distillation (Dec 28, 2018). Tags: Ethics, Knowledge Distillation
Grouped Knowledge Distillation for Deep Face Recognition (Apr 10, 2023). Tags: Face Recognition, Knowledge Distillation
Group Distributionally Robust Knowledge Distillation (Nov 1, 2023). Tags: Knowledge Distillation
Improving Video Model Transfer With Dynamic Representation Learning (Jan 1, 2022). Tags: Action Classification, Knowledge Distillation
Debate, Reflect, and Distill: Multi-Agent Feedback with Tree-Structured Preference Optimization for Efficient Language Model Enhancement (Jun 4, 2025). Tags: Knowledge Distillation, Language Modeling
Group channel pruning and spatial attention distilling for object detection (Jun 2, 2023). Tags: Knowledge Distillation, Model Compression
Improving Zero-Shot Multilingual Text Generation via Iterative Distillation (Oct 1, 2022). Tags: Knowledge Distillation, Text Generation
Ground-V: Teaching VLMs to Ground Complex Instructions in Pixels (May 20, 2025). Tags: Instruction Following, Knowledge Distillation
In-Context Learning Distillation for Efficient Few-Shot Fine-Tuning (Dec 17, 2024). Tags: In-Context Learning, Knowledge Distillation
DearKD: Data-Efficient Early Knowledge Distillation for Vision Transformers (Apr 27, 2022). Tags: Knowledge Distillation
GripRank: Bridging the Gap between Retrieval and Generation via the Generative Knowledge Improved Passage Ranking (May 29, 2023). Tags: Answer Generation, Dialogue Generation
Dealing with training and test segmentation mismatch: FBK@IWSLT2021 (Jun 23, 2021). Tags: Action Detection, Activity Detection
Incremental Classifier Learning Based on PEDCC-Loss and Cosine Distance (Jun 11, 2019). Tags: Incremental Learning, Knowledge Distillation
Bi-CryptoNets: Leveraging Different-Level Privacy for Encrypted Inference (Feb 2, 2024). Tags: Knowledge Distillation, Privacy Preserving
Incremental Knowledge Based Question Answering (Jan 18, 2021). Tags: Incremental Learning, Knowledge Distillation
Incremental Learning for End-to-End Automatic Speech Recognition (May 11, 2020). Tags: Automatic Speech Recognition, Automatic Speech Recognition (ASR)
AKE-GNN: Effective Graph Learning with Adaptive Knowledge Exchange (Jun 10, 2021). Tags: Classification, Graph Classification
Graph Representation Learning via Multi-task Knowledge Distillation (Nov 11, 2019). Tags: Graph Representation Learning, Knowledge Distillation
Incrementally-Computable Neural Networks: Efficient Inference for Dynamic Inputs (Jul 27, 2023). Tags: Document Classification, Knowledge Distillation
Dealing with Missing Modalities in the Visual Question Answer-Difference Prediction Task through Knowledge Distillation (Apr 13, 2021). Tags: Knowledge Distillation, Triplet
DDK: Distilling Domain Knowledge for Efficient Large Language Models (Jul 23, 2024). Tags: Knowledge Distillation
DCSNet: A Lightweight Knowledge Distillation-Based Model with Explainable AI for Lung Cancer Diagnosis from Histopathological Images (May 14, 2025). Tags: Diagnostic, Knowledge Distillation
Be Your Own Best Competitor! Multi-Branched Adversarial Knowledge Transfer (Oct 9, 2020). Tags: Decoder, Image Classification
Incrementer: Transformer for Class-Incremental Semantic Segmentation With Knowledge Distillation Focusing on Old Class (Jan 1, 2023). Tags: Class-Incremental Semantic Segmentation, Decoder
ALP-KD: Attention-Based Layer Projection for Knowledge Distillation (Dec 27, 2020). Tags: Knowledge Distillation
Adaptive Label Smoothing with Self-Knowledge in Natural Language Generation (Oct 22, 2022). Tags: Knowledge Distillation, Text Generation
DC-CCL: Device-Cloud Collaborative Controlled Learning for Large Vision Models (Mar 18, 2023). Tags: Knowledge Distillation