Developing Multi-Task Recommendations with Long-Term Rewards via Policy Distilled Reinforcement Learning (Jan 27, 2020). Tags: Deep Reinforcement Learning, Knowledge Distillation
Device-Directed Speech Detection: Regularization via Distillation for Weakly-Supervised Models (Mar 30, 2022). Tags: Knowledge Distillation
DeViT: Decomposing Vision Transformers for Collaborative Inference in Edge Devices (Sep 10, 2023). Tags: Collaborative Inference, GPU
DFMSD: Dual Feature Masking Stage-wise Knowledge Distillation for Object Detection (Jul 18, 2024). Tags: Knowledge Distillation, Object
DFRD: Data-Free Robustness Distillation for Heterogeneous Federated Learning (Sep 24, 2023). Tags: Data-free Knowledge Distillation, Diversity
DiagrammaticLearning: A Graphical Language for Compositional Training Regimes (Jan 2, 2025). Tags: Knowledge Distillation, Multi-Task Learning
Dialect Identification through Adversarial Learning and Knowledge Distillation on Romanian BERT (Apr 1, 2021). Tags: Automatic Speech Recognition (ASR)
DFM: Dialogue Foundation Model for Universal Large-Scale Dialogue-Oriented Task Learning (May 25, 2022). Tags: Dialogue Generation, Diversity
DiDOTS: Knowledge Distillation from Large-Language-Models for Dementia Obfuscation in Transcribed Speech (Oct 5, 2024). Tags: Hallucination, Knowledge Distillation
Differentiable Feature Aggregation Search for Knowledge Distillation (Aug 2, 2020). Tags: Knowledge Distillation, Model Compression
Diffusion Glancing Transformer for Parallel Sequence to Sequence Learning (Dec 20, 2022). Tags: Knowledge Distillation, Machine Translation
Diffusion-Augmented Coreset Expansion for Scalable Dataset Distillation (Dec 5, 2024). Tags: Bilevel Optimization, Computational Efficiency
DiffusionTalker: Personalization and Acceleration for Speech-Driven 3D Face Diffuser (Nov 28, 2023). Tags: 3D Face Animation, Contrastive Learning
Digging Deeper into CRNN Model in Chinese Text Images Recognition (Nov 17, 2020). Tags: Denoising, Knowledge Distillation
Digital Twin-Assisted Knowledge Distillation Framework for Heterogeneous Federated Learning (Mar 10, 2023). Tags: Federated Learning, Knowledge Distillation
DilateQuant: Accurate and Efficient Diffusion Quantization via Weight Dilation (Sep 22, 2024). Tags: Image Generation, Knowledge Distillation
DILEMMA: Joint LLM Quantization and Distributed LLM Inference Over Edge Computing Systems (Mar 3, 2025). Tags: Edge Computing, Knowledge Distillation
DiPair: Fast and Accurate Distillation for Trillion-Scale Text Matching and Pair Modeling (Oct 7, 2020). Tags: Knowledge Distillation, Question Answering
Direct Alignment of Draft Model for Speculative Decoding with Chat-Fine-Tuned LLMs (Feb 29, 2024). Tags: Dataset Generation, Knowledge Distillation
Direct Distillation between Different Domains (Jan 12, 2024). Tags: Domain Adaptation, Knowledge Distillation
Direct Preference Knowledge Distillation for Large Language Models (Jun 28, 2024). Tags: Knowledge Distillation
DiReDi: Distillation and Reverse Distillation for AIoT Applications (Sep 12, 2024). Tags: Knowledge Distillation, Management
Disentanglement, Visualization and Analysis of Complex Features in DNNs (Jan 1, 2021). Tags: Disentanglement, Knowledge Distillation
DistilDoc: Knowledge Distillation for Visually-Rich Document Applications (Jun 12, 2024). Tags: Document Image Classification
DualDE: Dually Distilling Knowledge Graph Embedding for Faster and Cheaper Reasoning (Sep 13, 2020). Tags: Graph Embedding, Knowledge Distillation
Distill, Adapt, Distill: Training Small, In-Domain Models for Neural Machine Translation (Mar 5, 2020). Tags: Domain Adaptation, Knowledge Distillation
Distill and De-bias: Mitigating Bias in Face Verification using Knowledge Distillation (Dec 17, 2021). Tags: Attribute, Face Recognition
Knowledge Distillation Decision Tree for Unravelling Black-box Machine Learning Models (Jun 9, 2022). Tags: Knowledge Distillation
Distillation-Enabled Knowledge Alignment for Generative Semantic Communications in AIGC Provisioning Tasks (Jun 24, 2025). Tags: Knowledge Distillation, Semantic Communication
Distillation-Enhanced Physical Adversarial Attacks (Jan 4, 2025). Tags: Adversarial Attack, Knowledge Distillation
StableMamba: Distillation-free Scaling of Large SSMs for Images and Videos (Sep 18, 2024). Tags: Action Recognition, Image Classification
Distillation of Diffusion Features for Semantic Correspondence (Dec 4, 2024). Tags: 3D Reconstruction, Data Augmentation
Distillation of Human-Object Interaction Contexts for Action Recognition (Dec 17, 2021). Tags: Action Recognition, Graph Attention
Distillation of Weighted Automata from Recurrent Neural Networks using a Spectral Approach (Sep 28, 2020). Tags: Knowledge Distillation, Language Modelling
Distillation Using Oracle Queries for Transformer-Based Human-Object Interaction Detection (Jan 1, 2022). Tags: Data Augmentation, Decoder
Distillation with Contrast is All You Need for Self-Supervised Point Cloud Representation Learning (Feb 9, 2022). Tags: Contrastive Learning
Distilled ChatGPT Topic & Sentiment Modeling with Applications in Finance (Mar 4, 2024). Tags: Knowledge Distillation, Sentiment Analysis
Improving Word Embedding Factorization for Compression Using Distilled Nonlinear Neural Decomposition (Oct 2, 2019). Tags: Knowledge Distillation, Language Modeling
Distilled Embedding: Non-Linear Embedding Factorization Using Knowledge Distillation (Sep 25, 2019). Tags: Knowledge Distillation, Machine Translation
Distilled Mid-Fusion Transformer Networks for Multi-Modal Human Activity Recognition (May 5, 2023). Tags: Activity Recognition, Feature Engineering
Distilled Reverse Attention Network for Open-world Compositional Zero-Shot Learning (Mar 1, 2023). Tags: Compositional Zero-Shot Learning, Knowledge Distillation
Distilling 3D Distinctive Local Descriptors for 6D Pose Estimation (Mar 19, 2025). Tags: 6D Pose Estimation, Knowledge Distillation
Distilling a Deep Neural Network into a Takagi-Sugeno-Kang Fuzzy Inference System (Oct 10, 2020). Tags: General Classification, Knowledge Distillation
Distilling Adversarial Robustness Using Heterogeneous Teachers (Feb 23, 2024). Tags: Adversarial Robustness, Knowledge Distillation
Distilling Calibrated Student from an Uncalibrated Teacher (Feb 22, 2023). Tags: Data Augmentation, Knowledge Distillation
Distilling CLIP with Dual Guidance for Learning Discriminative Human Body Shape Representation (Jan 1, 2024). Tags: Knowledge Distillation, Person Re-Identification
Augmenting Offline Reinforcement Learning with State-only Interactions (Feb 1, 2024). Tags: D4RL, Data Augmentation
Distilling Cross-Temporal Contexts for Continuous Sign Language Recognition (Jan 1, 2023). Tags: Knowledge Distillation, Sign Language Recognition
Distilling EEG Representations via Capsules for Affective Computing (Apr 30, 2021). Tags: Electroencephalogram (EEG)
Distilling Efficient Vision Transformers from CNNs for Semantic Segmentation (Oct 11, 2023). Tags: Knowledge Distillation, Semantic Segmentation