Data-free Distillation with Degradation-prompt Diffusion for Multi-weather Image Restoration (Sep 5, 2024). Tags: Image Restoration, Knowledge Distillation
Data-Free Federated Class Incremental Learning with Diffusion-Based Generative Memory (May 22, 2024). Tags: Class Incremental Learning
Data-Free Knowledge Distillation Using Adversarially Perturbed OpenGL Shader Images (Oct 20, 2023). Tags: Data Augmentation, Data-free Knowledge Distillation
Data-Free Knowledge Distillation with Soft Targeted Transfer Set Synthesis (Apr 10, 2021). Tags: Data-free Knowledge Distillation, Knowledge Distillation
Data-Free Knowledge Transfer: A Survey (Dec 31, 2021). Tags: Data-free Knowledge Distillation, Domain Adaptation
Mining Data Impressions from Deep Models as Substitute for the Unavailable Training Data (Jan 15, 2021). Tags: Adversarial Robustness, Continual Learning
Data Techniques For Online End-to-end Speech Recognition (Jan 24, 2020). Tags: Data Augmentation, Domain Adaptation
DC-CCL: Device-Cloud Collaborative Controlled Learning for Large Vision Models (Mar 18, 2023). Tags: Knowledge Distillation
DCSNet: A Lightweight Knowledge Distillation-Based Model with Explainable AI for Lung Cancer Diagnosis from Histopathological Images (May 14, 2025). Tags: Diagnostic, Knowledge Distillation
DDK: Distilling Domain Knowledge for Efficient Large Language Models (Jul 23, 2024). Tags: Knowledge Distillation
Dealing with Missing Modalities in the Visual Question Answer-Difference Prediction Task through Knowledge Distillation (Apr 13, 2021). Tags: Knowledge Distillation, Triplet
Dealing with training and test segmentation mismatch: FBK@IWSLT2021 (Jun 23, 2021). Tags: Action Detection, Activity Detection
DearKD: Data-Efficient Early Knowledge Distillation for Vision Transformers (Apr 27, 2022). Tags: Knowledge Distillation
Debate, Reflect, and Distill: Multi-Agent Feedback with Tree-Structured Preference Optimization for Efficient Language Model Enhancement (Jun 4, 2025). Tags: Knowledge Distillation, Language Modeling
Debiased Distillation by Transplanting the Last Layer (Feb 22, 2023). Tags: Attribute, Knowledge Distillation
Debias the Black-box: A Fair Ranking Framework via Knowledge Distillation (Aug 24, 2022). Tags: Fairness, Information Retrieval
Decentralized and Model-Free Federated Learning: Consensus-Based Distillation in Function Space (Apr 1, 2021). Tags: Federated Learning, Knowledge Distillation
Decision Boundary-aware Knowledge Consolidation Generates Better Instance-Incremental Learner (Jun 5, 2024). Tags: Class Incremental Learning
De-confounded Data-free Knowledge Distillation for Handling Distribution Shifts (Mar 28, 2024). Tags: Causal Inference, Data-free Knowledge Distillation
Decoupled Alignment for Robust Plug-and-Play Adaptation (Jun 3, 2024). Tags: Knowledge Distillation
Decoupled Transformer for Scalable Inference in Open-domain Question Answering (Aug 5, 2021). Tags: Knowledge Distillation, Machine Reading Comprehension
Decoupled Transformer for Scalable Inference in Open-domain Question Answering (Sep 1, 2021). Tags: Knowledge Distillation, Machine Reading Comprehension
Decouple Non-parametric Knowledge Distillation For End-to-end Speech Translation (Apr 20, 2023). Tags: Knowledge Distillation, Machine Translation
Decoupling Dark Knowledge via Block-wise Logit Distillation for Feature-level Alignment (Nov 3, 2024). Tags: Knowledge Distillation, Philosophy
Deep Collective Knowledge Distillation (Apr 18, 2023). Tags: Knowledge Distillation, Model Compression
Deep Epidemiological Modeling by Black-box Knowledge Distillation: An Accurate Deep Learning Model for COVID-19 (Jan 20, 2021). Tags: Diversity, Knowledge Distillation
Knowledge Distillation-aided End-to-End Learning for Linear Precoding in Multiuser MIMO Downlink Systems with Finite-Rate Feedback (Aug 10, 2020). Tags: Binarization, Knowledge Distillation
Deep Learning for Medical Text Processing: BERT Model Fine-Tuning and Comparative Study (Oct 28, 2024). Tags: Knowledge Distillation
Deep Net Triage: Analyzing the Importance of Network Layers via Structural Compression (Jan 15, 2018). Tags: Knowledge Distillation
Deep Neural Compression Via Concurrent Pruning and Self-Distillation (Sep 30, 2021). Tags: Knowledge Distillation, Language Modeling
Deep Neural Network Models Compression (Mar 4, 2021). Tags: Knowledge Distillation, Quantization
Deep Representation Learning of Patient Data from Electronic Health Records (EHR): A Systematic Review (Oct 6, 2020). Tags: Articles, Deep Learning
Deep Semi-Supervised and Self-Supervised Learning for Diabetic Retinopathy Detection (Aug 4, 2022). Tags: Diabetic Retinopathy Detection, Knowledge Distillation
Deep Serial Number: Computational Watermarking for DNN Intellectual Property Protection (Nov 17, 2020). Tags: Knowledge Distillation
Deep-to-bottom Weights Decay: A Systemic Knowledge Review Learning Technique for Transformer Layers in Knowledge Distillation (Nov 16, 2021). Tags: Knowledge Distillation
Deep versus Wide: An Analysis of Student Architectures for Task-Agnostic Knowledge Distillation of Self-Supervised Speech Models (Jul 14, 2022). Tags: Automatic Speech Recognition (ASR)
Defending against Data-Free Model Extraction by Distributionally Robust Defensive Training (Sep 21, 2023). Tags: Knowledge Distillation, Model Extraction
DeGAN: Data-Enriching GAN for Retrieving Representative Samples from a Trained Classifier (Dec 27, 2019). Tags: Data-free Knowledge Distillation, Incremental Learning
Delving Deep into Semantic Relation Distillation (Mar 27, 2025). Tags: Knowledge Distillation, Model Compression
Demystifying Catastrophic Forgetting in Two-Stage Incremental Object Detector (Feb 8, 2025). Tags: Incremental Learning, Knowledge Distillation
Denoising Mutual Knowledge Distillation in Bi-Directional Multiple Instance Learning (May 17, 2025). Tags: Denoising, Image Classification
Densely Distilling Cumulative Knowledge for Continual Learning (May 16, 2024). Tags: Continual Learning
Deploying a BERT-based Query-Title Relevance Classifier in a Production System: a View from the Trenches (Aug 23, 2021). Tags: CPU, Data Augmentation
DεpS: Delayed ε-Shrinking for Faster Once-For-All Training (Jul 8, 2024). Tags: GPU
Designing and Training of Lightweight Neural Networks on Edge Devices using Early Halting in Knowledge Distillation (Sep 30, 2022). Tags: Knowledge Distillation
Designing an Improved Deep Learning-based Model for COVID-19 Recognition in Chest X-ray Images: A Knowledge Distillation Approach (Jan 6, 2023). Tags: Knowledge Distillation
Designing Parameter and Compute Efficient Diffusion Transformers using Distillation (Feb 20, 2025). Tags: Knowledge Distillation, NVIDIA Jetson Orin Nano
Detecting Optimism in Tweets using Knowledge Distillation and Linguistic Analysis of Optimism (Jun 1, 2022). Tags: Hate Speech Detection, Knowledge Distillation
DETRDistill: A Universal Knowledge Distillation Framework for DETR-families (Nov 17, 2022). Tags: Knowledge Distillation, Object Detection