Domain Adaptation for Dense Retrieval through Self-Supervision by Pseudo-Relevance Labeling (Dec 13, 2022) · Domain Adaptation, Information Retrieval
Domain Adaptive Hand Keypoint and Pixel Localization in the Wild (Mar 16, 2022) · Domain Adaptation, Knowledge Distillation
Domain-Agnostic Clustering with Self-Distillation (Nov 23, 2021) · Clustering, Data Augmentation
Domain Discrepancy Aware Distillation for Model Aggregation in Federated Learning (Oct 4, 2022) · Federated Learning, Knowledge Distillation
Domain Generalization on Efficient Acoustic Scene Classification using Residual Normalization (Nov 12, 2021) · Acoustic Scene Classification, Classification
Domain-invariant Feature Exploration for Domain Generalization (Jul 25, 2022) · Diversity, Domain Generalization
Domain-invariant Progressive Knowledge Distillation for UAV-based Object Detection (Aug 21, 2024) · Knowledge Distillation, Object Detection
Domain Knowledge Distillation from Large Language Model: An Empirical Study in the Autonomous Driving Domain (Jul 17, 2023) · Autonomous Driving, Knowledge Distillation
Domain-specific knowledge distillation yields smaller and better models for conversational commerce (May 1, 2022) · Knowledge Distillation, Language Modeling
Domain-Specific Translation with Open-Source Large Language Models: Resource-Oriented Analysis (Dec 8, 2024) · Decoder, Knowledge Distillation
DONNAv2 -- Lightweight Neural Architecture Search for Vision tasks (Sep 26, 2023) · Denoising, Image Denoising
Do Not Blindly Imitate the Teacher: Using Perturbed Loss for Knowledge Distillation (May 8, 2023) · Knowledge Distillation
Do Not Forget to Attend to Uncertainty while Mitigating Catastrophic Forgetting (Feb 3, 2021) · Deep Learning, Incremental Learning
Don't be picky, all students in the right family can learn from good teachers (Jan 1, 2021) · Bayesian Optimization
Don't Throw Away Data: Better Sequence Knowledge Distillation (Jul 15, 2024) · Diversity, Knowledge Distillation
DONUT-hole: DONUT Sparsification by Harnessing Knowledge and Optimizing Learning Efficiency (Nov 9, 2023) · Document Understanding, Key Information Extraction
Doodle It Yourself: Class Incremental Learning by Drawing a Few Sketches (Mar 28, 2022) · Class Incremental Learning
Double Reverse Regularization Network Based on Self-Knowledge Distillation for SAR Object Classification (Nov 26, 2023) · Knowledge Distillation, Self-Knowledge Distillation
Double Similarity Distillation for Semantic Image Segmentation (Jul 19, 2021) · Image Segmentation, Knowledge Distillation
Do We Really Need a Complex Agent System? Distill Embodied Agent into a Single Model (Apr 6, 2024) · Knowledge Distillation
DreamTeacher: Pretraining Image Backbones with Deep Generative Models (Jul 14, 2023) · Knowledge Distillation, Representation Learning
DRKF: Distilled Rotated Kernel Fusion for Efficient Rotation Invariant Descriptors in Local Feature Matching (Sep 22, 2022) · Knowledge Distillation
DS3-Net: Difficulty-perceived Common-to-T1ce Semi-Supervised Multimodal MRI Synthesis Network (Mar 14, 2022) · Knowledge Distillation, SSIM
DSFormer: Effective Compression of Text-Transformers by Dense-Sparse Weight Factorization (Dec 20, 2023) · Knowledge Distillation, Natural Language Understanding
DSPNet: Towards Slimmable Pretrained Networks based on Discriminative Self-supervised Learning (Jul 13, 2022) · Knowledge Distillation, Linear Evaluation
DST: Dynamic Substitute Training for Data-free Black-box Attack (Apr 3, 2022) · Knowledge Distillation
DS-ViT: Dual-Stream Vision Transformer for Cross-Task Distillation in Alzheimer's Early Diagnosis (Sep 11, 2024) · Classification, Knowledge Distillation
DTCM: Deep Transformer Capsule Mutual Distillation for Multivariate Time Series Classification (Feb 26, 2024) · Knowledge Distillation, Relation Network
Dual Discriminator Adversarial Distillation for Data-free Model Compression (Apr 12, 2021) · Data-free Knowledge Distillation, Knowledge Distillation
Dual Embodied-Symbolic Concept Representations for Deep Learning (Mar 1, 2022) · Class Incremental Learning
Dual-Head Knowledge Distillation: Enhancing Logits Utilization with an Auxiliary Head (Nov 13, 2024) · Attribute, Knowledge Distillation
Dual Knowledge Distillation for Efficient Sound Event Detection (Feb 5, 2024) · Event Detection, Knowledge Distillation
Dual-Modeling Decouple Distillation for Unsupervised Anomaly Detection (Aug 7, 2024) · Anomaly Detection, Anomaly Localization
Dual Scale-aware Adaptive Masked Knowledge Distillation for Object Detection (Jan 13, 2025) · Knowledge Distillation, Object Detection
Dual-Student Knowledge Distillation Networks for Unsupervised Anomaly Detection (Feb 1, 2024) · Anomaly Detection, Anomaly Segmentation
Dual-Teacher Class-Incremental Learning With Data-Free Generative Replay (Jun 17, 2021) · Class Incremental Learning
Dual-Teacher: Integrating Intra-domain and Inter-domain Teachers for Annotation-efficient Cardiac Segmentation (Jul 13, 2020) · Cardiac Segmentation, Domain Adaptation
Dual Teacher Knowledge Distillation with Domain Alignment for Face Anti-spoofing (Jan 2, 2024) · Adversarial Attack, Face Anti-Spoofing
DualVC 2: Dynamic Masked Convolution for Unified Streaming and Non-Streaming Voice Conversion (Sep 27, 2023) · Decoder, Knowledge Distillation
DualVC: Dual-mode Voice Conversion using Intra-model Knowledge Distillation and Hybrid Predictive Coding (May 21, 2023) · Data Augmentation, Decoder
DuckSegmentation: A segmentation model based on the AnYue Hemp Duck Dataset (Mar 27, 2025) · Knowledge Distillation, Object Recognition
DVFL: A Vertical Federated Learning Method for Dynamic Data (Nov 5, 2021) · Federated Learning, Knowledge Distillation
DyLiN: Making Light Field Networks Dynamic (Mar 24, 2023) · Attribute, Knowledge Distillation
Dynamic Activation with Knowledge Distillation for Energy-Efficient Spiking NN Ensembles (Feb 19, 2025) · Disentanglement, Ensemble Learning
Dynamically pruning segformer for efficient semantic segmentation (Nov 18, 2021) · Knowledge Distillation, Segmentation
DynamicKD: An Effective Knowledge Distillation via Dynamic Entropy Correction-Based Distillation for Gap Optimizing (May 9, 2023) · Knowledge Distillation
Dynamic Knowledge Distillation for Black-box Hypothesis Transfer Learning (Jul 24, 2020) · Knowledge Distillation, Transfer Learning
Dynamic Knowledge Distillation With Noise Elimination for RGB-D Salient Object Detection (Jun 17, 2021) · Knowledge Distillation, Object Detection
Dynamic Low-Resolution Distillation for Cost-Efficient End-to-End Text Spotting (Jul 14, 2022) · Global Optimization, Knowledge Distillation
Dynamics-Adaptive Continual Reinforcement Learning via Progressive Contextualization (Sep 1, 2022) · Bayesian Inference, Knowledge Distillation