| Disease Entity Recognition and Normalization is Improved with Large Language Model Derived Synthetic Normalized Mentions | Oct 10, 2024 | Data Augmentation, Knowledge Graphs | Unverified | 0 | 0 |
| Disentangling Homophemes in Lip Reading using Perplexity Analysis | Nov 28, 2020 | Language Modeling | Unverified | 0 | 0 |
| Disentangling Knowledge Representations for Large Language Model Editing | May 24, 2025 | Disentanglement, Knowledge Editing | Unverified | 0 | 0 |
| Disentangling Reasoning Tokens and Boilerplate Tokens For Language Model Fine-tuning | Dec 19, 2024 | Disentanglement, Language Modeling | Unverified | 0 | 0 |
| Disfluency Detection using a Noisy Channel Model and a Deep Neural Language Model | Aug 28, 2018 | Language Modeling | Unverified | 0 | 0 |
| Disney at IEST 2018: Predicting Emotions using an Ensemble | Oct 1, 2018 | Emotion Classification, Language Modeling | Unverified | 0 | 0 |
| Disrupting Vision-Language Model-Driven Navigation Services via Adversarial Object Fusion | May 29, 2025 | Language Modeling | Unverified | 0 | 0 |
| Distant-supervised Language Model for Detecting Emotional Upsurge on Twitter | Oct 1, 2015 | Language Modeling | Unverified | 0 | 0 |
| Distillation Strategies for Discriminative Speech Recognition Rescoring | Jun 15, 2023 | Language Modeling | Unverified | 0 | 0 |
| Improving Word Embedding Factorization for Compression Using Distilled Nonlinear Neural Decomposition | Oct 2, 2019 | Knowledge Distillation, Language Modeling | Unverified | 0 | 0 |
| Distilling Event Sequence Knowledge From Large Language Models | Jan 14, 2024 | Language Modeling | Unverified | 0 | 0 |
| Distilling Knowledge from Pre-trained Language Models via Text Smoothing | May 8, 2020 | Knowledge Distillation, Language Modeling | Unverified | 0 | 0 |
| Distilling Relation Embeddings from Pretrained Language Models | Nov 1, 2021 | Knowledge Graphs, Language Modeling | Unverified | 0 | 0 |
| Distilling Relation Embeddings from Pre-trained Language Models | Sep 21, 2021 | Knowledge Graphs, Language Modeling | Unverified | 0 | 0 |
| Distilling the Knowledge of BERT for CTC-based ASR | Sep 5, 2022 | Automatic Speech Recognition (ASR) | Unverified | 0 | 0 |
| Distilling Vision-Language Models on Millions of Videos | Jan 11, 2024 | Language Modeling | Unverified | 0 | 0 |
| Distil-xLSTM: Learning Attention Mechanisms through Recurrent Structures | Mar 24, 2025 | Language Modeling | Unverified | 0 | 0 |
| Distinguishing Human Generated Text From ChatGPT Generated Text Using Machine Learning | May 26, 2023 | Language Modeling | Unverified | 0 | 0 |
| Distortion-free Watermarks are not Truly Distortion-free under Watermark Key Collisions | Jun 2, 2024 | Language Modeling | Unverified | 0 | 0 |
| Distraction is All You Need for Multimodal Large Language Model Jailbreaking | Feb 15, 2025 | Language Modeling | Unverified | 0 | 0 |
| Distributed Fine-tuning of Language Models on Private Data | Jan 1, 2018 | General Knowledge, Language Modeling | Unverified | 0 | 0 |
| Distributed representation and estimation of WFST-based n-gram models | Aug 1, 2016 | Automatic Speech Recognition (ASR), Language Modeling | Unverified | 0 | 0 |
| Distributed Representation for Traditional Chinese Medicine Herb via Deep Learning Models | Nov 6, 2017 | Language Modeling | Unverified | 0 | 0 |
| Distributed Threat Intelligence at the Edge Devices: A Large Language Model-Driven Approach | May 14, 2024 | Edge Computing, In-Context Learning | Unverified | 0 | 0 |
| Distributionally Robust Recurrent Decoders with Random Network Distillation | Oct 25, 2021 | Language Modeling | Unverified | 0 | 0 |