NLDF: Neural Light Dynamic Fields for Efficient 3D Talking Head Generation (Jun 17, 2024) · Knowledge Distillation, NeRF
No Forgetting Learning: Memory-free Continual Learning (Mar 6, 2025) · Continual Learning, Knowledge Distillation
Noise-Tolerant Few-Shot Unsupervised Adapter for Vision-Language Models (Sep 26, 2023) · Image Classification
Noisy Machines: Understanding Noisy Neural Networks and Enhancing Robustness to Analog Hardware Errors Using Distillation (Jan 14, 2020) · Knowledge Distillation
Noisy Neural Network Compression for Analog Storage Devices (Oct 19, 2020) · Knowledge Distillation, Model Compression
Non-Autoregressive Sign Language Production via Knowledge Distillation (Aug 12, 2022) · Knowledge Distillation, Sign Language Production
Non-target Divergence Hypothesis: Toward Understanding Domain Gaps in Cross-Modal Knowledge Distillation (Sep 4, 2024) · Knowledge Distillation
No One Left Behind: Inclusive Federated Learning over Heterogeneous Devices (Feb 16, 2022) · Federated Learning, Knowledge Distillation
Normalized Feature Distillation for Semantic Segmentation (Jul 12, 2022) · Knowledge Distillation, Model Compression
Not All Knowledge Is Created Equal: Mutual Distillation of Confident Knowledge (Jun 2, 2021) · Knowledge Distillation
Not All Regions are Worthy to be Distilled: Region-aware Knowledge Distillation Towards Efficient Image-to-Image Translation (Sep 29, 2021) · Contrastive Learning
Not to Overfit or Underfit the Source Domains? An Empirical Study of Domain Generalization in Question Answering (May 15, 2022) · Domain Generalization, Knowledge Distillation
NovaCOMET: Open Commonsense Foundation Models with Symbolic Knowledge Distillation (Dec 10, 2023) · Knowledge Distillation
Novel Visual Category Discovery with Dual Ranking Statistics and Mutual Knowledge Distillation (Jul 7, 2021) · Fine-Grained Visual Recognition, Knowledge Distillation
NVIDIA NeMo Neural Machine Translation Systems for English-German and English-Russian News and Biomedical Tasks at WMT21 (Nov 16, 2021) · Data Augmentation, Knowledge Distillation
NxMTransformer: Semi-Structured Sparsification for Natural Language Understanding via ADMM (Oct 28, 2021) · Knowledge Distillation, Natural Language Understanding
NYCU-TWO at Memotion 3: Good Foundation, Good Teacher, then you have Good Meme Analysis (Feb 13, 2023) · Knowledge Distillation, Sentiment Analysis
oBERTa: Improving Sparse Transfer Learning via improved initialization, distillation, and pruning regimes (Mar 30, 2023) · Knowledge Distillation, Model Compression
Object-centric Cross-modal Feature Distillation for Event-based Object Detection (Nov 9, 2023) · Knowledge Distillation
Object-Centric Diffusion for Efficient Video Editing (Jan 11, 2024) · Knowledge Distillation
OccludeNeRF: Geometric-aware 3D Scene Inpainting with Collaborative Score Distillation in NeRF (Apr 1, 2025) · Denoising, Knowledge Distillation
Occlusion-Robust FAU Recognition by Mining Latent Space of Masked Autoencoders (Dec 8, 2022) · Knowledge Distillation
Offline-to-Online Knowledge Distillation for Video Instance Segmentation (Feb 15, 2023) · Data Augmentation, Instance Segmentation
Oh! We Freeze: Improving Quantized Knowledge Distillation via Signal Propagation Analysis for Large Language Models (Mar 26, 2024) · Knowledge Distillation, Quantization
OmniScience: A Domain-Specialized LLM for Scientific Reasoning and Discovery (Mar 22, 2025) · Knowledge Distillation
On Accelerating Edge AI: Optimizing Resource-Constrained Environments (Jan 25, 2025) · Knowledge Distillation, Model Compression
On Compressing U-net Using Knowledge Distillation (Dec 1, 2018) · Knowledge Distillation
Deakin RF-Sensing: Experiments on Correlated Knowledge Distillation for Monitoring Human Postures with Radios (May 24, 2023) · Knowledge Distillation
On-Device Constrained Self-Supervised Speech Representation Learning for Keyword Spotting via Knowledge Distillation (Jul 6, 2023) · Keyword Spotting, Knowledge Distillation
On Distilling the Displacement Knowledge for Few-Shot Class-Incremental Learning (Dec 15, 2024) · Class-Incremental Learning
One Category One Prompt: Dataset Distillation using Diffusion Models (Mar 11, 2024) · Dataset Distillation, Knowledge Distillation
One-Class Knowledge Distillation for Spoofing Speech Detection (Sep 15, 2023) · Binary Classification, Knowledge Distillation
On effects of Knowledge Distillation on Transfer Learning (Oct 18, 2022) · Image Classification
One General Teacher for Multi-Data Multi-Task: A New Knowledge Distillation Framework for Discourse Relation Analysis (Nov 16, 2021) · Knowledge Distillation, Multi-Task Learning
On Elastic Language Models (Nov 13, 2023) · Information Retrieval, Knowledge Distillation
One-Shot Federated Learning for LEO Constellations that Reduces Convergence Time from Days to 90 Minutes (May 21, 2023) · Federated Learning, Knowledge Distillation
On Estimating the Training Cost of Conversational Recommendation Systems (Nov 10, 2020) · Conversational Recommendation, Knowledge Distillation
One-stop Training of Multiple Capacity Models (May 23, 2023) · Knowledge Distillation, Machine Translation
One Student Knows All Experts Know: From Sparse to Dense (Jan 26, 2022) · Knowledge Distillation
One Teacher is Enough? Pre-trained Language Model Distillation from Multiple Teachers (Jun 2, 2021) · Knowledge Distillation, Language Modeling
On Explaining Knowledge Distillation: Measuring and Visualising the Knowledge Transfer Process (Dec 18, 2024) · Knowledge Distillation, Transfer Learning
On Generalizing Beyond Domains in Cross-Domain Continual Learning (Mar 8, 2022) · Continual Learning, Knowledge Distillation
On Good Practices for Task-Specific Distillation of Large Pretrained Visual Models (Feb 17, 2024) · Data Augmentation, Knowledge Distillation
On Knowledge Distillation for Direct Speech Translation (Dec 9, 2020) · Automatic Speech Recognition (ASR)
On Knowledge Distillation for Translating Erroneous Speech Transcriptions (Aug 1, 2021) · Automatic Speech Recognition (ASR)
On Knowledge distillation from complex networks for response prediction (Jun 1, 2019) · Knowledge Distillation, Question Answering
Online Continual Learning For Visual Food Classification (Aug 15, 2021) · Classification, Continual Learning
Online Continual Learning via the Meta-learning Update with Multi-scale Knowledge Distillation and Data Augmentation (Sep 12, 2022) · Continual Learning, Data Augmentation
Online Cross-Layer Knowledge Distillation on Graph Neural Networks with Deep Supervision (Oct 25, 2022) · Knowledge Distillation, Model Compression