Mutual Learning of Single- and Multi-Channel End-to-End Neural Diarization (Oct 7, 2022). Tags: Knowledge Distillation, Speaker Diarization
Mutually-paced Knowledge Distillation for Cross-lingual Temporal Knowledge Graph Reasoning (Mar 27, 2023). Tags: Knowledge Distillation, Knowledge Graphs
MVKT-ECG: Efficient Single-lead ECG Classification on Multi-Label Arrhythmia by Multi-View Knowledge Transferring (Jan 28, 2023). Tags: Diagnostic, ECG Classification
NAIST English-to-Japanese Simultaneous Translation System for IWSLT 2021 Simultaneous Text-to-text Task (Aug 1, 2021). Tags: Knowledge Distillation, Machine Translation
Narrowing the Coordinate-frame Gap in Behavior Prediction Models: Distillation for Efficient and Accurate Scene-centric Motion Forecasting (Jun 8, 2022). Tags: Autonomous Driving, Knowledge Distillation
NaturalReasoning: Reasoning in the Wild with 2.8M Challenging Questions (Feb 18, 2025). Tags: Knowledge Distillation, Math
Natural Statistics of Network Activations and Implications for Knowledge Distillation (Jun 1, 2021). Tags: Knowledge Distillation
Nearest Neighbor Knowledge Distillation for Neural Machine Translation (Jan 16, 2022). Tags: Knowledge Distillation, Machine Translation
Neighbourhood Distillation: On the benefits of non end-to-end distillation (Oct 2, 2020). Tags: Knowledge Distillation, Neural Architecture Search
NEO-KD: Knowledge-Distillation-Based Adversarial Training for Robust Multi-Exit Neural Networks (Nov 1, 2023). Tags: Knowledge Distillation
NestedNet: Learning Nested Sparse Structures in Deep Neural Networks (Dec 11, 2017). Tags: Knowledge Distillation, Scheduling
Network-Agnostic Knowledge Transfer for Medical Image Segmentation (Jan 23, 2021). Tags: Image Segmentation, Knowledge Distillation
Reconstructing Pruned Filters using Cheap Spatial Transformations (Oct 25, 2021). Tags: Feature Compression, Knowledge Distillation
Neural Architecture Search for Effective Teacher-Student Knowledge Transfer in Language Models (Mar 16, 2023). Tags: CoLA, CPU
Neural Architecture Search via Ensemble-based Knowledge Distillation (Sep 29, 2021). Tags: Diversity, Knowledge Distillation
Neural Collapse Inspired Knowledge Distillation (Dec 16, 2024). Tags: Knowledge Distillation
Neural Compatibility Modeling with Attentive Knowledge Distillation (Apr 17, 2018). Tags: Image Classification
Neural Machine Translation from Simplified Translations (Dec 19, 2016). Tags: Knowledge Distillation, Machine Translation
NeuroComparatives: Neuro-Symbolic Distillation of Comparative Knowledge (May 8, 2023). Tags: Knowledge Distillation
New Perspective on Progressive GANs Distillation for One-class Novelty Detection (Sep 15, 2021). Tags: Decoder, Generative Adversarial Network
NewsBERT: Distilling Pre-trained Language Model for Intelligent News Application (Feb 9, 2021). Tags: Articles, Knowledge Distillation
NICEST: Noisy Label Correction and Training for Robust Scene Graph Generation (Jul 27, 2022). Tags: Graph Generation, Knowledge Distillation
Nickel and Diming Your GAN: A Dual-Method Approach to Enhancing GAN Efficiency via Knowledge Distillation (May 19, 2024). Tags: Knowledge Distillation
NIFF: Alleviating Forgetting in Generalized Few-Shot Object Detection via Neural Instance Feature Forging (Mar 9, 2023). Tags: Data-free Knowledge Distillation, Few-Shot Object Detection
NLDF: Neural Light Dynamic Fields for Efficient 3D Talking Head Generation (Jun 17, 2024). Tags: Knowledge Distillation, NeRF
No Forgetting Learning: Memory-free Continual Learning (Mar 6, 2025). Tags: Continual Learning, Knowledge Distillation
Noise-Tolerant Few-Shot Unsupervised Adapter for Vision-Language Models (Sep 26, 2023). Tags: Image Classification
Noisy Machines: Understanding Noisy Neural Networks and Enhancing Robustness to Analog Hardware Errors Using Distillation (Jan 14, 2020). Tags: Knowledge Distillation
Noisy Neural Network Compression for Analog Storage Devices (Oct 19, 2020). Tags: Knowledge Distillation, Model Compression
Non-Autoregressive Sign Language Production via Knowledge Distillation (Aug 12, 2022). Tags: Knowledge Distillation, Sign Language Production
Non-target Divergence Hypothesis: Toward Understanding Domain Gaps in Cross-Modal Knowledge Distillation (Sep 4, 2024). Tags: Knowledge Distillation
No One Left Behind: Inclusive Federated Learning over Heterogeneous Devices (Feb 16, 2022). Tags: Federated Learning, Knowledge Distillation
Normalized Feature Distillation for Semantic Segmentation (Jul 12, 2022). Tags: Knowledge Distillation, Model Compression
Not All Knowledge Is Created Equal: Mutual Distillation of Confident Knowledge (Jun 2, 2021). Tags: Knowledge Distillation
Not All Regions are Worthy to be Distilled: Region-aware Knowledge Distillation Towards Efficient Image-to-Image Translation (Sep 29, 2021). Tags: Contrastive Learning
Not to Overfit or Underfit the Source Domains? An Empirical Study of Domain Generalization in Question Answering (May 15, 2022). Tags: Domain Generalization, Knowledge Distillation
NovaCOMET: Open Commonsense Foundation Models with Symbolic Knowledge Distillation (Dec 10, 2023). Tags: Knowledge Distillation
Novel Visual Category Discovery with Dual Ranking Statistics and Mutual Knowledge Distillation (Jul 7, 2021). Tags: Fine-Grained Visual Recognition, Knowledge Distillation
NVIDIA NeMo's Neural Machine Translation Systems for English-German and English-Russian News and Biomedical Tasks at WMT21 (Nov 16, 2021). Tags: Data Augmentation, Knowledge Distillation