- Multi-Modality Distillation via Learning the teacher's modality-level Gram Matrix (Dec 21, 2021): Knowledge Distillation
- Multimodal Locally Enhanced Transformer for Continuous Sign Language Recognition (Aug 22, 2023): Knowledge Distillation, Position
- Multimodal Prescriptive Deep Learning (Jan 24, 2025): Deep Learning, Knowledge Distillation
- Multi-Objective Diverse Human Motion Prediction With Knowledge Distillation (Jan 1, 2022): Autonomous Driving, Diversity
- Multi-Person Full Body Pose Estimation (Aug 23, 2020): Knowledge Distillation, Multi-Person Pose Estimation
- Multi-perspective Contrastive Logit Distillation (Nov 16, 2024): Contrastive Learning, Image Classification
- Multiple Degradation and Reconstruction Network for Single Image Denoising via Knowledge Distillation (Apr 29, 2022): Denoising, Image Denoising
- Multi-scale Feature Extraction and Fusion for Online Knowledge Distillation (Jun 16, 2022): Knowledge Distillation, Transfer Learning
- Learning to Purification for Unsupervised Person Re-identification (Apr 21, 2022): Knowledge Distillation, Person Re-Identification
— Unverified 0Multi-stage Progressive Compression of Conformer Transducer for On-device Speech Recognition Oct 1, 2022 Automatic Speech Recognition Automatic Speech Recognition (ASR)
— Unverified 0Multi-Strategy Knowledge Distillation Based Teacher-Student Framework for Machine Reading Comprehension Aug 1, 2021 Knowledge Distillation Machine Reading Comprehension
— Unverified 0Multitask Emotion Recognition Model with Knowledge Distillation and Task Discriminator Mar 24, 2022 Emotion Recognition Knowledge Distillation
— Unverified 0Multi-Task Learning with Knowledge Distillation for Dense Prediction Jan 1, 2023 Boundary Detection Depth Estimation
— Unverified 0Multi-Teacher Knowledge Distillation for Incremental Implicitly-Refined Classification Feb 23, 2022 Classification Incremental Learning
— Unverified 0Multivariate Prototype Representation for Domain-Generalized Incremental Learning Sep 24, 2023 class-incremental learning Class Incremental Learning
— Unverified 0Multi-View Attention Transfer for Efficient Speech Enhancement Aug 22, 2022 Knowledge Distillation Speech Enhancement
— Unverified 0Multi-View Feature Representation for Dialogue Generation with Bidirectional Distillation Feb 22, 2021 Dialogue Generation General Knowledge
— Unverified 0Multi-View Knowledge Distillation from Crowd Annotations for Out-of-Domain Generalization Dec 19, 2022 Domain Generalization Knowledge Distillation
— Unverified 0Multi-view knowledge distillation transformer for human action recognition Mar 25, 2023 Action Recognition Knowledge Distillation
- MUSE: Feature Self-Distillation with Mutual Information and Self-Information (Oct 25, 2021): Image Classification
- MUST: A Multilingual Student-Teacher Learning approach for low-resource speech recognition (Oct 29, 2023): Knowledge Distillation, Speech Recognition
- Mutual Adversarial Training: Learning together is better than going alone (Dec 9, 2021): Knowledge Distillation
- Mutual Information Guided Backdoor Mitigation for Pre-trained Encoders (Jun 5, 2024): Knowledge Distillation, Self-Supervised Learning
- Mutual Learning for Finetuning Click-Through Rate Prediction Models (Jun 17, 2024): Click-Through Rate Prediction, Knowledge Distillation
- Mutual-Learning Improves End-to-End Speech Translation (Nov 1, 2021): Knowledge Distillation, Machine Translation
- Mutual Learning of Single- and Multi-Channel End-to-End Neural Diarization (Oct 7, 2022): Knowledge Distillation, Speaker Diarization
- Mutually-paced Knowledge Distillation for Cross-lingual Temporal Knowledge Graph Reasoning (Mar 27, 2023): Knowledge Distillation, Knowledge Graphs
- MVKT-ECG: Efficient Single-lead ECG Classification on Multi-Label Arrhythmia by Multi-View Knowledge Transferring (Jan 28, 2023): Diagnostic, ECG Classification
- NAIST English-to-Japanese Simultaneous Translation System for IWSLT 2021 Simultaneous Text-to-text Task (Aug 1, 2021): Knowledge Distillation, Machine Translation
- Narrowing the Coordinate-frame Gap in Behavior Prediction Models: Distillation for Efficient and Accurate Scene-centric Motion Forecasting (Jun 8, 2022): Autonomous Driving, Knowledge Distillation
- NaturalReasoning: Reasoning in the Wild with 2.8M Challenging Questions (Feb 18, 2025): Knowledge Distillation, Math
- Natural Statistics of Network Activations and Implications for Knowledge Distillation (Jun 1, 2021): Knowledge Distillation
- Nearest Neighbor Knowledge Distillation for Neural Machine Translation (Jan 16, 2022): Knowledge Distillation, Machine Translation
- Neighbourhood Distillation: On the benefits of non end-to-end distillation (Oct 2, 2020): Knowledge Distillation, Neural Architecture Search
- NEO-KD: Knowledge-Distillation-Based Adversarial Training for Robust Multi-Exit Neural Networks (Nov 1, 2023): Knowledge Distillation
- NestedNet: Learning Nested Sparse Structures in Deep Neural Networks (Dec 11, 2017): Knowledge Distillation, Scheduling
- Network-Agnostic Knowledge Transfer for Medical Image Segmentation (Jan 23, 2021): Image Segmentation, Knowledge Distillation
- Reconstructing Pruned Filters using Cheap Spatial Transformations (Oct 25, 2021): Feature Compression, Knowledge Distillation
- Neural Architecture Search for Effective Teacher-Student Knowledge Transfer in Language Models (Mar 16, 2023): CoLA, CPU
- Neural Architecture Search via Ensemble-based Knowledge Distillation (Sep 29, 2021): Diversity, Knowledge Distillation
- Neural Collapse Inspired Knowledge Distillation (Dec 16, 2024): Knowledge Distillation
- Neural Compatibility Modeling with Attentive Knowledge Distillation (Apr 17, 2018): Image Classification
- Neural Machine Translation from Simplified Translations (Dec 19, 2016): Knowledge Distillation, Machine Translation
- NeuroComparatives: Neuro-Symbolic Distillation of Comparative Knowledge (May 8, 2023): Knowledge Distillation
- New Perspective on Progressive GANs Distillation for One-class Novelty Detection (Sep 15, 2021): Decoder, Generative Adversarial Network
- NewsBERT: Distilling Pre-trained Language Model for Intelligent News Application (Feb 9, 2021): Articles, Knowledge Distillation
- NICEST: Noisy Label Correction and Training for Robust Scene Graph Generation (Jul 27, 2022): Graph Generation, Knowledge Distillation
- Nickel and Diming Your GAN: A Dual-Method Approach to Enhancing GAN Efficiency via Knowledge Distillation (May 19, 2024): Knowledge Distillation
- NIFF: Alleviating Forgetting in Generalized Few-Shot Object Detection via Neural Instance Feature Forging (Mar 9, 2023): Data-free Knowledge Distillation, Few-Shot Object Detection