MT2KD: Towards A General-Purpose Encoder for Speech, Speaker, and Audio Events (Sep 25, 2024) - Audio Tagging, Automatic Speech Recognition
MTKD: Multi-Teacher Knowledge Distillation for Image Super-Resolution (Apr 15, 2024) - Image Super-Resolution, Knowledge Distillation
MulDE: Multi-teacher Knowledge Distillation for Low-dimensional Knowledge Graph Embeddings (Oct 14, 2020) - Graph Embedding, Knowledge Distillation
Multi-adversarial Faster-RCNN with Paradigm Teacher for Unrestricted Object Detection (Dec 11, 2022) - Domain Adaptation, Knowledge Distillation
Multi-Branch Mutual-Distillation Transformer for EEG-Based Seizure Subtype Classification (Dec 4, 2024) - Electroencephalogram (EEG)
Multi-Channel Multi-Domain based Knowledge Distillation Algorithm for Sleep Staging with Single-Channel EEG (Jan 7, 2024) - EEG, Knowledge Distillation
Cultural Commonsense Knowledge for Intercultural Dialogues (Feb 16, 2024) - Knowledge Distillation, Specificity
Multi-Document Financial Question Answering using LLMs (Nov 8, 2024) - Knowledge Distillation, Knowledge Graphs
Multi-Frame Self-Supervised Depth Estimation with Multi-Scale Feature Fusion in Dynamic Scenes (Mar 26, 2023) - Depth Estimation, Knowledge Distillation
Multi-Frame to Single-Frame: Knowledge Distillation for 3D Object Detection (Sep 24, 2020) - 3D Object Detection, Autonomous Driving
Multi-Grained Knowledge Distillation for Named Entity Recognition (Jun 1, 2021) - Knowledge Distillation, Named Entity Recognition
Multi-Granularity Contrastive Knowledge Distillation for Multimodal Named Entity Recognition (Nov 16, 2021) - Knowledge Distillation, Multi-modal Named Entity Recognition
Multi-Granularity Semantic Revision for Large Language Model Distillation (Jul 14, 2024) - Knowledge Distillation, Language Modeling
Multi-head Knowledge Distillation for Model Compression (Dec 5, 2020) - Image Classification
Multi-label Class Incremental Emotion Decoding with Augmented Emotional Semantics Learning (May 31, 2024) - Class Incremental Learning
Multi-label Contrastive Predictive Coding (Jul 20, 2020) - Knowledge Distillation, Multi-class Classification
Multi-label Emotion Analysis in Conversation via Multimodal Knowledge Distillation (Oct 27, 2023) - Emotion Recognition, Knowledge Distillation
Multi-level Distillation of Semantic Knowledge for Pre-training Multilingual Language Model (Nov 2, 2022) - Knowledge Distillation, Language Modeling
Multilingual Neural Machine Translation: Can Linguistic Hierarchies Help? (Oct 15, 2021) - Knowledge Distillation, Machine Translation
Multi-MLLM Knowledge Distillation for Out-of-Context News Detection (May 28, 2025) - Knowledge Distillation, Misinformation
Multimodal Commonsense Knowledge Distillation for Visual Question Answering (Nov 5, 2024) - Knowledge Distillation, Question Answering
Multi-modal Cross-domain Self-supervised Pre-training for fMRI and EEG Fusion (Sep 27, 2024) - Data Augmentation, EEG
Multi-Modal Few-Shot Object Detection with Meta-Learning-Based Cross-Modal Prompting (Apr 16, 2022) - Few-Shot Learning, Few-Shot Object Detection
Multi-Modality Distillation via Learning the teacher's modality-level Gram Matrix (Dec 21, 2021) - Knowledge Distillation
Multimodal Locally Enhanced Transformer for Continuous Sign Language Recognition (Aug 22, 2023) - Knowledge Distillation, Position
Multimodal Prescriptive Deep Learning (Jan 24, 2025) - Deep Learning, Knowledge Distillation
Multi-Objective Diverse Human Motion Prediction With Knowledge Distillation (Jan 1, 2022) - Autonomous Driving, Diversity
Multi-Person Full Body Pose Estimation (Aug 23, 2020) - Knowledge Distillation, Multi-Person Pose Estimation
Multi-perspective Contrastive Logit Distillation (Nov 16, 2024) - Contrastive Learning, Image Classification
Multiple Degradation and Reconstruction Network for Single Image Denoising via Knowledge Distillation (Apr 29, 2022) - Image Denoising
Multi-Scale Feature Extraction and Fusion for Online Knowledge Distillation (Jun 16, 2022) - Knowledge Distillation, Transfer Learning
Learning to Purification for Unsupervised Person Re-identification (Apr 21, 2022) - Knowledge Distillation, Person Re-Identification
Multi-stage Distillation Framework for Cross-Lingual Semantic Similarity Matching (Nov 16, 2021) - Contrastive Learning, Knowledge Distillation
Multi-stage Progressive Compression of Conformer Transducer for On-device Speech Recognition (Oct 1, 2022) - Automatic Speech Recognition (ASR)
Multi-Strategy Knowledge Distillation Based Teacher-Student Framework for Machine Reading Comprehension (Aug 1, 2021) - Knowledge Distillation, Machine Reading Comprehension
Multitask Emotion Recognition Model with Knowledge Distillation and Task Discriminator (Mar 24, 2022) - Emotion Recognition, Knowledge Distillation
Multi-Task Learning with Knowledge Distillation for Dense Prediction (Jan 1, 2023) - Boundary Detection, Depth Estimation
Multi-Teacher Knowledge Distillation for Incremental Implicitly-Refined Classification (Feb 23, 2022) - Classification, Incremental Learning
Multivariate Prototype Representation for Domain-Generalized Incremental Learning (Sep 24, 2023) - Class Incremental Learning
Multi-View Attention Transfer for Efficient Speech Enhancement (Aug 22, 2022) - Knowledge Distillation, Speech Enhancement
Multi-View Feature Representation for Dialogue Generation with Bidirectional Distillation (Feb 22, 2021) - Dialogue Generation, General Knowledge
Multi-View Knowledge Distillation from Crowd Annotations for Out-of-Domain Generalization (Dec 19, 2022) - Domain Generalization, Knowledge Distillation
Multi-view Knowledge Distillation Transformer for Human Action Recognition (Mar 25, 2023) - Action Recognition, Knowledge Distillation
MUSE: Feature Self-Distillation with Mutual Information and Self-Information (Oct 25, 2021) - Image Classification
MUST: A Multilingual Student-Teacher Learning Approach for Low-Resource Speech Recognition (Oct 29, 2023) - Knowledge Distillation, Speech Recognition
Mutual Adversarial Training: Learning Together Is Better Than Going Alone (Dec 9, 2021) - Knowledge Distillation
Mutual Information Guided Backdoor Mitigation for Pre-trained Encoders (Jun 5, 2024) - Knowledge Distillation, Self-Supervised Learning
Mutual Learning for Finetuning Click-Through Rate Prediction Models (Jun 17, 2024) - Click-Through Rate Prediction, Knowledge Distillation
Mutual-Learning Improves End-to-End Speech Translation (Nov 1, 2021) - Knowledge Distillation, Machine Translation