Low-Resolution Object Recognition with Cross-Resolution Relational Contrastive Distillation | Sep 4, 2024 | Face Recognition, Knowledge Distillation
Low Resource Causal Event Detection from Biomedical Literature | May 1, 2022 | Event Detection, Knowledge Distillation
Low-resource Low-footprint Wake-word Detection using Knowledge Distillation | Jul 6, 2022 | Knowledge Distillation, Speech Recognition
LRC-BERT: Latent-representation Contrastive Knowledge Distillation for Natural Language Understanding | Dec 14, 2020 | Contrastive Learning, Knowledge Distillation
LRSpeech: Extremely Low-Resource Speech Synthesis and Recognition | Aug 9, 2020 | Automatic Speech Recognition (ASR)
LTD: Low Temperature Distillation for Robust Adversarial Training | Nov 3, 2021 | Knowledge Distillation
M2KD: Multi-model and Multi-level Knowledge Distillation for Incremental Learning | Apr 3, 2019 | Incremental Learning, Knowledge Distillation
MadEye: Boosting Live Video Analytics Accuracy with Adaptive Camera Configurations | Apr 4, 2023 | Knowledge Distillation
Making Neural Machine Reading Comprehension Faster | Mar 29, 2019 | Knowledge Distillation, Machine Reading Comprehension
Making Small Language Models Better Few-Shot Learners | Nov 16, 2021 | Few-Shot Learning, Knowledge Distillation
Mamba base PKD for efficient knowledge compression | Mar 3, 2025 | Image Classification
MambaLiteSR: Image Super-Resolution with Low-Rank Mamba using Knowledge Distillation | Feb 19, 2025 | Image Super-Resolution, Knowledge Distillation
Many-to-One Knowledge Distillation of Real-Time Epileptic Seizure Detection for Low-Power Wearable Internet of Things Systems | Jul 20, 2022 | Edge Computing, Knowledge Distillation
MapDistill: Boosting Efficient Camera-based HD Map Construction via Camera-LiDAR Fusion Model Distillation | Jul 16, 2024 | Autonomous Driving, Knowledge Distillation
Map-Free Trajectory Prediction with Map Distillation and Hierarchical Encoding | Nov 17, 2024 | Autonomous Vehicles, Decoder
Marine Saliency Segmenter: Object-Focused Conditional Diffusion with Region-Level Semantic Knowledge Distillation | Apr 3, 2025 | Knowledge Distillation, Segmentation
Markowitz Meets Bellman: Knowledge-distilled Reinforcement Learning for Portfolio Management | May 8, 2024 | Knowledge Distillation, Management
Masked Autoencoders Are Stronger Knowledge Distillers | Jan 1, 2023 | Decoder, Knowledge Distillation
The Role of Masking for Efficient Supervised Knowledge Distillation of Vision Transformers | Feb 21, 2023 | Knowledge Distillation
Masked Modeling Duo for Speech: Specializing General-Purpose Audio Representation to Speech using Denoising Distillation | May 23, 2023 | Denoising, Knowledge Distillation
Matching Distributions between Model and Data: Cross-domain Knowledge Distillation for Unsupervised Domain Adaptation | Aug 1, 2021 | Cross-Domain Text Classification, Domain Adaptation
Maximizing Discrimination Capability of Knowledge Distillation with Energy Function | Nov 24, 2023 | Data Augmentation, Knowledge Distillation
Maximum Likelihood Distillation for Robust Modulation Classification | Nov 1, 2022 | Classification, Knowledge Distillation
MCF-VC: Mitigate Catastrophic Forgetting in Class-Incremental Learning for Multimodal Video Captioning | Feb 27, 2024 | Class-Incremental Learning
Enhancing Metaphor Detection through Soft Labels and Target Word Prediction | Mar 27, 2024 | Knowledge Distillation, Prompt Learning
Measuring and Reducing Model Update Regression in Structured Prediction for NLP | Feb 7, 2022 | Dependency Parsing, Knowledge Distillation
Medical Image Segmentation on MRI Images with Missing Modalities: A Review | Mar 11, 2022 | Image Generation, Image Segmentation
MEDIC: Remove Model Backdoors via Importance Driven Cloning | Jan 1, 2023 | Knowledge Distillation, Model
MedMAP: Promoting Incomplete Multi-modal Brain Tumor Segmentation with Alignment | Aug 18, 2024 | Brain Tumor Segmentation, Domain Adaptation
MED-TEX: Transferring and Explaining Knowledge with Less Data from Pretrained Medical Imaging Models | Aug 6, 2020 | Image Classification
Membership Privacy Protection for Image Translation Models via Adversarial Knowledge Distillation | Mar 10, 2022 | Image-to-Image Translation, Inference Attack
MentalMAC: Enhancing Large Language Models for Detecting Mental Manipulation via Multi-Task Anti-Curriculum Distillation | May 21, 2025 | Knowledge Distillation
MergeDistill: Merging Pre-trained Language Models using Distillation | Jun 5, 2021 | Cross-Lingual Transfer, Knowledge Distillation
MergeNet: Knowledge Migration across Heterogeneous Models, Tasks, and Modalities | Apr 20, 2024 | Knowledge Distillation, Transfer Learning
MetaDistiller: Network Self-Boosting via Meta-Learned Top-Down Distillation | Aug 27, 2020 | Knowledge Distillation, Meta-Learning
Meta-Ensemble Parameter Learning | Oct 5, 2022 | Knowledge Distillation, Meta-Learning
Meta-KD: A Meta Knowledge Distillation Framework for Language Model Compression across Domains | Dec 2, 2020 | Knowledge Distillation, Language Modeling
Meta Knowledge Distillation | Feb 16, 2022 | Data Augmentation, Image Classification
Meta-Learning across Meta-Tasks for Few-Shot Learning | Feb 11, 2020 | Domain Adaptation, Few-Shot Learning
MetaMixer: A Regularization Strategy for Online Knowledge Distillation | Mar 14, 2023 | Knowledge Distillation
MH-pFLID: Model Heterogeneous personalized Federated Learning via Injection and Distillation for Medical Data Analysis | May 10, 2024 | Federated Learning, Knowledge Distillation
MIAShield: Defending Membership Inference Attacks via Preemptive Exclusion of Members | Mar 2, 2022 | Image Classification
MICIK: MIning Cross-Layer Inherent Similarity Knowledge for Deep Model Compression | Feb 3, 2019 | Knowledge Distillation, Model Compression
Microdosing: Knowledge Distillation for GAN based Compression | Jan 7, 2022 | Knowledge Distillation, Video Compression
Microsoft Research Asia's Systems for WMT19 | Nov 7, 2019 | Data Augmentation, Knowledge Distillation
MIKO: Multimodal Intention Knowledge Distillation from Large Language Models for Social-Media Commonsense Discovery | Feb 28, 2024 | Knowledge Distillation, Language Modeling
Mimic and Conquer: Heterogeneous Tree Structure Distillation for Syntactic NLP | Sep 16, 2020 | Knowledge Distillation
MIND: Modality-Informed Knowledge Distillation Framework for Multimodal Clinical Prediction Tasks | Feb 3, 2025 | Imputation, Knowledge Distillation
Mind the Gap Between Synthetic and Real: Utilizing Transfer Learning to Probe the Boundaries of Stable Diffusion Generated Data | May 6, 2024 | Data-Free Knowledge Distillation, Knowledge Distillation
Mind the Gap: Promoting Missing Modality Brain Tumor Segmentation with Alignment | Sep 28, 2024 | Brain Tumor Segmentation, Knowledge Distillation