Enhancing Metaphor Detection through Soft Labels and Target Word Prediction (Mar 27, 2024) [Knowledge Distillation, Prompt Learning]
Measuring and Reducing Model Update Regression in Structured Prediction for NLP (Feb 7, 2022) [Dependency Parsing, Knowledge Distillation]
Medical Image Segmentation on MRI Images with Missing Modalities: A Review (Mar 11, 2022) [Image Generation, Image Segmentation]
MEDIC: Remove Model Backdoors via Importance Driven Cloning (Jan 1, 2023) [Knowledge Distillation, Model]
MedMAP: Promoting Incomplete Multi-modal Brain Tumor Segmentation with Alignment (Aug 18, 2024) [Brain Tumor Segmentation, Domain Adaptation]
MED-TEX: Transferring and Explaining Knowledge with Less Data from Pretrained Medical Imaging Models (Aug 6, 2020) [Image Classification]
Membership Privacy Protection for Image Translation Models via Adversarial Knowledge Distillation (Mar 10, 2022) [Image-to-Image Translation, Inference Attack]
MentalMAC: Enhancing Large Language Models for Detecting Mental Manipulation via Multi-Task Anti-Curriculum Distillation (May 21, 2025) [Knowledge Distillation]
MergeDistill: Merging Pre-trained Language Models using Distillation (Jun 5, 2021) [Cross-Lingual Transfer, Knowledge Distillation]
MergeNet: Knowledge Migration across Heterogeneous Models, Tasks, and Modalities (Apr 20, 2024) [Knowledge Distillation, Transfer Learning]
MetaDistiller: Network Self-Boosting via Meta-Learned Top-Down Distillation (Aug 27, 2020) [Knowledge Distillation, Meta-Learning]
Meta-Ensemble Parameter Learning (Oct 5, 2022) [Knowledge Distillation, Meta-Learning]
Meta-KD: A Meta Knowledge Distillation Framework for Language Model Compression across Domains (Dec 2, 2020) [Knowledge Distillation, Language Modeling]
Meta Knowledge Distillation (Feb 16, 2022) [Data Augmentation, Image Classification]
Meta-Learning across Meta-Tasks for Few-Shot Learning (Feb 11, 2020) [Domain Adaptation, Few-Shot Learning]
MetaMixer: A Regularization Strategy for Online Knowledge Distillation (Mar 14, 2023) [Knowledge Distillation]
MH-pFLID: Model Heterogeneous personalized Federated Learning via Injection and Distillation for Medical Data Analysis (May 10, 2024) [Federated Learning, Knowledge Distillation]
MIAShield: Defending Membership Inference Attacks via Preemptive Exclusion of Members (Mar 2, 2022) [Image Classification]
MICIK: MIning Cross-Layer Inherent Similarity Knowledge for Deep Model Compression (Feb 3, 2019) [Knowledge Distillation, Model Compression]
Microdosing: Knowledge Distillation for GAN based Compression (Jan 7, 2022) [Knowledge Distillation, Video Compression]
Microsoft Research Asia's Systems for WMT19 (Nov 7, 2019) [Data Augmentation, Knowledge Distillation]
MIKO: Multimodal Intention Knowledge Distillation from Large Language Models for Social-Media Commonsense Discovery (Feb 28, 2024) [Knowledge Distillation, Language Modeling]
Mimic and Conquer: Heterogeneous Tree Structure Distillation for Syntactic NLP (Sep 16, 2020) [Knowledge Distillation]
MIND: Modality-Informed Knowledge Distillation Framework for Multimodal Clinical Prediction Tasks (Feb 3, 2025) [Imputation, Knowledge Distillation]
Mind the Gap Between Synthetic and Real: Utilizing Transfer Learning to Probe the Boundaries of Stable Diffusion Generated Data (May 6, 2024) [Data-free Knowledge Distillation, Knowledge Distillation]
Mind the Gap: Promoting Missing Modality Brain Tumor Segmentation with Alignment (Sep 28, 2024) [Brain Tumor Segmentation, Knowledge Distillation]
Minimally Invasive Surgery for Sparse Neural Networks in Contrastive Manner (Jun 19, 2021) [Knowledge Distillation, Model Compression]
Mini-ResEmoteNet: Leveraging Knowledge Distillation for Human-Centered Design (Jan 30, 2025) [Emotion Recognition, Facial Emotion Recognition]
MiniVLN: Efficient Vision-and-Language Navigation by Progressive Knowledge Distillation (Sep 27, 2024) [Knowledge Distillation, Vision and Language Navigation]
MinT: Boosting Generalization in Mathematical Reasoning via Multi-View Fine-Tuning (Jul 16, 2023) [Knowledge Distillation, Mathematical Reasoning]
Mitigating Cross-client GANs-based Attack in Federated Learning (Jul 25, 2023) [Data-free Knowledge Distillation, Federated Learning]
Mitigating Gender Bias in Distilled Language Models via Counterfactual Role Reversal (Mar 23, 2022) [Counterfactual, Fairness]
Mitigating Hallucination with ZeroG: An Advanced Knowledge Management Engine (Nov 8, 2024) [Computational Efficiency, Hallucination]
Mixed Distillation Helps Smaller Language Model Better Reasoning (Dec 17, 2023) [Knowledge Distillation, Language Modeling]
Mixed-Type Wafer Classification For Low Memory Devices Using Knowledge Distillation (Mar 24, 2023) [Knowledge Distillation, Lightweight Deployment]
MixKD: Towards Efficient Distillation of Large-scale Language Models (Nov 1, 2020) [Data Augmentation, Knowledge Distillation]
A Guide To Effectively Leveraging LLMs for Low-Resource Text Summarization: Data Augmentation and Semi-supervised Approaches (Jul 10, 2024) [Abstractive Text Summarization, Data Augmentation]
MKF-ADS: Multi-Knowledge Fusion Based Self-supervised Anomaly Detection System for Control Area Network (Mar 7, 2024) [Anomaly Detection, Intrusion Detection]
MK-SGN: A Spiking Graph Convolutional Network with Multimodal Fusion and Knowledge Distillation for Skeleton-based Action Recognition (Apr 16, 2024) [Action Recognition, Knowledge Distillation]
MLKD-BERT: Multi-level Knowledge Distillation for Pre-trained Language Models (Jul 3, 2024) [Extractive Question-Answering, Knowledge Distillation]
Multimodal Matching-aware Co-attention Networks with Mutual Knowledge Distillation for Fake News Detection (Dec 12, 2022) [Fake News Detection, Image-Text Matching]
MOBA: Multi-teacher Model Based Reinforcement Learning (Sep 29, 2021) [Decision Making, Knowledge Distillation]
MobileVOS: Real-Time Video Object Segmentation Contrastive Learning meets Knowledge Distillation (Mar 14, 2023) [Contrastive Learning, Knowledge Distillation]
Modality-Inconsistent Continual Learning of Multimodal Large Language Models (Dec 17, 2024) [Continual Learning, Knowledge Distillation]
ModalityMirror: Improving Audio Classification in Modality Heterogeneity Federated Learning with Multimodal Distillation (Aug 28, 2024) [Audio Classification, Federated Learning]
MSD: Saliency-aware Knowledge Distillation for Multimodal Understanding (Jan 6, 2021) [Knowledge Distillation, Meta-Learning]
Modality-specific Distillation (Jun 1, 2021) [Knowledge Distillation, Meta-Learning]
Model-Agnostic Decentralized Collaborative Learning for On-Device POI Recommendation (Apr 8, 2023) [Knowledge Distillation, Privacy Preserving]
Model Compression and Efficient Inference for Large Language Models: A Survey (Feb 15, 2024) [Knowledge Distillation, Model Compression]
Model compression for faster structural separation of macromolecules captured by Cellular Electron Cryo-Tomography (Jan 31, 2018) [Classification, General Classification]