Minimally Invasive Surgery for Sparse Neural Networks in Contrastive Manner (Jun 19, 2021): Knowledge Distillation, Model Compression
Mini-ResEmoteNet: Leveraging Knowledge Distillation for Human-Centered Design (Jan 30, 2025): Emotion Recognition, Facial Emotion Recognition
MiniVLN: Efficient Vision-and-Language Navigation by Progressive Knowledge Distillation (Sep 27, 2024): Knowledge Distillation, Vision and Language Navigation
MinT: Boosting Generalization in Mathematical Reasoning via Multi-View Fine-Tuning (Jul 16, 2023): Knowledge Distillation, Mathematical Reasoning
Mitigating Cross-client GANs-based Attack in Federated Learning (Jul 25, 2023): Data-free Knowledge Distillation, Federated Learning
Mitigating Gender Bias in Distilled Language Models via Counterfactual Role Reversal (Mar 23, 2022): counterfactual, Fairness
Mitigating Hallucination with ZeroG: An Advanced Knowledge Management Engine (Nov 8, 2024): Computational Efficiency, Hallucination
Mixed Distillation Helps Smaller Language Model Better Reasoning (Dec 17, 2023): Knowledge Distillation, Language Modeling
Mixed-Type Wafer Classification For Low Memory Devices Using Knowledge Distillation (Mar 24, 2023): Knowledge Distillation, Lightweight Deployment
MixKD: Towards Efficient Distillation of Large-scale Language Models (Nov 1, 2020): Data Augmentation, Knowledge Distillation
A Guide To Effectively Leveraging LLMs for Low-Resource Text Summarization: Data Augmentation and Semi-supervised Approaches (Jul 10, 2024): Abstractive Text Summarization, Data Augmentation
MKF-ADS: Multi-Knowledge Fusion Based Self-supervised Anomaly Detection System for Control Area Network (Mar 7, 2024): Anomaly Detection, Intrusion Detection
MK-SGN: A Spiking Graph Convolutional Network with Multimodal Fusion and Knowledge Distillation for Skeleton-based Action Recognition (Apr 16, 2024): Action Recognition, Knowledge Distillation
MLKD-BERT: Multi-level Knowledge Distillation for Pre-trained Language Models (Jul 3, 2024): Extractive Question-Answering, Knowledge Distillation
Multimodal Matching-aware Co-attention Networks with Mutual Knowledge Distillation for Fake News Detection (Dec 12, 2022): Fake News Detection, Image-text matching
MOBA: Multi-teacher Model Based Reinforcement Learning (Sep 29, 2021): Decision Making, Knowledge Distillation
MobileVOS: Real-Time Video Object Segmentation Contrastive Learning meets Knowledge Distillation (Mar 14, 2023): Contrastive Learning, Knowledge Distillation
Modality-Inconsistent Continual Learning of Multimodal Large Language Models (Dec 17, 2024): Continual Learning, Knowledge Distillation
ModalityMirror: Improving Audio Classification in Modality Heterogeneity Federated Learning with Multimodal Distillation (Aug 28, 2024): Audio Classification, Federated Learning
MSD: Saliency-aware Knowledge Distillation for Multimodal Understanding (Jan 6, 2021): Knowledge Distillation, Meta-Learning
Modality-specific Distillation (Jun 1, 2021): Knowledge Distillation, Meta-Learning
Model-Agnostic Decentralized Collaborative Learning for On-Device POI Recommendation (Apr 8, 2023): Knowledge Distillation, Privacy Preserving
Model Compression and Efficient Inference for Large Language Models: A Survey (Feb 15, 2024): Knowledge Distillation, Model Compression
Model compression for faster structural separation of macromolecules captured by Cellular Electron Cryo-Tomography (Jan 31, 2018): Classification, General Classification
Model Compression for Resource-Constrained Mobile Robots (Jul 20, 2022): Knowledge Distillation, model
Model Compression Methods for YOLOv5: A Review (Jul 21, 2023): Knowledge Distillation, model
Model compression using knowledge distillation with integrated gradients (Jun 17, 2025): Data Augmentation, Knowledge Distillation
Model Compression Using Optimal Transport (Dec 7, 2020): Image Classification
Model Compression with Multi-Task Knowledge Distillation for Web-scale Question Answering System (Apr 21, 2019): Knowledge Distillation, Model Compression
Model Compression with Two-stage Multi-teacher Knowledge Distillation for Web Question Answering System (Oct 18, 2019): General Knowledge, Knowledge Distillation
Model Distillation for Faithful Explanations of Medical Code Predictions (May 1, 2022): Decision Making, Knowledge Distillation
Model Distillation with Knowledge Transfer from Face Classification to Alignment and Verification (Sep 9, 2017): Classification, Face Recognition
On Cross-Layer Alignment for Model Fusion of Heterogeneous Neural Networks (Oct 29, 2021): Knowledge Distillation, Model Compression
A Light-weight Deep Human Activity Recognition Algorithm Using Multi-knowledge Distillation (Jul 6, 2021): Activity Recognition, Classification
Modeling Teacher-Student Techniques in Deep Neural Networks for Knowledge Distillation (Dec 31, 2019): Knowledge Distillation
Model Mimic Attack: Knowledge Distillation for Provably Transferable Adversarial Examples (Oct 21, 2024): Knowledge Distillation
Out of Thin Air: Exploring Data-Free Adversarial Robustness Distillation (Mar 21, 2023): Adversarial Robustness, Knowledge Distillation
Model Stitching by Functional Latent Alignment (May 26, 2025): Knowledge Distillation, model
Modifying Final Splits of Classification Tree for Fine-tuning Subpopulation Target in Policy Making (Feb 20, 2025): Knowledge Distillation
Modular Transformers: Compressing Transformers into Modularized Layers for Flexible Efficient Inference (Jun 4, 2023): Decoder, Knowledge Distillation
MoEBERT: from BERT to Mixture-of-Experts via Importance-Guided Adaptation (Jan 16, 2022): Knowledge Distillation, Mixture-of-Experts
MoE-Pruner: Pruning Mixture-of-Experts Large Language Model using the Hints from Its Router (Oct 15, 2024): Knowledge Distillation, Language Modeling
MoKD: Multi-Task Optimization for Knowledge Distillation (May 13, 2025): Image Classification
MoLe-VLA: Dynamic Layer-skipping Vision Language Action Model via Mixture-of-Layers for Efficient Robot Manipulation (Mar 26, 2025): Knowledge Distillation, Mixture-of-Experts
Momentum Adversarial Distillation: Handling Large Distribution Shifts in Data-Free Knowledge Distillation (Sep 21, 2022): Data-free Knowledge Distillation, Knowledge Distillation
Mono2Stereo: Monocular Knowledge Transfer for Enhanced Stereo Matching (Nov 14, 2024): Depth Estimation, Knowledge Distillation
More From Less: Self-Supervised Knowledge Distillation for Routine Histopathology Data (Mar 19, 2023): Knowledge Distillation
Motion Pyramid Networks for Accurate and Efficient Cardiac Motion Estimation (Jun 28, 2020): Knowledge Distillation, Motion Estimation
MoVE-KD: Knowledge Distillation for VLMs with Mixture of Visual Encoders (Jan 3, 2025): Knowledge Distillation, Mixture-of-Experts
MS-KD: Multi-Organ Segmentation with Multiple Binary-Labeled Datasets (Aug 5, 2021): Knowledge Distillation, Organ Segmentation