Model Compression for Resource-Constrained Mobile Robots — Jul 20, 2022 — Knowledge Distillation, Model
Model Compression Methods for YOLOv5: A Review — Jul 21, 2023 — Knowledge Distillation, Model
Model compression using knowledge distillation with integrated gradients — Jun 17, 2025 — Data Augmentation, Knowledge Distillation
Model Compression Using Optimal Transport — Dec 7, 2020 — Image Classification
Model Compression with Multi-Task Knowledge Distillation for Web-scale Question Answering System — Apr 21, 2019 — Knowledge Distillation, Model Compression
Model Compression with Two-stage Multi-teacher Knowledge Distillation for Web Question Answering System — Oct 18, 2019 — General Knowledge, Knowledge Distillation
Model Distillation for Faithful Explanations of Medical Code Predictions — May 1, 2022 — Decision Making, Knowledge Distillation
Model Distillation with Knowledge Transfer from Face Classification to Alignment and Verification — Sep 9, 2017 — Classification, Face Recognition
On Cross-Layer Alignment for Model Fusion of Heterogeneous Neural Networks — Oct 29, 2021 — Knowledge Distillation, Model Compression
A Light-weight Deep Human Activity Recognition Algorithm Using Multi-knowledge Distillation — Jul 6, 2021 — Activity Recognition, Classification
Modeling Teacher-Student Techniques in Deep Neural Networks for Knowledge Distillation — Dec 31, 2019 — Knowledge Distillation
Model Mimic Attack: Knowledge Distillation for Provably Transferable Adversarial Examples — Oct 21, 2024 — Knowledge Distillation
Out of Thin Air: Exploring Data-Free Adversarial Robustness Distillation — Mar 21, 2023 — Adversarial Robustness, Knowledge Distillation
Model Stitching by Functional Latent Alignment — May 26, 2025 — Knowledge Distillation, Model
Modifying Final Splits of Classification Tree for Fine-tuning Subpopulation Target in Policy Making — Feb 20, 2025 — Knowledge Distillation
Modular Transformers: Compressing Transformers into Modularized Layers for Flexible Efficient Inference — Jun 4, 2023 — Decoder, Knowledge Distillation
MoEBERT: from BERT to Mixture-of-Experts via Importance-Guided Adaptation — Jan 16, 2022 — Knowledge Distillation, Mixture-of-Experts
MoE-Pruner: Pruning Mixture-of-Experts Large Language Model using the Hints from Its Router — Oct 15, 2024 — Knowledge Distillation, Language Modeling
MoKD: Multi-Task Optimization for Knowledge Distillation — May 13, 2025 — Image Classification
MoLe-VLA: Dynamic Layer-skipping Vision Language Action Model via Mixture-of-Layers for Efficient Robot Manipulation — Mar 26, 2025 — Knowledge Distillation, Mixture-of-Experts
Momentum Adversarial Distillation: Handling Large Distribution Shifts in Data-Free Knowledge Distillation — Sep 21, 2022 — Data-free Knowledge Distillation, Knowledge Distillation
Mono2Stereo: Monocular Knowledge Transfer for Enhanced Stereo Matching — Nov 14, 2024 — Depth Estimation, Knowledge Distillation
More From Less: Self-Supervised Knowledge Distillation for Routine Histopathology Data — Mar 19, 2023 — Knowledge Distillation
Motion Pyramid Networks for Accurate and Efficient Cardiac Motion Estimation — Jun 28, 2020 — Knowledge Distillation, Motion Estimation
MoVE-KD: Knowledge Distillation for VLMs with Mixture of Visual Encoders — Jan 3, 2025 — Knowledge Distillation, Mixture-of-Experts
MS-KD: Multi-Organ Segmentation with Multiple Binary-Labeled Datasets — Aug 5, 2021 — Knowledge Distillation, Organ Segmentation
MT2KD: Towards A General-Purpose Encoder for Speech, Speaker, and Audio Events — Sep 25, 2024 — Audio Tagging, Automatic Speech Recognition
MTKD: Multi-Teacher Knowledge Distillation for Image Super-Resolution — Apr 15, 2024 — Image Super-Resolution, Knowledge Distillation
MulDE: Multi-teacher Knowledge Distillation for Low-dimensional Knowledge Graph Embeddings — Oct 14, 2020 — Graph Embedding, Knowledge Distillation
Multi-adversarial Faster-RCNN with Paradigm Teacher for Unrestricted Object Detection — Dec 11, 2022 — Domain Adaptation, Knowledge Distillation
Multi-Branch Mutual-Distillation Transformer for EEG-Based Seizure Subtype Classification — Dec 4, 2024 — Electroencephalogram (EEG)
Multi-Channel Multi-Domain based Knowledge Distillation Algorithm for Sleep Staging with Single-Channel EEG — Jan 7, 2024 — EEG, Knowledge Distillation
Cultural Commonsense Knowledge for Intercultural Dialogues — Feb 16, 2024 — Knowledge Distillation, Specificity
Multi-Document Financial Question Answering using LLMs — Nov 8, 2024 — Knowledge Distillation, Knowledge Graphs
Multi-Frame Self-Supervised Depth Estimation with Multi-Scale Feature Fusion in Dynamic Scenes — Mar 26, 2023 — Depth Estimation, Knowledge Distillation
Multi-Frame to Single-Frame: Knowledge Distillation for 3D Object Detection — Sep 24, 2020 — 3D Object Detection, Autonomous Driving
Multi-Grained Knowledge Distillation for Named Entity Recognition — Jun 1, 2021 — Knowledge Distillation, Named Entity Recognition
Multi-Granularity Contrastive Knowledge Distillation for Multimodal Named Entity Recognition — Nov 16, 2021 — Knowledge Distillation, Multi-modal Named Entity Recognition
Multi-Granularity Semantic Revision for Large Language Model Distillation — Jul 14, 2024 — Knowledge Distillation, Language Modeling
Multi-head Knowledge Distillation for Model Compression — Dec 5, 2020 — Image Classification
Multi-label Class Incremental Emotion Decoding with Augmented Emotional Semantics Learning — May 31, 2024 — Class Incremental Learning
Multi-label Contrastive Predictive Coding — Jul 20, 2020 — Knowledge Distillation, Multi-class Classification
Multi-label Emotion Analysis in Conversation via Multimodal Knowledge Distillation — Oct 27, 2023 — Emotion Recognition, Knowledge Distillation
Multi-level Distillation of Semantic Knowledge for Pre-training Multilingual Language Model — Nov 2, 2022 — Knowledge Distillation, Language Modeling
Multilingual Neural Machine Translation: Can Linguistic Hierarchies Help? — Oct 15, 2021 — Knowledge Distillation, Machine Translation
Multi-MLLM Knowledge Distillation for Out-of-Context News Detection — May 28, 2025 — Knowledge Distillation, Misinformation
Multimodal Commonsense Knowledge Distillation for Visual Question Answering — Nov 5, 2024 — Knowledge Distillation, Question Answering
Multi-modal Cross-domain Self-supervised Pre-training for fMRI and EEG Fusion — Sep 27, 2024 — Data Augmentation, EEG
Multi-Modal Few-Shot Object Detection with Meta-Learning-Based Cross-Modal Prompting — Apr 16, 2022 — Few-Shot Learning, Few-Shot Object Detection