Local Correlation Consistency for Knowledge Distillation | Aug 1, 2020 | Knowledge Distillation
LoCa: Logit Calibration for Knowledge Distillation | Sep 7, 2024 | Image Classification
Locally Linear Region Knowledge Distillation | Oct 9, 2020 | Knowledge Distillation
Local-Selective Feature Distillation for Single Image Super-Resolution | Nov 22, 2021 | Image Super-Resolution, Knowledge Distillation
Local-to-Global Self-Supervised Representation Learning for Diabetic Retinopathy Grading | Oct 1, 2024 | Diabetic Retinopathy Grading, Image Classification
Local vs. Global: Local Land-Use and Land-Cover Models Deliver Higher Quality Maps | Dec 1, 2024 | Earth Observation, Knowledge Distillation
Logic Distillation: Learning from Code Function by Function for Planning and Decision-making | Jul 28, 2024 | Decision Making, Knowledge Distillation
Logits Poisoning Attack in Federated Distillation | Jan 8, 2024 | Federated Learning, Knowledge Distillation
LokiLM: Technical Report | Jul 10, 2024 | Knowledge Distillation, Language Modeling
Long Live the Lottery: The Existence of Winning Tickets in Lifelong Learning | Jan 1, 2021 | Class Incremental Learning
Long-Range Zero-Shot Generative Deep Network Quantization | Nov 13, 2022 | Knowledge Distillation, Quantization
Long-Tailed Continual Learning For Visual Food Recognition | Jul 1, 2023 | Continual Learning, Data Augmentation
Long-tailed Food Classification | Oct 26, 2022 | Classification, Data Augmentation
Hierarchical Knowledge Guided Learning for Real-world Retinal Diseases Recognition | Nov 17, 2021 | Knowledge Distillation
Long-Tailed Question Answering in an Open World | May 11, 2023 | Knowledge Distillation, Language Modelling
Long-Term Vehicle Localization by Recursive Knowledge Distillation | Apr 7, 2019 | Domain Adaptation, Ensemble Learning
LookALike: Human Mimicry based collaborative decision making | Mar 16, 2024 | Decision Making, Knowledge Distillation
Look Backward and Forward: Self-Knowledge Distillation with Bidirectional Decoder for Neural Machine Translation | Mar 10, 2022 | Decoder, Knowledge Distillation
Look One and More: Distilling Hybrid Order Relational Knowledge for Cross-Resolution Image Recognition | Sep 9, 2024 | Face Recognition, Image Classification
Lost in Distillation: A Case Study in Toxicity Modeling | Jul 1, 2022 | Knowledge Distillation
Low-Complexity Inference in Continual Learning via Compressed Knowledge Transfer | May 13, 2025 | Class Incremental Learning
Low-Dimensional Federated Knowledge Graph Embedding via Knowledge Distillation | Aug 11, 2024 | Graph Embedding, Knowledge Distillation
Low-Latency Incremental Text-to-Speech Synthesis with Distilled Context Prediction Network | Sep 22, 2021 | Knowledge Distillation, Language Modeling
Low-Resolution Chest X-ray Classification via Knowledge Distillation and Multi-task Learning | May 22, 2024 | Diagnostic, Knowledge Distillation
Low-resolution Face Recognition in the Wild via Selective Knowledge Distillation | Nov 25, 2018 | CPU, Face Model
Low-Resolution Face Recognition via Adaptable Instance-Relation Distillation | Sep 3, 2024 | Face Recognition, Knowledge Distillation
Low-Resolution Object Recognition with Cross-Resolution Relational Contrastive Distillation | Sep 4, 2024 | Face Recognition, Knowledge Distillation
Low Resource Causal Event Detection from Biomedical Literature | May 1, 2022 | Event Detection, Knowledge Distillation
Low-resource Low-footprint Wake-word Detection using Knowledge Distillation | Jul 6, 2022 | Knowledge Distillation, Speech Recognition
LRC-BERT: Latent-representation Contrastive Knowledge Distillation for Natural Language Understanding | Dec 14, 2020 | Contrastive Learning, Knowledge Distillation
LRSpeech: Extremely Low-Resource Speech Synthesis and Recognition | Aug 9, 2020 | Automatic Speech Recognition (ASR)
LTD: Low Temperature Distillation for Robust Adversarial Training | Nov 3, 2021 | Knowledge Distillation
M2KD: Multi-model and Multi-level Knowledge Distillation for Incremental Learning | Apr 3, 2019 | Incremental Learning, Knowledge Distillation
MadEye: Boosting Live Video Analytics Accuracy with Adaptive Camera Configurations | Apr 4, 2023 | Knowledge Distillation
Making Neural Machine Reading Comprehension Faster | Mar 29, 2019 | Knowledge Distillation, Machine Reading Comprehension
Making Small Language Models Better Few-Shot Learners | Nov 16, 2021 | Few-Shot Learning, Knowledge Distillation
Mamba base PKD for efficient knowledge compression | Mar 3, 2025 | Image Classification
MambaLiteSR: Image Super-Resolution with Low-Rank Mamba using Knowledge Distillation | Feb 19, 2025 | Image Super-Resolution, Knowledge Distillation
Many-to-One Knowledge Distillation of Real-Time Epileptic Seizure Detection for Low-Power Wearable Internet of Things Systems | Jul 20, 2022 | Edge Computing, Knowledge Distillation
MapDistill: Boosting Efficient Camera-based HD Map Construction via Camera-LiDAR Fusion Model Distillation | Jul 16, 2024 | Autonomous Driving, Knowledge Distillation
Map-Free Trajectory Prediction with Map Distillation and Hierarchical Encoding | Nov 17, 2024 | Autonomous Vehicles, Decoder
Marine Saliency Segmenter: Object-Focused Conditional Diffusion with Region-Level Semantic Knowledge Distillation | Apr 3, 2025 | Knowledge Distillation, Segmentation
Markowitz Meets Bellman: Knowledge-distilled Reinforcement Learning for Portfolio Management | May 8, 2024 | Knowledge Distillation, Management
Masked Autoencoders Are Stronger Knowledge Distillers | Jan 1, 2023 | Decoder, Knowledge Distillation
The Role of Masking for Efficient Supervised Knowledge Distillation of Vision Transformers | Feb 21, 2023 | Knowledge Distillation
Masked Modeling Duo for Speech: Specializing General-Purpose Audio Representation to Speech using Denoising Distillation | May 23, 2023 | Denoising, Knowledge Distillation
Matching Distributions between Model and Data: Cross-domain Knowledge Distillation for Unsupervised Domain Adaptation | Aug 1, 2021 | Cross-Domain Text Classification, Domain Adaptation
Maximizing Discrimination Capability of Knowledge Distillation with Energy Function | Nov 24, 2023 | Data Augmentation, Knowledge Distillation
Maximum Likelihood Distillation for Robust Modulation Classification | Nov 1, 2022 | Classification, Knowledge Distillation
MCF-VC: Mitigate Catastrophic Forgetting in Class-Incremental Learning for Multimodal Video Captioning | Feb 27, 2024 | Class Incremental Learning