Lightweight 3D Human Pose Estimation Network Training Using Teacher-Student Learning (Jan 15, 2020): 3D Human Pose Estimation, 3D Pose Estimation
Lightweight Contrastive Distilled Hashing for Online Cross-modal Retrieval (Feb 27, 2025): Cross-Modal Retrieval, Knowledge Distillation
Lightweight Modality Adaptation to Sequential Recommendation via Correlation Supervision (Jan 14, 2024): Knowledge Distillation, Representation Learning
Lightweight Neural Network with Knowledge Distillation for CSI Feedback (Oct 31, 2022): Knowledge Distillation
Lightweight Sound Event Detection Model with RepVGG Architecture (Nov 1, 2022): Event Detection, Knowledge Distillation
Lightweight Task-Oriented Semantic Communication Empowered by Large-Scale AI Models (Jun 16, 2025): Knowledge Distillation, Semantic Communication
Limitations of Knowledge Distillation for Zero-shot Transfer Learning (Nov 1, 2021): CPU, Cross-Lingual Transfer
Linear Projections of Teacher Embeddings for Few-Class Distillation (Sep 30, 2024): Binary Classification, Knowledge Distillation
Linkless Link Prediction via Relational Distillation (Oct 11, 2022): Knowledge Distillation, Link Prediction
Lip-Listening: Mixing Senses to Understand Lips using Cross Modality Knowledge Distillation for Word-Based Models (Jun 5, 2022): Knowledge Distillation, Lipreading
Lipschitz Continuity Guided Knowledge Distillation (Aug 29, 2021): Knowledge Distillation, Model Compression
ListBERT: Learning to Rank E-commerce products with Listwise BERT (Jun 30, 2022): Knowledge Distillation, Learning-To-Rank
LIT: Block-wise Intermediate Representation Training for Model Compression (Oct 2, 2018): Knowledge Distillation, Model Compression
LiT: Delving into a Simplified Linear Diffusion Transformer for Image Generation (Jan 22, 2025): Image Generation, Knowledge Distillation
LIX: Implicitly Infusing Spatial Geometric Prior Knowledge into Visual Semantic Segmentation for Autonomous Driving (Mar 13, 2024): Autonomous Driving, Knowledge Distillation
Llama-Nemotron: Efficient Reasoning Models (May 2, 2025): Knowledge Distillation, Neural Architecture Search
LLAVADI: What Matters For Multimodal Large Language Models Distillation (Jul 28, 2024): Knowledge Distillation
LLaVA-Ultra: Large Chinese Language and Vision Assistant for Ultrasound (Oct 19, 2024): Instruction Following, Knowledge Distillation
LLM-based Privacy Data Augmentation Guided by Knowledge Distillation with a Distribution Tutor for Medical Text Classification (Feb 26, 2024): Data Augmentation, Knowledge Distillation
LLM Distillation for Efficient Few-Shot Multiple Choice Question Answering (Dec 13, 2024): Few-Shot Learning, Knowledge Distillation
LLM-driven Knowledge Distillation for Dynamic Text-Attributed Graphs (Feb 15, 2025): Edge Classification, Knowledge Distillation
LLM Pretraining with Continuous Concepts (Feb 12, 2025): Knowledge Distillation, Language Modeling
LLM-RadJudge: Achieving Radiologist-Level Evaluation for X-Ray Report Generation (Apr 1, 2024): Knowledge Distillation
LLMR: Knowledge Distillation with a Large Language Model-Induced Reward (Sep 19, 2024): Dialogue Generation, Knowledge Distillation
Local Correlation Consistency for Knowledge Distillation (Aug 1, 2020): Knowledge Distillation
LoCa: Logit Calibration for Knowledge Distillation (Sep 7, 2024): Image Classification
Locally Linear Region Knowledge Distillation (Oct 9, 2020): Knowledge Distillation
Local-Selective Feature Distillation for Single Image Super-Resolution (Nov 22, 2021): Image Super-Resolution, Knowledge Distillation
Local-to-Global Self-Supervised Representation Learning for Diabetic Retinopathy Grading (Oct 1, 2024): Diabetic Retinopathy Grading, Image Classification
Local vs. Global: Local Land-Use and Land-Cover Models Deliver Higher Quality Maps (Dec 1, 2024): Earth Observation, Knowledge Distillation
Logic Distillation: Learning from Code Function by Function for Planning and Decision-making (Jul 28, 2024): Decision Making, Knowledge Distillation
Logits Poisoning Attack in Federated Distillation (Jan 8, 2024): Federated Learning, Knowledge Distillation
LokiLM: Technical Report (Jul 10, 2024): Knowledge Distillation, Language Modeling
Long Live the Lottery: The Existence of Winning Tickets in Lifelong Learning (Jan 1, 2021): Class Incremental Learning
Long-Range Zero-Shot Generative Deep Network Quantization (Nov 13, 2022): Knowledge Distillation, Quantization
Long-Tailed Continual Learning For Visual Food Recognition (Jul 1, 2023): Continual Learning, Data Augmentation
Long-tailed Food Classification (Oct 26, 2022): Classification, Data Augmentation
Hierarchical Knowledge Guided Learning for Real-world Retinal Diseases Recognition (Nov 17, 2021): Knowledge Distillation
Long-Tailed Question Answering in an Open World (May 11, 2023): Knowledge Distillation, Language Modelling
Long-Term Vehicle Localization by Recursive Knowledge Distillation (Apr 7, 2019): Domain Adaptation, Ensemble Learning
LookALike: Human Mimicry based collaborative decision making (Mar 16, 2024): Decision Making, Knowledge Distillation
Look Backward and Forward: Self-Knowledge Distillation with Bidirectional Decoder for Neural Machine Translation (Mar 10, 2022): Decoder, Knowledge Distillation
Look One and More: Distilling Hybrid Order Relational Knowledge for Cross-Resolution Image Recognition (Sep 9, 2024): Face Recognition, Image Classification
Lost in Distillation: A Case Study in Toxicity Modeling (Jul 1, 2022): Knowledge Distillation
Low-Complexity Inference in Continual Learning via Compressed Knowledge Transfer (May 13, 2025): Class Incremental Learning
Low-Dimensional Federated Knowledge Graph Embedding via Knowledge Distillation (Aug 11, 2024): Graph Embedding, Knowledge Distillation
Low-Latency Incremental Text-to-Speech Synthesis with Distilled Context Prediction Network (Sep 22, 2021): Knowledge Distillation, Language Modeling
Low-Resolution Chest X-ray Classification via Knowledge Distillation and Multi-task Learning (May 22, 2024): Diagnostic, Knowledge Distillation
Low-resolution Face Recognition in the Wild via Selective Knowledge Distillation (Nov 25, 2018): CPU, Face Model
Low-Resolution Face Recognition via Adaptable Instance-Relation Distillation (Sep 3, 2024): Face Recognition, Knowledge Distillation