Leveraging Advantages of Interactive and Non-Interactive Models for Vector-Based Cross-Lingual Information Retrieval (Nov 3, 2021) · Computational Efficiency · Cross-Lingual Information Retrieval
Leveraging Angular Distributions for Improved Knowledge Distillation (Feb 27, 2023) · Knowledge Distillation
Leveraging ASR Pretrained Conformers for Speaker Verification through Transfer Learning and Knowledge Distillation (Sep 6, 2023) · Knowledge Distillation · Speaker Verification
Leveraging Conditional Mutual Information to Improve Large Language Model Fine-Tuning For Classification (Feb 16, 2025) · Classification · Image Classification
Leveraging Different Learning Styles for Improved Knowledge Distillation in Biomedical Imaging (Dec 6, 2022) · Knowledge Distillation · Model Compression
Leveraging Expert Models for Training Deep Neural Networks in Scarce Data Domains: Application to Offline Handwritten Signature Verification (Aug 2, 2023) · Knowledge Distillation
FTSmartAudit: A Knowledge Distillation-Enhanced Framework for Automated Smart Contract Auditing Using Fine-Tuned LLMs (Oct 17, 2024) · Dataset Generation · Knowledge Distillation
Leveraging Foundation Models To learn the shape of semi-fluid deformable objects (Nov 25, 2024) · Knowledge Distillation · Object
Leveraging Knowledge Distillation for Lightweight Skin Cancer Classification: Balancing Accuracy and Computational Efficiency (Jun 24, 2024) · Cancer Classification · Computational Efficiency
Leveraging Large Language Models for Enhanced NLP Task Performance through Knowledge Distillation and Optimized Training Strategies (Feb 14, 2024) · Knowledge Distillation · Named Entity Recognition
Leveraging Recent Advances in Deep Learning for Audio-Visual Emotion Recognition (Mar 16, 2021) · Deep Learning · Emotion Recognition
Li3DeTr: A LiDAR based 3D Detection Transformer (Oct 27, 2022) · Autonomous Driving · Decoder
Life-Code: Central Dogma Modeling with Multi-Omics Sequence Unification (Feb 11, 2025) · Knowledge Distillation
Lifelong GAN: Continual Learning for Conditional Image Generation (Jul 23, 2019) · Conditional Image Generation · Continual Learning
Lifelong Intent Detection via Multi-Strategy Rebalancing (Aug 10, 2021) · Intent Detection · Knowledge Distillation
Life-long Learning for Multilingual Neural Machine Translation with Knowledge Distillation (Dec 6, 2022) · Knowledge Distillation · Machine Translation
Lifelong Learning for Neural powered Mixed Integer Programming (Aug 24, 2022) · Graph Attention · Knowledge Distillation
Lifelong Learning via Progressive Distillation and Retrospection (Sep 1, 2018) · Knowledge Distillation · Lifelong Learning
Lifelong Object Detection (Sep 2, 2020) · Knowledge Distillation · Lifelong Learning
Lifelong Person Search (Jul 31, 2024) · Knowledge Distillation · Person Search
Lifelong Twin Generative Adversarial Networks (Jul 9, 2021) · Knowledge Distillation
Lifelong Unsupervised Domain Adaptive Person Re-identification with Coordinated Anti-forgetting and Adaptation (Dec 13, 2021) · Domain Adaptive Person Re-Identification · Knowledge Distillation
LightBTSeg: A lightweight breast tumor segmentation model using ultrasound images via dual-path joint knowledge distillation (Nov 18, 2023) · Knowledge Distillation · Lesion Detection
Light distillation for Incremental Graph Convolution Collaborative Filtering (May 26, 2025) · Collaborative Filtering · Knowledge Distillation
LightPAFF: A Two-Stage Distillation Framework for Pre-training and Fine-tuning (Apr 27, 2020) · Knowledge Distillation · Language Modeling
LightVessel: Exploring Lightweight Coronary Artery Vessel Segmentation via Similarity Knowledge Distillation (Nov 2, 2022) · Decoder · Knowledge Distillation
Lightweight 3D Human Pose Estimation Network Training Using Teacher-Student Learning (Jan 15, 2020) · 3D Human Pose Estimation · 3D Pose Estimation
Lightweight Contrastive Distilled Hashing for Online Cross-modal Retrieval (Feb 27, 2025) · Cross-Modal Retrieval · Knowledge Distillation
Lightweight Modality Adaptation to Sequential Recommendation via Correlation Supervision (Jan 14, 2024) · Knowledge Distillation · Representation Learning
Lightweight Neural Network with Knowledge Distillation for CSI Feedback (Oct 31, 2022) · Knowledge Distillation
Lightweight Sound Event Detection Model with RepVGG Architecture (Nov 1, 2022) · Event Detection · Knowledge Distillation
Lightweight Task-Oriented Semantic Communication Empowered by Large-Scale AI Models (Jun 16, 2025) · Knowledge Distillation · Semantic Communication
Limitations of Knowledge Distillation for Zero-shot Transfer Learning (Nov 1, 2021) · CPU · Cross-Lingual Transfer
Linear Projections of Teacher Embeddings for Few-Class Distillation (Sep 30, 2024) · Binary Classification · Knowledge Distillation
Linkless Link Prediction via Relational Distillation (Oct 11, 2022) · Knowledge Distillation · Link Prediction
Lip-Listening: Mixing Senses to Understand Lips using Cross Modality Knowledge Distillation for Word-Based Models (Jun 5, 2022) · Knowledge Distillation · Lipreading
Lipschitz Continuity Guided Knowledge Distillation (Aug 29, 2021) · Knowledge Distillation · Model Compression
ListBERT: Learning to Rank E-commerce products with Listwise BERT (Jun 30, 2022) · Knowledge Distillation · Learning-To-Rank
LIT: Block-wise Intermediate Representation Training for Model Compression (Oct 2, 2018) · Knowledge Distillation · Model Compression
LiT: Delving into a Simplified Linear Diffusion Transformer for Image Generation (Jan 22, 2025) · Image Generation · Knowledge Distillation
LIX: Implicitly Infusing Spatial Geometric Prior Knowledge into Visual Semantic Segmentation for Autonomous Driving (Mar 13, 2024) · Autonomous Driving · Knowledge Distillation
Llama-Nemotron: Efficient Reasoning Models (May 2, 2025) · Knowledge Distillation · Neural Architecture Search
LLAVADI: What Matters For Multimodal Large Language Models Distillation (Jul 28, 2024) · Knowledge Distillation
LLaVA-Ultra: Large Chinese Language and Vision Assistant for Ultrasound (Oct 19, 2024) · Instruction Following · Knowledge Distillation
LLM-based Privacy Data Augmentation Guided by Knowledge Distillation with a Distribution Tutor for Medical Text Classification (Feb 26, 2024) · Data Augmentation · Knowledge Distillation
LLM Distillation for Efficient Few-Shot Multiple Choice Question Answering (Dec 13, 2024) · Few-Shot Learning · Knowledge Distillation
LLM-driven Knowledge Distillation for Dynamic Text-Attributed Graphs (Feb 15, 2025) · Edge Classification · Knowledge Distillation
LLM Pretraining with Continuous Concepts (Feb 12, 2025) · Knowledge Distillation · Language Modeling
LLM-RadJudge: Achieving Radiologist-Level Evaluation for X-Ray Report Generation (Apr 1, 2024) · Knowledge Distillation
LLMR: Knowledge Distillation with a Large Language Model-Induced Reward (Sep 19, 2024) · Dialogue Generation · Knowledge Distillation