Learning to Augment for Data-Scarce Domain BERT Knowledge Distillation (Jan 20, 2021) · Knowledge Distillation
Learning to Extract Attribute Value from Product via Question Answering: A Multi-task Approach (Aug 20, 2020) · Attribute, Attribute Value Extraction
Learning to Project for Cross-Task Knowledge Distillation (Mar 21, 2024) · Depth Estimation, Knowledge Distillation
Learning to reconstruct signals with inexact sensing operator via knowledge distillation (Jan 18, 2025) · Knowledge Distillation
Learning to Retain while Acquiring: Combating Distribution-Shift in Adversarial Data-Free Knowledge Distillation (Feb 28, 2023) · Data-free Knowledge Distillation, Knowledge Distillation
Learning to Specialize with Knowledge Distillation for Visual Question Answering (Dec 1, 2018) · General Classification, General Knowledge
Learning to Teach with Student Feedback (Sep 10, 2021) · Knowledge Distillation
Learning to Teach with Student Feedback (Nov 16, 2021) · Knowledge Distillation
Learning ULMFiT and Self-Distillation with Calibration for Medical Dialogue System (Jul 20, 2021) · Decision Making, Knowledge Distillation
Learning Using Generated Privileged Information by Text-to-Image Diffusion Models (Sep 26, 2023) · Classification, Knowledge Distillation
Teaching What You Should Teach: A Data-Based Distillation Method (Dec 11, 2022) · Data Augmentation, Knowledge Distillation
Learning with Less: Knowledge Distillation from Large Language Models via Unlabeled Data (Nov 12, 2024) · Knowledge Distillation
Learn Spelling from Teachers: Transferring Knowledge from Language Models to Sequence-to-Sequence Speech Recognition (Jul 13, 2019) · Knowledge Distillation, Language Modeling
Learn to Talk via Proactive Knowledge Transfer (Aug 23, 2020) · de-en, Knowledge Distillation
Leave No Knowledge Behind During Knowledge Distillation: Towards Practical and Effective Knowledge Distillation for Code-Switching ASR Using Realistic Data (Jul 15, 2024) · Automatic Speech Recognition, Automatic Speech Recognition (ASR)
Legal-Tech Open Diaries: Lesson learned on how to develop and deploy light-weight models in the era of humongous Language Models (Oct 24, 2022) · Knowledge Distillation, Model Compression
LegoDNN: Block-grained Scaling of Deep Neural Networks for Mobile Vision (Dec 18, 2021) · Knowledge Distillation, Model Compression
LENS-XAI: Redefining Lightweight and Explainable Network Security through Knowledge Distillation and Variational Autoencoders for Scalable Intrusion Detection in Cybersecurity (Jan 1, 2025) · Computational Efficiency, Intrusion Detection
Less is More: Efficient Brain-Inspired Learning for Autonomous Driving Trajectory Prediction (Jul 9, 2024) · Autonomous Driving, Decision Making
Less or More From Teacher: Exploiting Trilateral Geometry For Knowledge Distillation (Dec 22, 2023) · Bilevel Optimization, Click-Through Rate Prediction
Let Video Teaches You More: Video-to-Image Knowledge Distillation using DEtection TRansformer for Medical Video Lesion Detection (Aug 26, 2024) · Knowledge Distillation, Lesion Detection
Letz Translate: Low-Resource Machine Translation for Luxembourgish (Mar 2, 2023) · Knowledge Distillation, Machine Translation
Leukocyte Classification using Multimodal Architecture Enhanced by Knowledge Distillation (Aug 17, 2022) · Classification, Knowledge Distillation
Leveraging Acoustic and Linguistic Embeddings from Pretrained speech and language Models for Intent Classification (Feb 15, 2021) · Classification, General Classification
Leveraging Advantages of Interactive and Non-Interactive Models for Vector-Based Cross-Lingual Information Retrieval (Nov 3, 2021) · Computational Efficiency, Cross-Lingual Information Retrieval
Leveraging Angular Distributions for Improved Knowledge Distillation (Feb 27, 2023) · Knowledge Distillation
Leveraging ASR Pretrained Conformers for Speaker Verification through Transfer Learning and Knowledge Distillation (Sep 6, 2023) · Knowledge Distillation, Speaker Verification
Leveraging Conditional Mutual Information to Improve Large Language Model Fine-Tuning For Classification (Feb 16, 2025) · Classification, image-classification
Leveraging Different Learning Styles for Improved Knowledge Distillation in Biomedical Imaging (Dec 6, 2022) · Knowledge Distillation, Model Compression
Leveraging Expert Models for Training Deep Neural Networks in Scarce Data Domains: Application to Offline Handwritten Signature Verification (Aug 2, 2023) · Knowledge Distillation
FTSmartAudit: A Knowledge Distillation-Enhanced Framework for Automated Smart Contract Auditing Using Fine-Tuned LLMs (Oct 17, 2024) · Dataset Generation, Knowledge Distillation
Leveraging Foundation Models To learn the shape of semi-fluid deformable objects (Nov 25, 2024) · Knowledge Distillation, Object
Leveraging Knowledge Distillation for Lightweight Skin Cancer Classification: Balancing Accuracy and Computational Efficiency (Jun 24, 2024) · Cancer Classification, Computational Efficiency
Leveraging Large Language Models for Enhanced NLP Task Performance through Knowledge Distillation and Optimized Training Strategies (Feb 14, 2024) · Knowledge Distillation, named-entity-recognition
Leveraging Recent Advances in Deep Learning for Audio-Visual Emotion Recognition (Mar 16, 2021) · Deep Learning, Emotion Recognition
Li3DeTr: A LiDAR based 3D Detection Transformer (Oct 27, 2022) · Autonomous Driving, Decoder
Life-Code: Central Dogma Modeling with Multi-Omics Sequence Unification (Feb 11, 2025) · Knowledge Distillation
Lifelong GAN: Continual Learning for Conditional Image Generation (Jul 23, 2019) · Conditional Image Generation, Continual Learning
Lifelong Intent Detection via Multi-Strategy Rebalancing (Aug 10, 2021) · Intent Detection, Knowledge Distillation
Life-long Learning for Multilingual Neural Machine Translation with Knowledge Distillation (Dec 6, 2022) · Knowledge Distillation, Machine Translation
Lifelong Learning for Neural powered Mixed Integer Programming (Aug 24, 2022) · Graph Attention, Knowledge Distillation
Lifelong Learning via Progressive Distillation and Retrospection (Sep 1, 2018) · Knowledge Distillation, Lifelong learning
Lifelong Object Detection (Sep 2, 2020) · Knowledge Distillation, Lifelong learning
Lifelong Person Search (Jul 31, 2024) · Knowledge Distillation, Person Search
Lifelong Twin Generative Adversarial Networks (Jul 9, 2021) · Knowledge Distillation
Lifelong Unsupervised Domain Adaptive Person Re-identification with Coordinated Anti-forgetting and Adaptation (Dec 13, 2021) · Domain Adaptive Person Re-Identification, Knowledge Distillation
LightBTSeg: A lightweight breast tumor segmentation model using ultrasound images via dual-path joint knowledge distillation (Nov 18, 2023) · Knowledge Distillation, Lesion Detection
Light distillation for Incremental Graph Convolution Collaborative Filtering (May 26, 2025) · Collaborative Filtering, Knowledge Distillation
LightPAFF: A Two-Stage Distillation Framework for Pre-training and Fine-tuning (Apr 27, 2020) · Knowledge Distillation, Language Modeling
LightVessel: Exploring Lightweight Coronary Artery Vessel Segmentation via Similarity Knowledge Distillation (Nov 2, 2022) · Decoder, Knowledge Distillation