Yield Evaluation of Citrus Fruits based on the YoloV5 compressed by Knowledge Distillation | Nov 16, 2022 | Knowledge Distillation
YOLO in the Dark - Domain Adaptation Method for Merging Multiple Models | Aug 1, 2020 | Domain Adaptation, Knowledge Distillation
You Can Have Your Data and Balance It Too: Towards Balanced and Efficient Multilingual Models | Oct 13, 2022 | Cross-Lingual Transfer, Knowledge Distillation
You Do Not Need Additional Priors or Regularizers in Retinex-Based Low-Light Image Enhancement | Jan 1, 2023 | Contrastive Learning, Image Enhancement
Zero shot framework for satellite image restoration | Jun 5, 2023 | Disentanglement, Image Restoration
Zero-shot Slot Filling in the Age of LLMs for Dialogue Systems | Nov 28, 2024 | Knowledge Distillation, Natural Language Understanding
Diverse Knowledge Distillation (DKD): A Solution for Improving The Robustness of Ensemble Models Against Adversarial Attacks | Jun 26, 2020 | Ensemble Learning, Image Classification
Learning Efficient Image Super-Resolution Networks via Structure-Regularized Pruning | Sep 29, 2021 | Image Super-Resolution, Knowledge Distillation
Learning Efficient Object Detection Models with Knowledge Distillation | Dec 1, 2017 | Knowledge Distillation, Model Compression
Learning from a Lightweight Teacher for Efficient Knowledge Distillation | May 19, 2020 | Knowledge Distillation
Learning From Biased Soft Labels | Feb 16, 2023 | Knowledge Distillation
Learning from deep model via exploring local targets | Jan 1, 2021 | Knowledge Distillation
Learning from Imperfect Data: Towards Efficient Knowledge Distillation of Autoregressive Language Models for Text-to-SQL | Oct 15, 2024 | Knowledge Distillation, Text to SQL
Learning from Matured Dumb Teacher for Fine Generalization | Aug 12, 2021 | Image Classification
Learning Human-Human Interactions in Images from Weak Textual Supervision | Apr 27, 2023 | Human-Human Interaction Recognition, Image Captioning
MixMix: All You Need for Data-Free Compression Are Feature and Data Mixing | Nov 19, 2020 | Knowledge Distillation
Learning Interpretation with Explainable Knowledge Distillation | Nov 12, 2021 | Knowledge Distillation, Model Compression
Learning Knowledge Representation with Meta Knowledge Distillation for Single Image Super-Resolution | Jul 18, 2022 | Image Super-Resolution, Knowledge Distillation
Learning Lightweight Object Detectors via Multi-Teacher Progressive Distillation | Aug 17, 2023 | Edge Computing, Instance Segmentation
Learning Lightweight Pedestrian Detector with Hierarchical Knowledge Distillation | Sep 20, 2019 | Knowledge Distillation, Pedestrian Detection
Learning Modality-agnostic Representation for Semantic Segmentation from Any Modalities | Jul 16, 2024 | Knowledge Distillation, Semantic Segmentation
Learning Student-Friendly Teacher Networks for Knowledge Distillation | Feb 12, 2021 | Knowledge Distillation, Transfer Learning
Learning Student Networks via Feature Embedding | Dec 17, 2018 | Knowledge Distillation
Learning Task-Agnostic Embedding of Multiple Black-Box Experts for Multi-Task Model Fusion | Jan 1, 2020 | Knowledge Distillation
Learning the Wrong Lessons: Inserting Trojans During Knowledge Distillation | Mar 9, 2023 | Knowledge Distillation
Learning Through Guidance: Knowledge Distillation for Endoscopic Image Classification | Aug 17, 2023 | Classification, Feature Engineering
Learning to Augment for Data-Scarce Domain BERT Knowledge Distillation | Jan 20, 2021 | Knowledge Distillation
Learning to Extract Attribute Value from Product via Question Answering: A Multi-task Approach | Aug 20, 2020 | Attribute Value Extraction
Learning to Project for Cross-Task Knowledge Distillation | Mar 21, 2024 | Depth Estimation, Knowledge Distillation
Learning to reconstruct signals with inexact sensing operator via knowledge distillation | Jan 18, 2025 | Knowledge Distillation
Learning to Retain while Acquiring: Combating Distribution-Shift in Adversarial Data-Free Knowledge Distillation | Feb 28, 2023 | Data-free Knowledge Distillation, Knowledge Distillation
Learning to Specialize with Knowledge Distillation for Visual Question Answering | Dec 1, 2018 | General Classification, General Knowledge
Learning to Teach with Student Feedback | Sep 10, 2021 | Knowledge Distillation
Learning to Teach with Student Feedback | Nov 16, 2021 | Knowledge Distillation
Learning ULMFiT and Self-Distillation with Calibration for Medical Dialogue System | Jul 20, 2021 | Decision Making, Knowledge Distillation
Learning Using Generated Privileged Information by Text-to-Image Diffusion Models | Sep 26, 2023 | Classification, Knowledge Distillation
Teaching What You Should Teach: A Data-Based Distillation Method | Dec 11, 2022 | Data Augmentation, Knowledge Distillation
Learning with Less: Knowledge Distillation from Large Language Models via Unlabeled Data | Nov 12, 2024 | Knowledge Distillation
Learn Spelling from Teachers: Transferring Knowledge from Language Models to Sequence-to-Sequence Speech Recognition | Jul 13, 2019 | Knowledge Distillation, Language Modeling
Learn to Talk via Proactive Knowledge Transfer | Aug 23, 2020 | de-en, Knowledge Distillation
Leave No Knowledge Behind During Knowledge Distillation: Towards Practical and Effective Knowledge Distillation for Code-Switching ASR Using Realistic Data | Jul 15, 2024 | Automatic Speech Recognition (ASR)
Legal-Tech Open Diaries: Lesson learned on how to develop and deploy light-weight models in the era of humongous Language Models | Oct 24, 2022 | Knowledge Distillation, Model Compression
LegoDNN: Block-grained Scaling of Deep Neural Networks for Mobile Vision | Dec 18, 2021 | Knowledge Distillation, Model Compression
LENS-XAI: Redefining Lightweight and Explainable Network Security through Knowledge Distillation and Variational Autoencoders for Scalable Intrusion Detection in Cybersecurity | Jan 1, 2025 | Computational Efficiency, Intrusion Detection
Less is More: Efficient Brain-Inspired Learning for Autonomous Driving Trajectory Prediction | Jul 9, 2024 | Autonomous Driving, Decision Making
Less or More From Teacher: Exploiting Trilateral Geometry For Knowledge Distillation | Dec 22, 2023 | Bilevel Optimization, Click-Through Rate Prediction
Let Video Teaches You More: Video-to-Image Knowledge Distillation using DEtection TRansformer for Medical Video Lesion Detection | Aug 26, 2024 | Knowledge Distillation, Lesion Detection
Letz Translate: Low-Resource Machine Translation for Luxembourgish | Mar 2, 2023 | Knowledge Distillation, Machine Translation
Leukocyte Classification using Multimodal Architecture Enhanced by Knowledge Distillation | Aug 17, 2022 | Classification, Knowledge Distillation
Leveraging Acoustic and Linguistic Embeddings from Pretrained speech and language Models for Intent Classification | Feb 15, 2021 | Classification, General Classification
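
Nearly every entry above is tagged Knowledge Distillation. For readers new to the technique, the sketch below shows the standard soft-label distillation loss from Hinton et al. (2015), which most of these papers build on: the student matches the teacher's temperature-softened output distribution while also fitting the hard labels. This is a minimal PyTorch sketch; the function name distillation_loss, the temperature T=4.0, and the mixing weight alpha=0.5 are illustrative choices, not values taken from any paper listed here.

    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
        # Soft-target term: KL divergence between the teacher's and student's
        # temperature-softened distributions, scaled by T^2 so its gradient
        # magnitude stays comparable to the hard-label term.
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=-1),
            F.softmax(teacher_logits / T, dim=-1),
            reduction="batchmean",
        ) * (T * T)
        # Hard-target term: ordinary cross-entropy against ground-truth labels.
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1.0 - alpha) * hard

    # Usage sketch: distill a frozen "teacher" into a smaller student.
    # Both models are stand-ins; in practice the teacher is a large
    # pretrained network and the student a compact one.
    teacher = torch.nn.Linear(32, 10).eval()
    student = torch.nn.Linear(32, 10)
    x = torch.randn(8, 32)
    y = torch.randint(0, 10, (8,))
    with torch.no_grad():
        t_logits = teacher(x)
    loss = distillation_loss(student(x), t_logits, y)
    loss.backward()

Variants in the list above change what is matched (features, local targets, trilateral geometry), who teaches (lightweight, multi-teacher, student-friendly teachers), or what data is available (data-free, unlabeled, imperfect), but this loss is the common starting point.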