Label Semantic Knowledge Distillation for Unbiased Scene Graph Generation (Aug 7, 2022) [Graph Generation, Knowledge Distillation]
LadaBERT: Lightweight Adaptation of BERT through Hybrid Model Compression (Apr 8, 2020) [Blocking, Knowledge Distillation]
LaDiMo: Layer-wise Distillation Inspired MoEfier (Aug 8, 2024) [Knowledge Distillation, Mixture-of-Experts]
LAKD-Activation Mapping Distillation Based on Local Learning (Aug 21, 2024) [Knowledge Distillation]
LAMeTA: Intent-Aware Agentic Network Optimization via a Large AI Model-Empowered Two-Stage Approach (May 18, 2025) [Deep Reinforcement Learning, Knowledge Distillation]
Language Graph Distillation for Low-Resource Machine Translation (Aug 17, 2019) [Knowledge Distillation, Machine Translation]
Language Modelling via Learning to Rank (Oct 13, 2021) [Knowledge Distillation, Language Modelling]
Language-Oriented Communication with Semantic Coding and Knowledge Distillation for Text-to-Image Generation (Sep 20, 2023) [Image Generation, In-Context Learning]
LAPTOP-Diff: Layer Pruning and Normalized Distillation for Compressing Diffusion Models (Apr 17, 2024) [Knowledge Distillation]
Just CHOP: Embarrassingly Simple LLM Compression (May 24, 2023) [Knowledge Distillation, Language Modeling]
Large Language Model Guided Knowledge Distillation for Time Series Anomaly Detection (Jan 26, 2024) [Anomaly Detection, Knowledge Distillation]
Large Language Model Meets Graph Neural Network in Knowledge Distillation (Feb 8, 2024) [Contrastive Learning, Graph Attention]
Large Model for Small Data: Foundation Model for Cross-Modal RF Human Activity Recognition (Oct 13, 2024) [Activity Recognition, Few-Shot Learning]
Large-Scale Generative Data-Free Distillation (Dec 10, 2020) [Knowledge Distillation, Model Compression]
LaSNN: Layer-wise ANN-to-SNN Distillation for Effective and Efficient Training in Deep Spiking Neural Networks (Apr 17, 2023) [Knowledge Distillation]
Layer Attack Unlearning: Fast and Accurate Machine Unlearning via Layer Level Attack and Knowledge Distillation (Dec 28, 2023) [Knowledge Distillation, Machine Unlearning]
LayerCollapse: Adaptive Compression of Neural Networks (Nov 29, 2023) [Computational Efficiency, Image Classification]
Layer Importance for Mathematical Reasoning is Forged in Pre-Training and Invariant after Post-Training (Jun 27, 2025) [Knowledge Distillation, Mathematical Reasoning]
Layerwise Bregman Representation Learning with Applications to Knowledge Distillation (Sep 15, 2022) [Knowledge Distillation, Representation Learning]
Noisy Data Meets Privacy: Training Local Models with Post-Processed Remote Queries (May 25, 2024) [Knowledge Distillation, Model Extraction]
LEAD: Liberal Feature-based Distillation for Dense Retrieval (Dec 10, 2022) [Document Ranking, Knowledge Distillation]
LEALLA: Learning Lightweight Language-agnostic Sentence Embeddings with Knowledge Distillation (Feb 16, 2023) [Knowledge Distillation, Sentence]
Learnable Cross-modal Knowledge Distillation for Multi-modal Learning with Missing Modality (Oct 2, 2023) [Knowledge Distillation]
Learn from Balance: Rectifying Knowledge Transfer for Long-Tailed Scenarios (Sep 12, 2024) [Knowledge Distillation, Transfer Learning]
Learn From the Past: Experience Ensemble Knowledge Distillation (Feb 25, 2022) [Knowledge Distillation, Transfer Learning]
Learning an Augmented RGB Representation with Cross-Modal Knowledge Distillation for Action Detection (Aug 8, 2021) [Action Detection, Knowledge Distillation]
Learning Background Prompts to Discover Implicit Knowledge for Open Vocabulary Object Detection (Jun 1, 2024) [Knowledge Distillation, Object]
Learning Bayesian Sparse Networks with Full Experience Replay for Continual Learning (Feb 21, 2022) [Continual Learning, Knowledge Distillation]
Learning by Distillation: A Self-Supervised Learning Framework for Optical Flow Estimation (Jun 8, 2021) [Knowledge Distillation, Optical Flow Estimation]
Learning Cross-Lingual IR from an English Retriever (Jan 16, 2022) [Cross-Lingual Information Retrieval, Information Retrieval]
Diverse Knowledge Distillation (DKD): A Solution for Improving The Robustness of Ensemble Models Against Adversarial Attacks (Jun 26, 2020) [Ensemble Learning, Image Classification]
Learning Efficient Image Super-Resolution Networks via Structure-Regularized Pruning (Sep 29, 2021) [Image Super-Resolution, Knowledge Distillation]
Learning Efficient Object Detection Models with Knowledge Distillation (Dec 1, 2017) [Knowledge Distillation, Model Compression]
Learning from a Lightweight Teacher for Efficient Knowledge Distillation (May 19, 2020) [Knowledge Distillation]
Learning From Biased Soft Labels (Feb 16, 2023) [Knowledge Distillation]
Learning from deep model via exploring local targets (Jan 1, 2021) [Knowledge Distillation, Model]
Learning from Imperfect Data: Towards Efficient Knowledge Distillation of Autoregressive Language Models for Text-to-SQL (Oct 15, 2024) [Knowledge Distillation, Text to SQL]
Learning from Matured Dumb Teacher for Fine Generalization (Aug 12, 2021) [Image Classification]
Learning Human-Human Interactions in Images from Weak Textual Supervision (Apr 27, 2023) [Human-Human Interaction Recognition, Image Captioning]
MixMix: All You Need for Data-Free Compression Are Feature and Data Mixing (Nov 19, 2020) [Knowledge Distillation]
Learning Interpretation with Explainable Knowledge Distillation (Nov 12, 2021) [Knowledge Distillation, Model Compression]
Learning Knowledge Representation with Meta Knowledge Distillation for Single Image Super-Resolution (Jul 18, 2022) [Image Super-Resolution, Knowledge Distillation]
Learning Lightweight Object Detectors via Multi-Teacher Progressive Distillation (Aug 17, 2023) [Edge Computing, Instance Segmentation]
Learning Lightweight Pedestrian Detector with Hierarchical Knowledge Distillation (Sep 20, 2019) [Knowledge Distillation, Pedestrian Detection]
Learning Modality-agnostic Representation for Semantic Segmentation from Any Modalities (Jul 16, 2024) [Knowledge Distillation, Semantic Segmentation]
Learning Student-Friendly Teacher Networks for Knowledge Distillation (Feb 12, 2021) [Knowledge Distillation, Transfer Learning]
Learning Student Networks via Feature Embedding (Dec 17, 2018) [Knowledge Distillation]
Learning Task-Agnostic Embedding of Multiple Black-Box Experts for Multi-Task Model Fusion (Jan 1, 2020) [Knowledge Distillation]
Learning the Wrong Lessons: Inserting Trojans During Knowledge Distillation (Mar 9, 2023) [Knowledge Distillation]
Learning Through Guidance: Knowledge Distillation for Endoscopic Image Classification (Aug 17, 2023) [Classification, Feature Engineering]
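Most of the entries above build on the classic soft-label distillation objective of Hinton et al. (2015): a student is trained to match the teacher's temperature-softened output distribution alongside the usual hard-label loss. The sketch below is a minimal, generic reference for that shared objective, not the method of any specific paper in this list; the names distillation_loss, TEMPERATURE, and ALPHA are illustrative placeholders.

```python
# Minimal sketch of the standard soft-label distillation loss
# (temperature-scaled KL term plus hard-label cross-entropy).
# TEMPERATURE and ALPHA are illustrative defaults, not values
# taken from any paper listed above.
import torch
import torch.nn.functional as F

TEMPERATURE = 4.0   # softens both teacher and student logits
ALPHA = 0.5         # weight between soft (teacher) and hard (label) terms

def distillation_loss(student_logits, teacher_logits, labels):
    # Soft targets: KL divergence between temperature-scaled distributions,
    # scaled by T^2 so gradient magnitudes stay comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / TEMPERATURE, dim=-1),
        F.softmax(teacher_logits / TEMPERATURE, dim=-1),
        reduction="batchmean",
    ) * (TEMPERATURE ** 2)
    # Hard targets: ordinary cross-entropy against ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return ALPHA * soft + (1.0 - ALPHA) * hard
```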