Explainable LLM-driven Multi-dimensional Distillation for E-Commerce Relevance Learning — Nov 20, 2024 — Knowledge Distillation, Large Language Model
Explaining Knowledge Distillation by Quantifying the Knowledge — Mar 7, 2020 — Knowledge Distillation
Explaining Knowledge Graph Embedding via Latent Rule Learning — Sep 29, 2021 — Graph Embedding, Knowledge Distillation
Explaining Sequence-Level Knowledge Distillation as Data-Augmentation for Neural Machine Translation — Dec 6, 2019 — Data Augmentation, Knowledge Distillation
Explicit and Implicit Knowledge Distillation via Unlabeled Data — Feb 17, 2023 — Data-free Knowledge Distillation, Knowledge Distillation
Explicit Connection Distillation — Jan 1, 2021 — Image Classification
Explicit Knowledge Transfer for Weakly-Supervised Code Generation — Nov 30, 2022 — Code Generation, Few-Shot Learning
Exploiting Knowledge Distillation for Few-Shot Image Generation — Sep 29, 2021 — Diversity, Image Generation
Exploiting Unlabelled Photos for Stronger Fine-Grained SBIR — Mar 24, 2023 — Image Retrieval, Knowledge Distillation
Exploring and Enhancing the Transfer of Distribution in Knowledge Distillation for Autoregressive Language Models — Sep 19, 2024 — Knowledge Distillation
Exploring compressibility of transformer based text-to-music (TTM) models — Jun 24, 2024 — Decoder, FAD
Exploring Dark Knowledge under Various Teacher Capacities and Addressing Capacity Mismatch — May 21, 2024 — Knowledge Distillation
Exploring Dual Model Knowledge Distillation for Anomaly Detection — Jun 27, 2023 — Anomaly Detection, Feature Selection
Exploring Extreme Quantization in Spiking Language Models — May 4, 2024 — Knowledge Distillation, Language Modeling
Exploring Knowledge Distillation of a Deep Neural Network for Multi-Script identification — Feb 20, 2021 — Knowledge Distillation, Transfer Learning
Fully Synthetic Data Improves Neural Machine Translation with Knowledge Distillation — Dec 31, 2020 — Knowledge Distillation, Machine Translation
Exploring Multi-Modal Contextual Knowledge for Open-Vocabulary Object Detection — Aug 30, 2023 — Knowledge Distillation, Language Modeling
Exploring Self- and Cross-Triplet Correlations for Human-Object Interaction Detection — Jan 11, 2024 — Human-Object Interaction Detection, Knowledge Distillation
A Note on Knowledge Distillation Loss Function for Object Classification — Sep 14, 2021 — Knowledge Distillation, Model Compression
Exploring the Limits of Simple Learners in Knowledge Distillation for Document Classification with DocBERT — Jul 1, 2020 — Document Classification, General Classification
Extending Label Smoothing Regularization with Self-Knowledge Distillation — Sep 11, 2020 — Knowledge Distillation, Self-Knowledge Distillation
Extracting General-use Transformers for Low-resource Languages via Knowledge Distillation — Jan 22, 2025 — Knowledge Distillation
Extracting knowledge from features with multilevel abstraction — Dec 4, 2021 — Data Augmentation, Knowledge Distillation
Extract then Distill: Efficient and Effective Task-Agnostic BERT Distillation — Apr 24, 2021 — Knowledge Distillation
Extracurricular Learning: Knowledge Transfer Beyond Empirical Distribution — Jun 30, 2020 — Image Classification, Knowledge Distillation
Extreme Compression for Pre-trained Transformers Made Simple and Efficient — Jun 4, 2022 — Knowledge Distillation, Quantization
Extreme compression of sentence-transformer ranker models: faster inference, longer battery life, and less storage on edge devices — Jun 29, 2022 — Dimensionality Reduction, Knowledge Distillation
Extremely Small BERT Models from Mixed-Vocabulary Training — Sep 25, 2019 — Knowledge Distillation, Language Modelling
Face to Cartoon Incremental Super-Resolution using Knowledge Distillation — Jan 27, 2024 — Hallucination, Incremental Learning
Factorized Distillation: Training Holistic Person Re-identification Model by Distilling an Ensemble of Partial ReID Models — Nov 20, 2018 — Knowledge Distillation, Person Re-Identification
Factorized RVQ-GAN For Disentangled Speech Tokenization — Jun 18, 2025 — Disentanglement, Knowledge Distillation
Factual Dialogue Summarization via Learning from Large Language Models — Jun 20, 2024 — Contrastive Learning, Data Augmentation
Selective Cross-Task Distillation — Apr 25, 2022 — Knowledge Distillation
Failure-Resilient Distributed Inference with Model Compression over Heterogeneous Edge Devices — Jun 20, 2024 — Knowledge Distillation, Model Compression
Fair Feature Distillation for Visual Recognition — May 27, 2021 — Fairness, Knowledge Distillation
Fair Feature Importance Scores for Interpreting Tree-Based Methods and Surrogates — Oct 6, 2023 — Fairness, Feature Importance
Fairly Predicting Graft Failure in Liver Transplant for Organ Assigning — Feb 18, 2023 — Fairness, Knowledge Distillation
Fairness Continual Learning Approach to Semantic Scene Understanding in Open-World Environments — May 25, 2023 — Continual Learning, Continual Semantic Segmentation
Fair Text to Medical Image Diffusion Model with Subgroup Distribution Aligned Tuning — Jun 21, 2024 — Knowledge Distillation
Faithful Knowledge Distillation — Jun 7, 2023 — Adversarial Robustness, Knowledge Distillation
Fake It Till Make It: Federated Learning with Consensus-Oriented Generation — Dec 10, 2023 — Federated Learning, Knowledge Distillation
Fall Detection using Knowledge Distillation Based Long short-term memory for Offline Embedded and Low Power Devices — Aug 24, 2023 — Knowledge Distillation, Time Series
False Negative Distillation and Contrastive Learning for Personalized Outfit Recommendation — Oct 13, 2021 — Contrastive Learning, Data Augmentation
FAN-Trans: Online Knowledge Distillation for Facial Action Unit Detection — Nov 11, 2022 — Action Unit Detection, Face Alignment
Fast and Efficient Once-For-All Networks for Diverse Hardware Deployment — Sep 29, 2021 — All, GPU
Fast and High-Performance Learned Image Compression With Improved Checkerboard Context Model, Deformable Residual Module, and Knowledge Distillation — Sep 5, 2023 — Image Compression, Knowledge Distillation
Fast DistilBERT on CPUs — Oct 27, 2022 — Knowledge Distillation, Model Compression
Fast End-to-end Coreference Resolution for Korean — Nov 1, 2020 — Coreference Resolution
FasterAI: A Lightweight Library for Creating Sparse Neural Networks — Jul 3, 2022 — Knowledge Distillation
Faster Inference of Integer SWIN Transformer by Removing the GELU Activation — Feb 2, 2024 — GPU, Image Classification