Enhancing Systematic Decompositional Natural Language Inference Using Informal Logic (Feb 22, 2024) · Formal Logic, Knowledge Distillation
Ensemble Knowledge Distillation for CTR Prediction (Nov 8, 2020) · Click-Through Rate Prediction, Knowledge Distillation
Ensemble Distillation for Neural Machine Translation (Feb 6, 2017) · Knowledge Distillation, Machine Translation
Ensemble Knowledge Distillation for Machine Learning Interatomic Potentials (Mar 18, 2025) · Atomic Forces, Knowledge Distillation
Ensemble knowledge distillation of self-supervised speech models (Feb 24, 2023) · Automatic Speech Recognition (ASR)
Ensembling of Distilled Models from Multi-task Teachers for Constrained Resource Language Pairs (Nov 26, 2021) · Knowledge Distillation, Translation
EnSiam: Self-Supervised Learning With Ensemble Representations (May 22, 2023) · Contrastive Learning, Knowledge Distillation
Entire-Space Variational Information Exploitation for Post-Click Conversion Rate Prediction (Dec 17, 2024) · Knowledge Distillation, Recommendation Systems
EPIK: Eliminating multi-model Pipelines with Knowledge-distillation (Nov 27, 2022) · Knowledge Distillation, Transliteration
EPSD: Early Pruning with Self-Distillation for Efficient Model Compression (Jan 31, 2024) · Knowledge Distillation, Model Compression
ERNIE-Search: Bridging Cross-Encoder with Dual-Encoder via Self On-the-fly Distillation for Dense Passage Retrieval (May 18, 2022) · Knowledge Distillation, Open-Domain Question Answering
Error Exponent in Agnostic PAC Learning (May 1, 2024) · Binary Classification, Knowledge Distillation
ESLM: Risk-Averse Selective Language Modeling for Efficient Pretraining (May 26, 2025) · Knowledge Distillation, Language Modeling
ESPnet How2 Speech Translation System for IWSLT 2019: Pre-training, Knowledge Distillation, and Going Deeper (Nov 1, 2019) · Knowledge Distillation
ESPnet-ST IWSLT 2021 Offline Speech Translation System (Jul 1, 2021) · Decoder, Knowledge Distillation
Essence Knowledge Distillation for Speech Recognition (Jun 26, 2019) · Knowledge Distillation, Speech Recognition
Estimating and Maximizing Mutual Information for Knowledge Distillation (Oct 29, 2021) · Knowledge Distillation
Estimating Human Poses Across Datasets: A Unified Skeleton and Multi-Teacher Distillation Approach (May 30, 2024) · Activity Recognition, Knowledge Distillation
Evaluation-oriented Knowledge Distillation for Deep Face Recognition (Jun 6, 2022) · Face Recognition, Knowledge Distillation
Ever Evolving Evaluator (EV3): Towards Flexible and Reliable Meta-Optimization for Knowledge Distillation (Oct 29, 2023) · Diversity, Evolutionary Algorithms
Every Expert Matters: Towards Effective Knowledge Distillation for Mixture-of-Experts Language Models (Feb 18, 2025) · Knowledge Distillation, Mixture-of-Experts
Evidential Federated Learning for Skin Lesion Image Classification (Nov 15, 2024) · Classification, Federated Learning
EVOKE: Emotion Enabled Virtual Avatar Mapping Using Optimized Knowledge Distillation (Jan 13, 2024) · Emotion Recognition, Knowledge Distillation
Evolving Knowledge Distillation with Large Language Models and Active Learning (Mar 11, 2024) · Active Learning, Knowledge Distillation
Evolving Storytelling: Benchmarks and Methods for New Character Customization with Diffusion Models (May 20, 2024) · Knowledge Distillation, Story Generation
Examining the Mapping Functions of Denoising Autoencoders in Singing Voice Separation (Apr 12, 2019) · Decoder, Denoising
Exclusivity-Consistency Regularized Knowledge Distillation for Face Recognition (Aug 1, 2020) · Diversity, Face Recognition
Expanding Deep Learning-based Sensing Systems with Multi-Source Knowledge Transfer (Dec 5, 2024) · Deep Learning, Knowledge Distillation
ExpandNets: Linear Over-parameterization to Train Compact Convolutional Networks (Nov 26, 2018) · General Classification, Image Classification
Expediting Contrastive Language-Image Pretraining via Self-distilled Encoders (Dec 19, 2023) · Knowledge Distillation
Experimentation in Content Moderation using RWKV (Sep 5, 2024) · CPU, Knowledge Distillation
Experimenting with Knowledge Distillation techniques for performing Brain Tumor Segmentation (May 24, 2021) · Brain Tumor Segmentation, Knowledge Distillation
Explainability-Driven Leaf Disease Classification Using Adversarial Training and Knowledge Distillation (Dec 30, 2023) · Adversarial Attack, Classification
Explainable Knowledge Distillation for On-device Chest X-Ray Classification (May 10, 2023) · Explainable Artificial Intelligence (XAI)
Explainable LLM-driven Multi-dimensional Distillation for E-Commerce Relevance Learning (Nov 20, 2024) · Knowledge Distillation, Large Language Model
Explaining Knowledge Distillation by Quantifying the Knowledge (Mar 7, 2020) · Knowledge Distillation
Explaining Knowledge Graph Embedding via Latent Rule Learning (Sep 29, 2021) · Graph Embedding, Knowledge Distillation
Explaining Sequence-Level Knowledge Distillation as Data-Augmentation for Neural Machine Translation (Dec 6, 2019) · Data Augmentation, Knowledge Distillation
Explicit and Implicit Knowledge Distillation via Unlabeled Data (Feb 17, 2023) · Data-free Knowledge Distillation, Knowledge Distillation
Explicit Connection Distillation (Jan 1, 2021) · Image Classification
Explicit Knowledge Transfer for Weakly-Supervised Code Generation (Nov 30, 2022) · Code Generation, Few-Shot Learning
Exploiting Knowledge Distillation for Few-Shot Image Generation (Sep 29, 2021) · Diversity, Image Generation
Exploiting Unlabelled Photos for Stronger Fine-Grained SBIR (Mar 24, 2023) · Image Retrieval, Knowledge Distillation
Exploring and Enhancing the Transfer of Distribution in Knowledge Distillation for Autoregressive Language Models (Sep 19, 2024) · Knowledge Distillation
Exploring compressibility of transformer based text-to-music (TTM) models (Jun 24, 2024) · Decoder, FAD
Exploring Dark Knowledge under Various Teacher Capacities and Addressing Capacity Mismatch (May 21, 2024) · Knowledge Distillation
Exploring Dual Model Knowledge Distillation for Anomaly Detection (Jun 27, 2023) · Anomaly Detection, Feature Selection
Exploring Extreme Quantization in Spiking Language Models (May 4, 2024) · Knowledge Distillation, Language Modeling
Exploring Knowledge Distillation of a Deep Neural Network for Multi-Script Identification (Feb 20, 2021) · Knowledge Distillation, Transfer Learning
Fully Synthetic Data Improves Neural Machine Translation with Knowledge Distillation (Dec 31, 2020) · Knowledge Distillation, Machine Translation