Enhancing Adversarial Training with Prior Knowledge Distillation for Robust Image Compression | Mar 11, 2024 | Backdoor Attack, Image Compression
Enhancing Chinese Multi-Label Text Classification Performance with Response-based Knowledge Distillation | Nov 1, 2022 | Knowledge Distillation, Multi-Label Text Classification
Enhancing Content Representation for AR Image Quality Assessment Using Knowledge Distillation | Dec 8, 2024 | Image Quality Assessment, Knowledge Distillation
Enhancing CTC-Based Visual Speech Recognition | Sep 11, 2024 | Automatic Speech Recognition, Automatic Speech Recognition (ASR)
Enhancing Data-Free Adversarial Distillation with Activation Regularization and Virtual Interpolation | Feb 23, 2021 | Knowledge Distillation
Enhancing Few-shot Keyword Spotting Performance through Pre-Trained Self-supervised Speech Models | Jun 21, 2025 | Dimensionality Reduction, Keyword Spotting
Enhancing Generalization in Chain of Thought Reasoning for Smaller Models | Jan 16, 2025 | Knowledge Distillation, Memorization
Enhancing Mapless Trajectory Prediction through Knowledge Distillation | Jun 25, 2023 | Autonomous Driving, Knowledge Distillation
Enhancing Modality-Agnostic Representations via Meta-Learning for Brain Tumor Segmentation | Feb 8, 2023 | Brain Tumor Segmentation, Image Generation
Enhancing Once-For-All: A Study on Parallel Blocks, Skip Connections and Early Exits | Feb 3, 2023 | All, Knowledge Distillation
Enhancing Review Comprehension with Domain-Specific Commonsense | Apr 6, 2020 | Aspect Extraction, Knowledge Distillation
Enhancing Romanian Offensive Language Detection through Knowledge Distillation, Multi-Task Learning, and Data Augmentation | Sep 30, 2024 | Data Augmentation, Knowledge Distillation
Enhancing Scalability in Recommender Systems through Lottery Ticket Hypothesis and Knowledge Distillation-based Neural Network Pruning | Jan 19, 2024 | GPU, Knowledge Distillation
Enhancing Semi-supervised Learning with Zero-shot Pseudolabels | Feb 18, 2025 | Knowledge Distillation
Enhancing Single-Slice Segmentation with 3D-to-2D Unpaired Scan Distillation | Jun 18, 2024 | Computed Tomography (CT), Knowledge Distillation
Enhancing SLM via ChatGPT and Dataset Augmentation | Sep 19, 2024 | Knowledge Distillation, Natural Language Inference
Enhancing Systematic Decompositional Natural Language Inference Using Informal Logic | Feb 22, 2024 | Formal Logic, Knowledge Distillation
— Unverified 0Ensemble Knowledge Distillation for CTR Prediction Nov 8, 2020 Click-Through Rate Prediction Knowledge Distillation
— Unverified 0Ensemble Distillation for Neural Machine Translation Feb 6, 2017 Knowledge Distillation Machine Translation
Ensemble Knowledge Distillation for Machine Learning Interatomic Potentials | Mar 18, 2025 | Atomic Forces, Knowledge Distillation
Ensemble knowledge distillation of self-supervised speech models | Feb 24, 2023 | Automatic Speech Recognition, Automatic Speech Recognition (ASR)
Ensembling of Distilled Models from Multi-task Teachers for Constrained Resource Language Pairs | Nov 26, 2021 | Knowledge Distillation, Translation
EnSiam: Self-Supervised Learning With Ensemble Representations | May 22, 2023 | Contrastive Learning, Knowledge Distillation
Entire-Space Variational Information Exploitation for Post-Click Conversion Rate Prediction | Dec 17, 2024 | Knowledge Distillation, Recommendation Systems
EPIK: Eliminating multi-model Pipelines with Knowledge-distillation | Nov 27, 2022 | Knowledge Distillation, Transliteration
EPSD: Early Pruning with Self-Distillation for Efficient Model Compression | Jan 31, 2024 | Knowledge Distillation, Model Compression
ERNIE-Search: Bridging Cross-Encoder with Dual-Encoder via Self On-the-fly Distillation for Dense Passage Retrieval | May 18, 2022 | Knowledge Distillation, Open-Domain Question Answering
Error Exponent in Agnostic PAC Learning | May 1, 2024 | Binary Classification, Knowledge Distillation
ESLM: Risk-Averse Selective Language Modeling for Efficient Pretraining | May 26, 2025 | Knowledge Distillation, Language Modeling
ESPnet How2 Speech Translation System for IWSLT 2019: Pre-training, Knowledge Distillation, and Going Deeper | Nov 1, 2019 | All, Knowledge Distillation
ESPnet-ST IWSLT 2021 Offline Speech Translation System | Jul 1, 2021 | Decoder, Knowledge Distillation
Essence Knowledge Distillation for Speech Recognition | Jun 26, 2019 | Knowledge Distillation, Speech Recognition
Estimating and Maximizing Mutual Information for Knowledge Distillation | Oct 29, 2021 | Knowledge Distillation
Estimating Human Poses Across Datasets: A Unified Skeleton and Multi-Teacher Distillation Approach | May 30, 2024 | Activity Recognition, Knowledge Distillation
Evaluation-oriented Knowledge Distillation for Deep Face Recognition | Jun 6, 2022 | Face Recognition, Knowledge Distillation
Ever Evolving Evaluator (EV3): Towards Flexible and Reliable Meta-Optimization for Knowledge Distillation | Oct 29, 2023 | Diversity, Evolutionary Algorithms
Every Expert Matters: Towards Effective Knowledge Distillation for Mixture-of-Experts Language Models | Feb 18, 2025 | Knowledge Distillation, Mixture-of-Experts
Evidential Federated Learning for Skin Lesion Image Classification | Nov 15, 2024 | Classification, Federated Learning
EVOKE: Emotion Enabled Virtual Avatar Mapping Using Optimized Knowledge Distillation | Jan 13, 2024 | Emotion Recognition, Knowledge Distillation
Evolving Knowledge Distillation with Large Language Models and Active Learning | Mar 11, 2024 | Active Learning, Knowledge Distillation
Evolving Storytelling: Benchmarks and Methods for New Character Customization with Diffusion Models | May 20, 2024 | Knowledge Distillation, Story Generation
Examining the Mapping Functions of Denoising Autoencoders in Singing Voice Separation | Apr 12, 2019 | Decoder, Denoising
Exclusivity-Consistency Regularized Knowledge Distillation for Face Recognition | Aug 1, 2020 | Diversity, Face Recognition
Expanding Deep Learning-based Sensing Systems with Multi-Source Knowledge Transfer | Dec 5, 2024 | Deep Learning, Knowledge Distillation
ExpandNets: Linear Over-parameterization to Train Compact Convolutional Networks | Nov 26, 2018 | General Classification, Image Classification
Expediting Contrastive Language-Image Pretraining via Self-distilled Encoders | Dec 19, 2023 | Knowledge Distillation
Experimentation in Content Moderation using RWKV | Sep 5, 2024 | CPU, Knowledge Distillation
Experimenting with Knowledge Distillation techniques for performing Brain Tumor Segmentation | May 24, 2021 | Brain Tumor Segmentation, Knowledge Distillation
Explainability-Driven Leaf Disease Classification Using Adversarial Training and Knowledge Distillation | Dec 30, 2023 | Adversarial Attack, Classification
Explainable Knowledge Distillation for On-device Chest X-Ray Classification | May 10, 2023 | Explainable Artificial Intelligence (XAI)