Efficient Image Compression Using Advanced State Space Models (Sep 4, 2024) [Computational Efficiency, Image Compression]
Efficient Inference via Universal LSH Kernel (Jun 21, 2021) [Knowledge Distillation, Quantization]
Efficient Intent-Based Filtering for Multi-Party Conversations Using Knowledge Distillation from LLMs (Mar 21, 2025) [Intent Classification]
Efficient Knowledge Distillation: Empowering Small Language Models with Teacher Model Insights (Sep 19, 2024) [Decision Making, Knowledge Distillation]
Efficient Knowledge Distillation of SAM for Medical Image Segmentation (Jan 28, 2025) [Computational Efficiency, Decoder]
Efficient Knowledge Distillation via Curriculum Extraction (Mar 21, 2025) [Knowledge Distillation, Language Modeling]
Efficient Machine Translation with Model Pruning and Quantization (Nov 1, 2021) [CPU, Decoder]
Efficient Object Detection in Optical Remote Sensing Imagery via Attention-based Feature Distillation (Oct 28, 2023) [Knowledge Distillation, Object]
Efficient Open-world Reinforcement Learning via Knowledge Distillation and Autonomous Rule Discovery (Nov 24, 2023) [Deep Reinforcement Learning, Knowledge Distillation]
Efficient Point Cloud Classification via Offline Distillation Framework and Negative-Weight Self-Distillation Technique (Sep 3, 2024) [Data Augmentation, Knowledge Distillation]
Efficient Speech Command Recognition Leveraging Spiking Neural Network and Curriculum Learning-based Knowledge Distillation (Dec 17, 2024) [Edge-computing, Knowledge Distillation]
Efficient speech detection in environmental audio using acoustic recognition and knowledge distillation (Dec 14, 2023) [Knowledge Distillation, Model Selection]
Efficient Technical Term Translation: A Knowledge Distillation Approach for Parenthetical Terminology Translation (Oct 1, 2024) [Knowledge Distillation, Machine Translation]
Efficient training of lightweight neural networks using Online Self-Acquired Knowledge Distillation (Aug 26, 2021) [Density Estimation, Knowledge Distillation]
Efficient Transformer-based Large Scale Language Representations using Hardware-friendly Block Structured Pruning (Sep 17, 2020) [Edge-computing, Knowledge Distillation]
Efficient Transformer Knowledge Distillation: A Performance Review (Nov 22, 2023) [Knowledge Distillation, Model Compression]
Efficient Verified Machine Unlearning For Distillation (Mar 28, 2025) [Knowledge Distillation, Machine Unlearning]
Efficient Video Segmentation Models with Per-frame Inference (Feb 24, 2022) [Image Matting, Instance Segmentation]
EfficientViT-SAM: Accelerated Segment Anything Model Without Accuracy Loss (Feb 7, 2024) [Decoder, GPU]
ESGN: Efficient Stereo Geometry Network for Fast 3D Object Detection (Nov 28, 2021) [3D Object Detection, Knowledge Distillation]
IOR: Inversed Objects Replay for Incremental Object Detection (Jun 7, 2024) [Knowledge Distillation, Object]
EI-MTD: Moving Target Defense for Edge Intelligence against Adversarial Attacks (Sep 19, 2020) [Knowledge Distillation, Scheduling]
ELAD: Explanation-Guided Large Language Models Active Distillation (Feb 20, 2024) [Active Learning, Knowledge Distillation]
ELAICHI: Enhancing Low-resource TTS by Addressing Infrequent and Low-frequency Character Bigrams (Oct 23, 2024) [Automatic Speech Recognition (ASR)]
ELiTe: Efficient Image-to-LiDAR Knowledge Transfer for Semantic Segmentation (May 7, 2024) [Knowledge Distillation, LIDAR Semantic Segmentation]
Embedded Knowledge Distillation in Depth-Level Dynamic Neural Network (Mar 1, 2021) [Dynamic Neural Networks, Knowledge Distillation]
Embedding Compression for Teacher-to-Student Knowledge Transfer (Feb 9, 2024) [Knowledge Distillation, Transfer Learning]
EmbedDistill: A Geometric Knowledge Distillation for Information Retrieval (Jan 27, 2023) [Information Retrieval, Knowledge Distillation]
Embracing the Dark Knowledge: Domain Generalization Using Regularized Knowledge Distillation (Jul 6, 2021) [Domain Generalization, Image Classification]
Emo Pillars: Knowledge Distillation to Support Fine-Grained Context-Aware and Context-Less Emotion Classification (Apr 23, 2025) [Emotion Classification, GPU]
Knowledge distillation for optimization of quantized deep neural networks (Sep 4, 2019) [Knowledge Distillation]
Empirical Evaluation of Knowledge Distillation from Transformers to Subquadratic Language Models (Apr 19, 2025) [Knowledge Distillation, State Space Models]
Empowering Dual-Encoder with Query Generator for Cross-Lingual Dense Retrieval (Mar 27, 2023) [Knowledge Distillation, Retrieval]
Empowering Knowledge Distillation via Open Set Recognition for Robust 3D Point Cloud Classification (Oct 25, 2020) [3D Point Cloud Classification, General Classification]
Enabling Multimodal Generation on CLIP via Vision-Language Knowledge Distillation (Nov 16, 2021) [Image Captioning, Knowledge Distillation]
Enabling Weak Client Participation via On-device Knowledge Distillation in Heterogenous Federated Learning (Mar 14, 2025) [Federated Learning, Knowledge Distillation]
EncodeNet: A Framework for Boosting DNN Accuracy with Entropy-driven Generalized Converting Autoencoder (Apr 21, 2024) [Image Classification]
Endpoints Weight Fusion for Class Incremental Semantic Segmentation (Jan 1, 2023) [Class Incremental Learning]
End-to-End Automatic Speech Recognition with Deep Mutual Learning (Feb 16, 2021) [Automatic Speech Recognition (ASR)]
End-to-end fully-binarized network design: from Generic Learned Thermometer to Block Pruning (May 5, 2025) [Knowledge Distillation, Quantization]
End-to-End Simultaneous Speech Translation with Pretraining and Distillation: Huawei Noah's System for AutoSimTranS 2022 (Jul 1, 2022) [Decoder, Knowledge Distillation]
End-to-End Speech Translation with Knowledge Distillation (Apr 17, 2019) [Knowledge Distillation, Speech Recognition]
End-to-End Speech-Translation with Knowledge Distillation: FBK@IWSLT2020 (Jun 4, 2020) [Data Augmentation, Knowledge Distillation]
Energy-efficient Knowledge Distillation for Spiking Neural Networks (Jun 14, 2021) [Knowledge Distillation, Model Compression]
Enhanced Multimodal Representation Learning with Cross-modal KD (Jun 13, 2023) [Contrastive Learning, Emotion Classification]
Enhanced Sparsification via Stimulative Training (Mar 11, 2024) [Knowledge Distillation, Model Compression]
Enhancing Abstractiveness of Summarization Models through Calibrated Distillation (Oct 20, 2023) [Abstractive Text Summarization, Informativeness]
Enhancing Accuracy and Parameter-Efficiency of Neural Representations for Network Parameterization (Jun 29, 2024) [Knowledge Distillation]
Enhancing Action Recognition from Low-Quality Skeleton Data via Part-Level Knowledge Distillation (Apr 28, 2024) [Action Recognition, General Knowledge]