Efficient Verified Machine Unlearning For Distillation (Mar 28, 2025): Knowledge Distillation, Machine Unlearning
Efficient Video Segmentation Models with Per-frame Inference (Feb 24, 2022): Image Matting, Instance Segmentation
EfficientViT-SAM: Accelerated Segment Anything Model Without Accuracy Loss (Feb 7, 2024): Decoder, GPU
ESGN: Efficient Stereo Geometry Network for Fast 3D Object Detection (Nov 28, 2021): 3D Object Detection, Knowledge Distillation
IOR: Inversed Objects Replay for Incremental Object Detection (Jun 7, 2024): Knowledge Distillation, Object
EI-MTD: Moving Target Defense for Edge Intelligence against Adversarial Attacks (Sep 19, 2020): Knowledge Distillation, Scheduling
ELAD: Explanation-Guided Large Language Models Active Distillation (Feb 20, 2024): Active Learning, Knowledge Distillation
ELAICHI: Enhancing Low-resource TTS by Addressing Infrequent and Low-frequency Character Bigrams (Oct 23, 2024): Automatic Speech Recognition (ASR)
ELiTe: Efficient Image-to-LiDAR Knowledge Transfer for Semantic Segmentation (May 7, 2024): Knowledge Distillation, LIDAR Semantic Segmentation
Embedded Knowledge Distillation in Depth-Level Dynamic Neural Network (Mar 1, 2021): Dynamic Neural Networks, Knowledge Distillation
Embedding Compression for Teacher-to-Student Knowledge Transfer (Feb 9, 2024): Knowledge Distillation, Transfer Learning
EmbedDistill: A Geometric Knowledge Distillation for Information Retrieval (Jan 27, 2023): Information Retrieval, Knowledge Distillation
Embracing the Dark Knowledge: Domain Generalization Using Regularized Knowledge Distillation (Jul 6, 2021): Domain Generalization, Image Classification
Emo Pillars: Knowledge Distillation to Support Fine-Grained Context-Aware and Context-Less Emotion Classification (Apr 23, 2025): Emotion Classification, GPU
Knowledge distillation for optimization of quantized deep neural networks (Sep 4, 2019): Knowledge Distillation
Empirical Evaluation of Knowledge Distillation from Transformers to Subquadratic Language Models (Apr 19, 2025): Knowledge Distillation, State Space Models
Empowering Dual-Encoder with Query Generator for Cross-Lingual Dense Retrieval (Mar 27, 2023): Knowledge Distillation, Retrieval
Empowering Knowledge Distillation via Open Set Recognition for Robust 3D Point Cloud Classification (Oct 25, 2020): 3D Point Cloud Classification, General Classification
Enabling Multimodal Generation on CLIP via Vision-Language Knowledge Distillation (Nov 16, 2021): Image Captioning, Knowledge Distillation
Enabling Multimodal Generation on CLIP via Vision-Language Knowledge Distillation (Mar 12, 2022): Image Captioning, Knowledge Distillation
Enabling Weak Client Participation via On-device Knowledge Distillation in Heterogenous Federated Learning (Mar 14, 2025): Federated Learning, Knowledge Distillation
EncodeNet: A Framework for Boosting DNN Accuracy with Entropy-driven Generalized Converting Autoencoder (Apr 21, 2024): Image Classification
Endpoints Weight Fusion for Class Incremental Semantic Segmentation (Jan 1, 2023): Class Incremental Learning
End-to-End Automatic Speech Recognition with Deep Mutual Learning (Feb 16, 2021): Automatic Speech Recognition (ASR)
End-to-end fully-binarized network design: from Generic Learned Thermometer to Block Pruning (May 5, 2025): Knowledge Distillation, Quantization
End-to-End Simultaneous Speech Translation with Pretraining and Distillation: Huawei Noah's System for AutoSimTranS 2022 (Jul 1, 2022): Decoder, Knowledge Distillation
End-to-End Speech Translation with Knowledge Distillation (Apr 17, 2019): Knowledge Distillation, Speech Recognition
End-to-End Speech-Translation with Knowledge Distillation: FBK@IWSLT2020 (Jun 4, 2020): Data Augmentation, Knowledge Distillation
Energy-efficient Knowledge Distillation for Spiking Neural Networks (Jun 14, 2021): Knowledge Distillation, Model Compression
Enhanced Multimodal Representation Learning with Cross-modal KD (Jun 13, 2023): Contrastive Learning, Emotion Classification
Enhanced Sparsification via Stimulative Training (Mar 11, 2024): Knowledge Distillation, Model Compression
Enhancing Abstractiveness of Summarization Models through Calibrated Distillation (Oct 20, 2023): Abstractive Text Summarization, Informativeness
Enhancing Accuracy and Parameter-Efficiency of Neural Representations for Network Parameterization (Jun 29, 2024): Knowledge Distillation
Enhancing Action Recognition from Low-Quality Skeleton Data via Part-Level Knowledge Distillation (Apr 28, 2024): Action Recognition, General Knowledge
Enhancing Adversarial Training with Prior Knowledge Distillation for Robust Image Compression (Mar 11, 2024): Backdoor Attack, Image Compression
Enhancing Chinese Multi-Label Text Classification Performance with Response-based Knowledge Distillation (Nov 1, 2022): Knowledge Distillation, Multi-Label Text Classification
Enhancing Content Representation for AR Image Quality Assessment Using Knowledge Distillation (Dec 8, 2024): Image Quality Assessment, Knowledge Distillation
Enhancing CTC-Based Visual Speech Recognition (Sep 11, 2024): Automatic Speech Recognition (ASR)
Enhancing Data-Free Adversarial Distillation with Activation Regularization and Virtual Interpolation (Feb 23, 2021): Knowledge Distillation
Enhancing Few-shot Keyword Spotting Performance through Pre-Trained Self-supervised Speech Models (Jun 21, 2025): Dimensionality Reduction, Keyword Spotting
Enhancing Generalization in Chain of Thought Reasoning for Smaller Models (Jan 16, 2025): Knowledge Distillation, Memorization
Enhancing Mapless Trajectory Prediction through Knowledge Distillation (Jun 25, 2023): Autonomous Driving, Knowledge Distillation
Enhancing Modality-Agnostic Representations via Meta-Learning for Brain Tumor Segmentation (Feb 8, 2023): Brain Tumor Segmentation, Image Generation
Enhancing Once-For-All: A Study on Parallel Blocks, Skip Connections and Early Exits (Feb 3, 2023): Knowledge Distillation
Enhancing Review Comprehension with Domain-Specific Commonsense (Apr 6, 2020): Aspect Extraction, Knowledge Distillation
Enhancing Romanian Offensive Language Detection through Knowledge Distillation, Multi-Task Learning, and Data Augmentation (Sep 30, 2024): Data Augmentation, Knowledge Distillation
Enhancing Scalability in Recommender Systems through Lottery Ticket Hypothesis and Knowledge Distillation-based Neural Network Pruning (Jan 19, 2024): GPU, Knowledge Distillation
Enhancing Semi-supervised Learning with Zero-shot Pseudolabels (Feb 18, 2025): Knowledge Distillation
Enhancing Single-Slice Segmentation with 3D-to-2D Unpaired Scan Distillation (Jun 18, 2024): Computed Tomography (CT), Knowledge Distillation
Enhancing SLM via ChatGPT and Dataset Augmentation (Sep 19, 2024): Knowledge Distillation, Natural Language Inference