Embedded Knowledge Distillation in Depth-Level Dynamic Neural Network (Mar 1, 2021): Dynamic neural networks, Knowledge Distillation
ELiTe: Efficient Image-to-LiDAR Knowledge Transfer for Semantic Segmentation (May 7, 2024): Knowledge Distillation, LIDAR Semantic Segmentation
Comparison of Soft and Hard Target RNN-T Distillation for Large-scale ASR (Oct 11, 2022): Automatic Speech Recognition, Automatic Speech Recognition (ASR)
Handling Long-tailed Feature Distribution in AdderNets (Dec 1, 2021): Knowledge Distillation
Hands-on Guidance for Distilling Object Detectors (Mar 26, 2021): Knowledge Distillation, Object
HanjaBridge: Resolving Semantic Ambiguity in Korean LLMs via Hanja-Augmented Pre-Training (Jul 15, 2025): Cross-Lingual Transfer, Knowledge Distillation
ADPS: Asymmetric Distillation Post-Segmentation for Image Anomaly Detection (Oct 19, 2022): Anomaly Detection, Anomaly Localization
HARD: Hard Augmentations for Robust Distillation (May 24, 2023): Data Augmentation, Domain Generalization
VizECGNet: Visual ECG Image Network for Cardiovascular Diseases Classification with Multi-Modal Training and Knowledge Distillation (Aug 6, 2024): ECG Classification, Knowledge Distillation
Harmonizing knowledge Transfer in Neural Network with Unified Distillation (Sep 27, 2024): Knowledge Distillation, Transfer Learning
ELAICHI: Enhancing Low-resource TTS by Addressing Infrequent and Low-frequency Character Bigrams (Oct 23, 2024): Automatic Speech Recognition, Automatic Speech Recognition (ASR)
ELAD: Explanation-Guided Large Language Models Active Distillation (Feb 20, 2024): Active Learning, Knowledge Distillation
EI-MTD: Moving Target Defense for Edge Intelligence against Adversarial Attacks (Sep 19, 2020): Knowledge Distillation, Scheduling
hdl2v: A Code Translation Dataset for Enhanced LLM Verilog Generation (Jun 5, 2025): Code Generation, Code Translation
Headache to Overstock? Promoting Long-tail Items through Debiased Product Bundling (Nov 28, 2024): Knowledge Distillation, Navigate
IOR: Inversed Objects Replay for Incremental Object Detection (Jun 7, 2024): Knowledge Distillation, Object
Comparing Fisher Information Regularization with Distillation for DNN Quantization (Oct 19, 2020): Knowledge Distillation, Quantization
Head-Tail-Aware KL Divergence in Knowledge Distillation for Spiking Neural Networks (Apr 29, 2025): Knowledge Distillation, Transfer Learning
Incorporating Ultrasound Tongue Images for Audio-Visual Speech Enhancement through Knowledge Distillation (May 24, 2023): Automatic Speech Recognition, Automatic Speech Recognition (ASR)
HEAT: Hardware-Efficient Automatic Tensor Decomposition for Transformer Compression (Nov 30, 2022): Efficient Exploration, Knowledge Distillation
HeteFedRec: Federated Recommender Systems with Model Heterogeneity (Jul 24, 2023): Knowledge Distillation, model
Heterogeneity-aware Personalized Federated Learning via Adaptive Dual-Agent Reinforcement Learning (Jan 28, 2025): Federated Learning, Knowledge Distillation
Incorporating Ultrasound Tongue Images for Audio-Visual Speech Enhancement (Sep 19, 2023): Automatic Speech Recognition, Automatic Speech Recognition (ASR)
Heterogeneous Continual Learning (Jun 14, 2023): Continual Learning, Knowledge Distillation
Incremental Learning for End-to-End Automatic Speech Recognition (May 11, 2020): Automatic Speech Recognition, Automatic Speech Recognition (ASR)
Heterogeneous Federated Learning Using Knowledge Codistillation (Oct 4, 2023): Federated Learning, image-classification
ESGN: Efficient Stereo Geometry Network for Fast 3D Object Detection (Nov 28, 2021): 3D Object Detection, Knowledge Distillation
Active Learning for Lane Detection: A Knowledge Distillation Approach (Jan 1, 2021): 2D Object Detection, Active Learning
HFedCKD: Toward Robust Heterogeneous Federated Learning via Data-free Knowledge Distillation and Two-way Contrast (Mar 9, 2025): Data-free Knowledge Distillation, Federated Learning
Asymmetric Decision-Making in Online Knowledge Distillation: Unifying Consensus and Divergence (Mar 9, 2025): Decision Making, Knowledge Distillation
Improving Video Model Transfer With Dynamic Representation Learning (Jan 1, 2022): Action Classification, Knowledge Distillation
Hierarchical Knowledge Distillation on Text Graph for Data-limited Attribute Inference (Jan 10, 2024): Attribute, Few-Shot Learning
Hierarchical Selective Classification (May 19, 2024): Classification, Knowledge Distillation
EfficientViT-SAM: Accelerated Segment Anything Model Without Accuracy Loss (Feb 7, 2024): Decoder, GPU
Compact Speaker Embedding: lrx-vector (Aug 11, 2020): Knowledge Distillation, Speaker Recognition
Efficient Video Segmentation Models with Per-frame Inference (Feb 24, 2022): Image Matting, Instance Segmentation
Efficient Verified Machine Unlearning For Distillation (Mar 28, 2025): Knowledge Distillation, Machine Unlearning
Discovery of novel antimicrobial peptides with notable antibacterial potency by a LLM-based foundation model (Jul 17, 2024): Knowledge Distillation, scientific discovery
Efficient Transformer Knowledge Distillation: A Performance Review (Nov 22, 2023): Knowledge Distillation, Model Compression
High Performance Natural Language Processing (Nov 1, 2020): Knowledge Distillation, Quantization
Efficient Transformer-based Large Scale Language Representations using Hardware-friendly Block Structured Pruning (Sep 17, 2020): Edge-computing, Knowledge Distillation
Hint-dynamic Knowledge Distillation (Nov 30, 2022): Knowledge Distillation
Compacting Deep Neural Networks for Internet of Things: Methods and Applications (Mar 20, 2021): Diversity, Knowledge Distillation
Efficient training of lightweight neural networks using Online Self-Acquired Knowledge Distillation (Aug 26, 2021): Density Estimation, Knowledge Distillation
Compact CNN Structure Learning by Knowledge Distillation (Apr 19, 2021): Knowledge Distillation, Model Compression
HKD4VLM: A Progressive Hybrid Knowledge Distillation Framework for Robust Multimodal Hallucination and Factuality Detection in VLMs (Jun 16, 2025): Hallucination, Knowledge Distillation
A Survey on Transformer Compression (Feb 5, 2024): Knowledge Distillation, Mamba
Improving Text-based Early Prediction by Distillation from Privileged Time-Series Text (Jan 26, 2023): Knowledge Distillation, Prediction
Deep Learning for Medical Text Processing: BERT Model Fine-Tuning and Comparative Study (Oct 28, 2024): Knowledge Distillation
Compact CNN Models for On-device Ocular-based User Recognition in Mobile Devices (Oct 11, 2021): Knowledge Distillation, Network Pruning