AfroXLMR-Comet: Multilingual Knowledge Distillation with Attention Matching for Low-Resource Languages (Feb 25, 2025) [Knowledge Distillation, Language Modeling]
Improving Cone-Beam CT Image Quality with Knowledge Distillation-Enhanced Diffusion Model in Imbalanced Data Settings (Sep 19, 2024) [Computed Tomography (CT), Image Generation]
Improving Conversational Abilities of Quantized Large Language Models via Direct Preference Alignment (Jul 3, 2024) [Chatbot, Computational Efficiency]
Empowering Dual-Encoder with Query Generator for Cross-Lingual Dense Retrieval (Mar 27, 2023) [Knowledge Distillation, Retrieval]
Empirical Evaluation of Knowledge Distillation from Transformers to Subquadratic Language Models (Apr 19, 2025) [Knowledge Distillation, State Space Models]
GOLD: Generalized Knowledge Distillation via Out-of-Distribution-Guided Language Data Generation (Mar 28, 2024) [Data-free Knowledge Distillation, Knowledge Distillation]
Data-Free Knowledge Transfer: A Survey (Dec 31, 2021) [Data-free Knowledge Distillation, Domain Adaptation]
Complete-to-Partial 4D Distillation for Self-Supervised Point Cloud Sequence Representation Learning (Dec 10, 2022) [Knowledge Distillation, Representation Learning]
Knowledge distillation for optimization of quantized deep neural networks (Sep 4, 2019) [Knowledge Distillation]
Emo Pillars: Knowledge Distillation to Support Fine-Grained Context-Aware and Context-Less Emotion Classification (Apr 23, 2025) [Emotion Classification, GPU]
A Framework for Double-Blind Federated Adaptation of Foundation Models (Feb 3, 2025) [Federated Learning, Image Classification]
Embracing the Dark Knowledge: Domain Generalization Using Regularized Knowledge Distillation (Jul 6, 2021) [Domain Generalization, Image Classification]
EmbedDistill: A Geometric Knowledge Distillation for Information Retrieval (Jan 27, 2023) [Information Retrieval, Knowledge Distillation]
Completely Heterogeneous Federated Learning (Oct 28, 2022) [Data-free Knowledge Distillation, Federated Learning]
Data Techniques For Online End-to-end Speech Recognition (Jan 24, 2020) [Data Augmentation, Domain Adaptation]
Gradient Reweighting: Towards Imbalanced Class-Incremental Learning (Feb 28, 2024) [Class-Incremental Learning]
Embedding Compression for Teacher-to-Student Knowledge Transfer (Feb 9, 2024) [Knowledge Distillation, Transfer Learning]
Graph-Adaptive Pruning for Efficient Inference of Convolutional Neural Networks (Nov 21, 2018) [Knowledge Distillation, Model Compression]
Asymmetric Image Retrieval with Cross Model Compatible Ensembles (Mar 30, 2023) [Diversity, Face Recognition]
ABKD: Graph Neural Network Compression with Attention-Based Knowledge Distillation (Oct 24, 2023) [Drug Discovery, Fake News Detection]
Beyond the Tip of Efficiency: Uncovering the Submerged Threats of Jailbreak Attacks in Small Language Models (Feb 27, 2025) [Knowledge Distillation, Model Compression]
KNIFE: Distilling Reasoning Knowledge From Free-Text Rationales (Dec 19, 2022) [Knowledge Distillation, Language Modelling]
Embedded Knowledge Distillation in Depth-Level Dynamic Neural Network (Mar 1, 2021) [Dynamic Neural Networks, Knowledge Distillation]
ELiTe: Efficient Image-to-LiDAR Knowledge Transfer for Semantic Segmentation (May 7, 2024) [Knowledge Distillation, LiDAR Semantic Segmentation]
Comparison of Soft and Hard Target RNN-T Distillation for Large-scale ASR (Oct 11, 2022) [Automatic Speech Recognition (ASR)]
ADPS: Asymmetric Distillation Post-Segmentation for Image Anomaly Detection (Oct 19, 2022) [Anomaly Detection, Anomaly Localization]
VizECGNet: Visual ECG Image Network for Cardiovascular Diseases Classification with Multi-Modal Training and Knowledge Distillation (Aug 6, 2024) [ECG Classification, Knowledge Distillation]
ELAICHI: Enhancing Low-resource TTS by Addressing Infrequent and Low-frequency Character Bigrams (Oct 23, 2024) [Automatic Speech Recognition (ASR)]
ELAD: Explanation-Guided Large Language Models Active Distillation (Feb 20, 2024) [Active Learning, Knowledge Distillation]
EI-MTD: Moving Target Defense for Edge Intelligence against Adversarial Attacks (Sep 19, 2020) [Knowledge Distillation, Scheduling]
AKE-GNN: Effective Graph Learning with Adaptive Knowledge Exchange (Jun 10, 2021) [Classification, Graph Classification]
Dealing with Missing Modalities in the Visual Question Answer-Difference Prediction Task through Knowledge Distillation (Apr 13, 2021) [Knowledge Distillation, Triplet]
IOR: Inversed Objects Replay for Incremental Object Detection (Jun 7, 2024) [Knowledge Distillation, Object]
Comparing Fisher Information Regularization with Distillation for DNN Quantization (Oct 19, 2020) [Knowledge Distillation, Quantization]
Ground-V: Teaching VLMs to Ground Complex Instructions in Pixels (May 20, 2025) [Instruction Following, Knowledge Distillation]
Group channel pruning and spatial attention distilling for object detection (Jun 2, 2023) [Knowledge Distillation, Model Compression]
Improving Autoregressive NMT with Non-Autoregressive Model (Jul 1, 2020) [Decoder, de-en]
Grouped Knowledge Distillation for Deep Face Recognition (Apr 10, 2023) [Face Recognition, Knowledge Distillation]
Improving CLIP Robustness with Knowledge Distillation and Self-Training (Sep 19, 2023) [Knowledge Distillation]
Group-Mix SAM: Lightweight Solution for Industrial Assembly Line Applications (Mar 15, 2024) [Knowledge Distillation]
Improving Defensive Distillation using Teacher Assistant (May 14, 2023) [Face Recognition, Knowledge Distillation]
Improving Mathematical Reasoning Capabilities of Small Language Models via Feedback-Driven Distillation (Nov 22, 2024) [Knowledge Distillation, Mathematical Reasoning]
ESGN: Efficient Stereo Geometry Network for Fast 3D Object Detection (Nov 28, 2021) [3D Object Detection, Knowledge Distillation]
Active Learning for Lane Detection: A Knowledge Distillation Approach (Jan 1, 2021) [2D Object Detection, Active Learning]
Asymmetric Decision-Making in Online Knowledge Distillation: Unifying Consensus and Divergence (Mar 9, 2025) [Decision Making, Knowledge Distillation]
Improved training of binary networks for human pose estimation and image recognition (Apr 11, 2019) [Binarization, Classification with Binary Neural Network]
Guiding CTC Posterior Spike Timings for Improved Posterior Fusion and Knowledge Distillation (Apr 17, 2019) [Automatic Speech Recognition (ASR)]
Decision Boundary-aware Knowledge Consolidation Generates Better Instance-Incremental Learner (Jun 5, 2024) [Class-Incremental Learning]
Guiding Teacher Forcing with Seer Forcing for Neural Machine Translation (Jun 12, 2021) [Decoder, Knowledge Distillation]
EfficientViT-SAM: Accelerated Segment Anything Model Without Accuracy Loss (Feb 7, 2024) [Decoder, GPU]