- Efficient Machine Translation with Model Pruning and Quantization (Nov 1, 2021) · CPU, Decoder
- LightVessel: Exploring Lightweight Coronary Artery Vessel Segmentation via Similarity Knowledge Distillation (Nov 2, 2022) · Decoder, Knowledge Distillation
- INDUS: Effective and Efficient Language Models for Scientific Applications (May 17, 2024) · Contrastive Learning, Information Retrieval
- Industry Scale Semi-Supervised Learning for Natural Language Understanding (Mar 29, 2021) · Intent Classification
- InfantCryNet: A Data-driven Framework for Intelligent Analysis of Infant Cries (Sep 29, 2024) · Knowledge Distillation, Model Compression
- InFiConD: Interactive No-code Fine-tuning with Concept-based Knowledge Distillation (Jun 25, 2024) · Knowledge Distillation
- Combining Curriculum Learning and Knowledge Distillation for Dialogue Generation (Nov 1, 2021) · Dialogue Generation, Knowledge Distillation
- Information-Theoretic GAN Compression with Variational Energy-based Model (Mar 28, 2023) · Image Enhancement, Knowledge Distillation
- Combining Compressions for Multiplicative Size Scaling on Natural Language Tasks (Aug 20, 2022) · Knowledge Distillation, Neural Network Compression
- ABC-KD: Attention-Based-Compression Knowledge Distillation for Deep Learning-Based Noise Suppression (May 26, 2023) · Knowledge Distillation
- Knowledge Distillation for 6D Pose Estimation by Aligning Distributions of Local Predictions (May 30, 2022) · 6D Pose Estimation, 6D Pose Estimation using RGB
- Knowledge Distillation for Adaptive MRI Prostate Segmentation Based on Limit-Trained Multi-Teacher Models (Mar 16, 2023) · Knowledge Distillation, MRI Segmentation
- InhibiDistilbert: Knowledge Distillation for a ReLU and Addition-based Transformer (Mar 20, 2025) · Knowledge Distillation, Model Compression
- Initial Classifier Weights Replay for Memoryless Class Incremental Learning (Aug 31, 2020) · Class-Incremental Learning
- Knowledge distillation for fast and accurate DNA sequence correction (Nov 17, 2022) · Knowledge Distillation
- Efficient Knowledge Distillation via Curriculum Extraction (Mar 21, 2025) · Knowledge Distillation, Language Modeling
- Efficient Knowledge Distillation of SAM for Medical Image Segmentation (Jan 28, 2025) · Computational Efficiency, Decoder
- Injecting Spatial Information for Monaural Speech Enhancement via Knowledge Distillation (Dec 2, 2022) · Knowledge Distillation, Speech Enhancement
- Inplace knowledge distillation with teacher assistant for improved training of flexible deep neural networks (May 18, 2021) · Image Classification
- In-situ animal behavior classification using knowledge distillation and fixed-point quantization (Sep 9, 2022) · Classification, Knowledge Distillation
- Instance-aware Model Ensemble With Distillation For Unsupervised Domain Adaptation (Nov 15, 2022) · Domain Adaptation, Knowledge Distillation
- Collective Wisdom: Improving Low-resource Neural Machine Translation using Adaptive Knowledge Distillation (Oct 12, 2020) · Knowledge Distillation, Low Resource Neural Machine Translation
- Efficient Knowledge Distillation: Empowering Small Language Models with Teacher Model Insights (Sep 19, 2024) · Decision Making, Knowledge Distillation
- Distill, Adapt, Distill: Training Small, In-Domain Models for Neural Machine Translation (Mar 5, 2020) · Domain Adaptation, Knowledge Distillation
- Knowledge Distillation-based Information Sharing for Online Process Monitoring in Decentralized Manufacturing System (Feb 8, 2023) · Knowledge Distillation
- In Teacher We Trust: Learning Compressed Models for Pedestrian Detection (Dec 1, 2016) · Knowledge Distillation, Pedestrian Detection
- Collective Knowledge Graph Completion with Mutual Knowledge Distillation (May 25, 2023) · Knowledge Distillation, Knowledge Graph Completion
- Integrating Arithmetic Learning Improves Mathematical Reasoning in Smaller Models (Feb 18, 2025) · Data Augmentation, GSM8K
- Efficient Intent-Based Filtering for Multi-Party Conversations Using Knowledge Distillation from LLMs (Mar 21, 2025) · Intent Classification
- Integration of Pre-trained Networks with Continuous Token Interface for End-to-End Spoken Language Understanding (Apr 15, 2021) · Intent Classification
- A Survey on Green Deep Learning (Nov 8, 2021) · Deep Learning, Knowledge Distillation
- Efficient Inference via Universal LSH Kernel (Jun 21, 2021) · Knowledge Distillation, Quantization
- Efficient Image Compression Using Advanced State Space Models (Sep 4, 2024) · Computational Efficiency, Image Compression
- Interactive Multi-fidelity Learning for Cost-effective Adaptation of Language Model with Sparse Human Supervision (Oct 31, 2023) · Informativeness, Knowledge Distillation
- Inter-KD: Intermediate Knowledge Distillation for CTC-Based Automatic Speech Recognition (Nov 28, 2022) · Automatic Speech Recognition (ASR)
- Intermediate Distillation: Data-Efficient Distillation from Black-Box LLMs for Information Retrieval (Jun 18, 2024) · Information Retrieval, Knowledge Distillation
- Interpretable discovery of new semiconductors with machine learning (Jan 12, 2021) · BIG-bench Machine Learning, Knowledge Distillation
- Distillation-Enabled Knowledge Alignment for Generative Semantic Communications in AIGC Provisioning Tasks (Jun 24, 2025) · Knowledge Distillation, Semantic Communication
- Knowledge Distillation based Ensemble Learning for Neural Machine Translation (Jan 1, 2021) · Ensemble Learning, Knowledge Distillation
- Interpretable Traces, Unexpected Outcomes: Investigating the Disconnect in Trace-Based Knowledge Distillation (May 20, 2025) · Information Retrieval, Knowledge Distillation
- Efficient Hybrid Language Model Compression through Group-Aware SSM Pruning (Apr 15, 2025) · Knowledge Distillation, Language Modeling
- Bring the Power of Diffusion Model to Defect Detection (Aug 25, 2024) · Defect Detection, Denoising
- Efficient Gravitational Wave Parameter Estimation via Knowledge Distillation: A ResNet1D-IAF Approach (Dec 11, 2024) · Astronomy, Computational Efficiency
- Interruption-Aware Cooperative Perception for V2X Communication-Aided Autonomous Driving (Apr 24, 2023) · Autonomous Driving, Autonomous Vehicles
- CoLLD: Contrastive Layer-to-layer Distillation for Compressing Multilingual Pre-trained Speech Encoders (Sep 14, 2023) · Contrastive Learning, Knowledge Distillation
- Efficient Federated Learning for AIoT Applications Using Knowledge Distillation (Nov 29, 2021) · Federated Learning, Knowledge Distillation
- Collaborative Teacher-Student Learning via Multiple Knowledge Transfer (Jan 21, 2021) · Knowledge Distillation, Model Compression
- A survey on efficient vision transformers: algorithms, techniques, and performance benchmarking (Sep 5, 2023) · Benchmarking, Knowledge Distillation
- Efficient Evaluation-Time Uncertainty Estimation by Improved Distillation (Jun 12, 2019) · Knowledge Distillation
- Collaborative Multi-Teacher Knowledge Distillation for Learning Low Bit-width Deep Neural Networks (Oct 27, 2022) · Knowledge Distillation, Quantization