A Gift From Knowledge Distillation: Fast Optimization, Network Minimization and Transfer Learning (Jul 1, 2017) | Knowledge Distillation, Transfer Learning
A Good Student is Cooperative and Reliable: CNN-Transformer Collaborative Learning for Semantic Segmentation (Jul 24, 2023) | Knowledge Distillation, Semantic Segmentation
AI can evolve without labels: self-evolving vision transformer for chest X-ray diagnosis through knowledge distillation (Feb 13, 2022) | Deep Learning, Diagnostic
AIDE: Agentically Improve Visual Language Model with Domain Experts (Feb 13, 2025) | Knowledge Distillation, Language Modeling
AI-KD: Adversarial learning and Implicit regularization for self-Knowledge Distillation (Nov 20, 2022) | Knowledge Distillation, Self-Knowledge Distillation
AirNet: Neural Network Transmission over the Air (May 24, 2021) | Knowledge Distillation
A Joint Sequential and Relational Model for Frame-Semantic Parsing (Sep 1, 2017) | Knowledge Distillation, Machine Translation
AKD: Adversarial Knowledge Distillation For Large Language Models Alignment on Coding tasks (May 5, 2025) | Code Completion, Code Generation
A Knowledge Distillation Approach for Sepsis Outcome Prediction from Multivariate Clinical Time Series (Nov 16, 2023) | Knowledge Distillation, Time Series
A Knowledge Distillation-Based Backdoor Attack in Federated Learning (Aug 12, 2022) | Backdoor Attack, Federated Learning
A Knowledge Distillation framework for Multi-Organ Segmentation of Medaka Fish in Tomographic Image (Feb 24, 2023) | Computed Tomography (CT), Image Segmentation
A Light-weight Deep Learning Model for Remote Sensing Image Classification (Feb 25, 2023) | Image Classification
A Lightweight Domain Adversarial Neural Network Based on Knowledge Distillation for EEG-based Cross-subject Emotion Recognition (May 12, 2023) | Electroencephalogram (EEG)
A Lightweight Low-Light Image Enhancement Network via Channel Prior and Gamma Correction (Feb 28, 2024) | Image Enhancement, Knowledge Distillation
A lightweight network for photovoltaic cell defect detection in electroluminescence images based on neural architecture search and knowledge distillation (Feb 15, 2023) | Data Augmentation, Defect Detection
AligNART: Non-autoregressive Neural Machine Translation by Jointly Learning to Estimate Alignment and Translate (Sep 14, 2021) | Decoder, Knowledge Distillation
AlignCap: Aligning Speech Emotion Captioning to Human Preferences (Oct 24, 2024) | Knowledge Distillation, Language Modeling
Aligned Weight Regularizers for Pruning Pretrained Neural Networks (Nov 16, 2021) | Knowledge Distillation, Language Modeling
Aligning in a Compact Space: Contrastive Knowledge Distillation between Heterogeneous Architectures (May 28, 2024) | Contrastive Learning, Knowledge Distillation
Aligning Teacher with Student Preferences for Tailored Training Data Generation (Jun 27, 2024) | In-Context Learning, Knowledge Distillation
Alignment Knowledge Distillation for Online Streaming Attention-based Speech Recognition (Feb 28, 2021) | Automatic Speech Recognition (ASR)
Alleviating Catastrophic Forgetting of Incremental Object Detection via Within-Class and Between-Class Knowledge Distillation (Jan 1, 2023) | Knowledge Distillation, Object Detection
Alleviating LLM-based Generative Retrieval Hallucination in Alipay Search (Mar 27, 2025) | Hallucination, Knowledge Distillation
All You Need in Knowledge Distillation Is a Tailored Coordinate System (Dec 12, 2024) | Few-Shot Learning
ALP-KD: Attention-Based Layer Projection for Knowledge Distillation (Dec 27, 2020) | Knowledge Distillation
Always Strengthen Your Strengths: A Drift-Aware Incremental Learning Framework for CTR Prediction (Apr 17, 2023) | Click-Through Rate Prediction, Diversity
AMD: Adaptive Masked Distillation for Object Detection (Jan 31, 2023) | Knowledge Distillation, Model Compression
AMD: Automatic Multi-step Distillation of Large-scale Vision Models (Jul 5, 2024) | Image Classification
A method for estimating forest carbon storage distribution density via artificial intelligence generated content model (Feb 2, 2025) | Knowledge Distillation
A metric learning approach for endoscopic kidney stone identification (Jul 13, 2023) | Few-Shot Learning, Knowledge Distillation
AMLN: Adversarial-based Mutual Learning Network for Online Knowledge Distillation (Aug 1, 2020) | Knowledge Distillation, Transfer Learning
Amortized Noisy Channel Neural Machine Translation (Dec 16, 2021) | Imitation Learning, Knowledge Distillation
AMTSS: An Adaptive Multi-Teacher Single-Student Knowledge Distillation Framework For Multilingual Language Inference (May 13, 2023) | Knowledge Distillation
An Active Learning Framework for Inclusive Generation by Large Language Models (Oct 17, 2024) | Active Learning, Clustering
Analyzing Compression Techniques for Computer Vision (May 14, 2023) | Knowledge Distillation, Quantization
Analyzing Knowledge Distillation in Neural Machine Translation (Oct 1, 2018) | Knowledge Distillation, Machine Translation
Analyzing the Importance of Blank for CTC-Based Knowledge Distillation (Jun 2, 2025) | Automatic Speech Recognition, Knowledge Distillation
An Effective Deep Network for Head Pose Estimation without Keypoints (Oct 25, 2022) | Gaze Estimation, Head Pose Estimation
An Efficient Active Learning Pipeline for Legal Text Classification (Nov 15, 2022) | Active Learning, Classification
An Efficient Detection and Control System for Underwater Docking using Machine Learning and Realistic Simulation: A Comprehensive Approach (Nov 2, 2023) | Generative Adversarial Network, Image-to-Image Translation
An Efficient Federated Distillation Learning System for Multi-task Time Series Classification (Dec 30, 2021) | Knowledge Distillation, Time Series
An Efficient Method of Training Small Models for Regression Problems with Knowledge Distillation (Feb 28, 2020) | Knowledge Distillation, Memorization
An Efficient Private GPT Never Autoregressively Decodes (May 21, 2025) | Knowledge Distillation
An Empirical Analysis of the Impact of Data Augmentation on Knowledge Distillation (Jun 6, 2020) | Data Augmentation, Knowledge Distillation
An Empirical Investigation into the Effect of Parameter Choices in Knowledge Distillation (Jan 12, 2024) | Knowledge Distillation
An Empirical Study of Efficient ASR Rescoring with Transformers (Oct 24, 2019) | Knowledge Distillation, Language Modeling
An Empirical Study of Leveraging Knowledge Distillation for Compressing Multilingual Neural Machine Translation Models (Apr 19, 2023) | Knowledge Distillation, Machine Translation
An Empirical Study of Uniform-Architecture Knowledge Distillation in Document Ranking (Feb 8, 2023) | Document Ranking, Knowledge Distillation
An Enhanced Low-Resolution Image Recognition Method for Traffic Environments (Sep 28, 2023) | Computational Efficiency, Knowledge Distillation
An Ensemble of Knowledge Sharing Models for Dynamic Hand Gesture Recognition (Aug 13, 2020) | Gesture Recognition, Hand Gesture Recognition