High-Fidelity Pseudo-label Generation by Large Language Models for Training Robust Radiology Report Classifiers (May 3, 2025). Tags: Diagnostic, Knowledge Distillation
High-dimensional Analysis of Knowledge Distillation: Weak-to-Strong Generalization and Scaling Laws (Oct 24, 2024). Tags: Knowledge Distillation, Regression
Hierarchical Transformer-based Large-Context End-to-end ASR with Large-Context Knowledge Distillation (Feb 16, 2021). Tags: Automatic Speech Recognition (ASR)
Hierarchical Selective Classification (May 19, 2024). Tags: Classification, Knowledge Distillation
Hierarchical Knowledge Distillation on Text Graph for Data-limited Attribute Inference (Jan 10, 2024). Tags: Attribute, Few-Shot Learning
How Does Distilled Data Complexity Impact the Quality and Confidence of Non-Autoregressive Machine Translation? (May 27, 2021). Tags: Diversity, Knowledge Distillation
Hierarchical Knowledge Distillation for Dialogue Sequence Labeling (Nov 22, 2021). Tags: Knowledge Distillation, Scene Segmentation
Deep Collective Knowledge Distillation (Apr 18, 2023). Tags: Knowledge Distillation, Model Compression
A metric learning approach for endoscopic kidney stone identification (Jul 13, 2023). Tags: Few-Shot Learning, Knowledge Distillation
How to Backdoor the Knowledge Distillation (Apr 30, 2025). Tags: Knowledge Distillation
HFedCKD: Toward Robust Heterogeneous Federated Learning via Data-free Knowledge Distillation and Two-way Contrast (Mar 9, 2025). Tags: Data-free Knowledge Distillation, Federated Learning
Heterogeneous Generative Knowledge Distillation with Masked Image Modeling (Sep 18, 2023). Tags: Image Classification
Heterogeneous Federated Learning Using Knowledge Codistillation (Oct 4, 2023). Tags: Federated Learning, Image Classification
Heterogeneous Federated Knowledge Graph Embedding Learning and Unlearning (Feb 4, 2023). Tags: Federated Learning, Graph Embedding
Heterogeneous Continual Learning (Jun 14, 2023). Tags: Continual Learning, Knowledge Distillation
Heterogeneous-Branch Collaborative Learning for Dialogue Generation (Mar 21, 2023). Tags: Attribute, Dialogue Generation
A method for estimating forest carbon storage distribution density via artificial intelligence generated content model (Feb 2, 2025). Tags: Knowledge Distillation
Adaptive Multiplane Image Generation from a Single Internet Picture (Nov 26, 2020). Tags: Depth Estimation, Image Generation
A Closer Look at Rehearsal-Free Continual Learning (Mar 31, 2022). Tags: Continual Learning, Knowledge Distillation
Heterogeneity-aware Personalized Federated Learning via Adaptive Dual-Agent Reinforcement Learning (Jan 28, 2025). Tags: Federated Learning, Knowledge Distillation
HeteFedRec: Federated Recommender Systems with Model Heterogeneity (Jul 24, 2023). Tags: Knowledge Distillation, Model
Decoupling Dark Knowledge via Block-wise Logit Distillation for Feature-level Alignment (Nov 3, 2024). Tags: Knowledge Distillation, Philosophy
HEAT: Hardware-Efficient Automatic Tensor Decomposition for Transformer Compression (Nov 30, 2022). Tags: Efficient Exploration, Knowledge Distillation
Human in the Latent Loop (HILL): Interactively Guiding Model Training Through Human Intuition (May 9, 2025). Tags: Knowledge Distillation
Hearing Lips: Improving Lip Reading by Distilling Speech Recognizers (Nov 26, 2019). Tags: Knowledge Distillation, Lipreading
Decouple Non-parametric Knowledge Distillation For End-to-end Speech Translation (Apr 20, 2023). Tags: Knowledge Distillation, Machine Translation
HW-TSC's Participation in the WMT 2020 News Translation Shared Task (Nov 1, 2020). Tags: Knowledge Distillation, Translation
HW-TSC's Participation in the WMT 2021 Large-Scale Multilingual Translation Task (Nov 1, 2021). Tags: Knowledge Distillation, Translation
Head-Tail-Aware KL Divergence in Knowledge Distillation for Spiking Neural Networks (Apr 29, 2025). Tags: Knowledge Distillation, Transfer Learning
Decoupled Transformer for Scalable Inference in Open-domain Question Answering (Sep 1, 2021). Tags: Knowledge Distillation, Machine Reading Comprehension
Headache to Overstock? Promoting Long-tail Items through Debiased Product Bundling (Nov 28, 2024). Tags: Knowledge Distillation, Navigate
Biologically inspired structure learning with reverse knowledge distillation for spiking neural networks (Apr 19, 2023). Tags: Knowledge Distillation
AMD: Automatic Multi-step Distillation of Large-scale Vision Models (Jul 5, 2024). Tags: Image Classification
hdl2v: A Code Translation Dataset for Enhanced LLM Verilog Generation (Jun 5, 2025). Tags: Code Generation, Code Translation
Spectral Maps for Learning on Subgraphs (May 30, 2022). Tags: Graph Learning, Knowledge Distillation
Harnessing Increased Client Participation with Cohort-Parallel Federated Learning (May 24, 2024). Tags: Federated Learning, Image Classification
Harmonizing knowledge Transfer in Neural Network with Unified Distillation (Sep 27, 2024). Tags: Knowledge Distillation, Transfer Learning
HARD: Hard Augmentations for Robust Distillation (May 24, 2023). Tags: Data Augmentation, Domain Generalization
I2D2: Inductive Knowledge Distillation with NeuroLogic and Self-Imitation (Dec 19, 2022). Tags: Imitation Learning, Knowledge Distillation
I^2KD-SLU: An Intra-Inter Knowledge Distillation Framework for Zero-Shot Cross-Lingual Spoken Language Understanding (Oct 4, 2023). Tags: Intent Detection, Knowledge Distillation
Hard Gate Knowledge Distillation -- Leverage Calibration for Robust and Reliable Language Model (Oct 22, 2022). Tags: Knowledge Distillation, Language Modeling
IAG: Induction-Augmented Generation Framework for Answering Reasoning Questions (Nov 30, 2023). Tags: Knowledge Distillation, RAG
ICD-Face: Intra-class Compactness Distillation for Face Recognition (Jan 1, 2023). Tags: Face Recognition, Knowledge Distillation
BiM-VFI: Bidirectional Motion Field-Guided Frame Interpolation for Video with Non-uniform Motions (Jan 1, 2025). Tags: Knowledge Distillation, Motion Estimation
AMD: Adaptive Masked Distillation for Object Detection (Jan 31, 2023). Tags: Knowledge Distillation, Model Compression
HanjaBridge: Resolving Semantic Ambiguity in Korean LLMs via Hanja-Augmented Pre-Training (Jul 15, 2025). Tags: Cross-Lingual Transfer, Knowledge Distillation
If At First You Don't Succeed: Test Time Re-ranking for Zero-shot, Cross-domain Retrieval (Mar 30, 2023). Tags: Image Retrieval, Knowledge Distillation
Hands-on Guidance for Distilling Object Detectors (Mar 26, 2021). Tags: Knowledge Distillation, Object
Decoupled Alignment for Robust Plug-and-Play Adaptation (Jun 3, 2024). Tags: Knowledge Distillation