High Performance Natural Language Processing | Nov 1, 2020 | Knowledge Distillation, Quantization
Hint-dynamic Knowledge Distillation | Nov 30, 2022 | Knowledge Distillation
HIRE: Distilling High-order Relational Knowledge From Heterogeneous Graph Neural Networks | Jul 25, 2022 | Knowledge Distillation, Vocal Bursts Intensity Prediction
HKD4VLM: A Progressive Hybrid Knowledge Distillation Framework for Robust Multimodal Hallucination and Factuality Detection in VLMs | Jun 16, 2025 | Hallucination, Knowledge Distillation
Holistic Approach to Measure Sample-level Adversarial Vulnerability and its Utility in Building Trustworthy Systems | May 5, 2022 | Adversarial Attack, Knowledge Distillation
HomoDistil: Homotopic Task-Agnostic Distillation of Pre-trained Transformers | Feb 19, 2023 | Knowledge Distillation, Model Compression
Homogenizing Non-IID datasets via In-Distribution Knowledge Distillation for Decentralized Learning | Apr 9, 2023 | Image Classification
HoverFast: an accurate, high-throughput, clinically deployable nuclear segmentation tool for brightfield digital pathology images | May 22, 2024 | GPU, Knowledge Distillation
How and When Adversarial Robustness Transfers in Knowledge Distillation? | Oct 22, 2021 | Adversarial Robustness, Knowledge Distillation
How Does Distilled Data Complexity Impact the Quality and Confidence of Non-Autoregressive Machine Translation? | May 27, 2021 | Diversity, Knowledge Distillation
How many Observations are Enough? Knowledge Distillation for Trajectory Forecasting | Mar 9, 2022 | Knowledge Distillation, Trajectory Forecasting
How Redundant Is the Transformer Stack in Speech Representation Models? | Sep 10, 2024 | Knowledge Distillation, Speaker Identification
How to Backdoor the Knowledge Distillation | Apr 30, 2025 | Knowledge Distillation
How to Prune Your Language Model: Recovering Accuracy on the "Sparsity May Cry" Benchmark | Dec 21, 2023 | Knowledge Distillation, Language Modeling
How to Select One Among All? An Empirical Study Towards the Robustness of Knowledge Distillation in Natural Language Understanding | Nov 1, 2021 | Adversarial Robustness, All
HRPose: Real-Time High-Resolution 6D Pose Estimation Network Using Knowledge Distillation | Apr 20, 2022 | 6D Pose Estimation, 6D Pose Estimation using RGB
Human-Centered Prior-Guided and Task-Dependent Multi-Task Representation Learning for Action Recognition Pre-Training | Apr 27, 2022 | Action Recognition, Contrastive Learning
Human Insights Driven Latent Space for Different Driving Perspectives: A Unified Encoder for Efficient Multi-Task Inference | Sep 16, 2024 | Autonomous Driving, Knowledge Distillation
Human in the Latent Loop (HILL): Interactively Guiding Model Training Through Human Intuition | May 9, 2025 | Knowledge Distillation
Human-Like Active Learning: Machines Simulating the Human Learning Process | Nov 7, 2020 | Active Learning, Form
HW-TSC's Participation in the WMT 2020 News Translation Shared Task | Nov 1, 2020 | Knowledge Distillation, Translation
HW-TSC's Participation in the WMT 2021 Large-Scale Multilingual Translation Task | Nov 1, 2021 | Knowledge Distillation, Translation
HW-TSC's Participation in the WMT 2021 News Translation Shared Task | Nov 1, 2021 | de-en, Knowledge Distillation
Hybrid Paradigm-based Brain-Computer Interface for Robotic Arm Control | Dec 14, 2022 | Brain Computer Interface, EEG
HYDRA-FL: Hybrid Knowledge Distillation for Robust and Accurate Federated Learning | Sep 30, 2024 | Federated Learning, Knowledge Distillation
HyperINR: A Fast and Predictive Hypernetwork for Implicit Neural Representations via Knowledge Distillation | Apr 9, 2023 | Knowledge Distillation, Novel View Synthesis
Hyperspectral Image Analysis in Single-Modal and Multimodal setting using Deep Learning Techniques | Mar 3, 2024 | Dimensionality Reduction, Hyperspectral image analysis
I2CKD: Intra- and Inter-Class Knowledge Distillation for Semantic Segmentation | Mar 27, 2024 | Knowledge Distillation, Segmentation
I2D2: Inductive Knowledge Distillation with NeuroLogic and Self-Imitation | Dec 19, 2022 | Imitation Learning, Knowledge Distillation
I²KD-SLU: An Intra-Inter Knowledge Distillation Framework for Zero-Shot Cross-Lingual Spoken Language Understanding | Oct 4, 2023 | Intent Detection, Knowledge Distillation
IAG: Induction-Augmented Generation Framework for Answering Reasoning Questions | Nov 30, 2023 | Knowledge Distillation, RAG
ICD-Face: Intra-class Compactness Distillation for Face Recognition | Jan 1, 2023 | Face Recognition, Knowledge Distillation
Cross-resolution Face Recognition via Identity-Preserving Network and Knowledge Distillation | Mar 15, 2023 | Face Recognition, Knowledge Distillation
If At First You Don't Succeed: Test Time Re-ranking for Zero-shot, Cross-domain Retrieval | Mar 30, 2023 | Image Retrieval, Knowledge Distillation
IIE's Neural Machine Translation Systems for WMT20 | Nov 1, 2020 | Domain Adaptation, Knowledge Distillation
IKD+: Reliable Low Complexity Deep Models For Retinopathy Classification | Mar 4, 2023 | Classification, Knowledge Distillation
IL-NeRF: Incremental Learning for Neural Radiance Fields with Camera Pose Alignment | Dec 10, 2023 | Incremental Learning, Knowledge Distillation
Image Restoration using Feature-guidance | Jan 1, 2022 | Image Restoration, Knowledge Distillation
Image-to-Video Re-Identification via Mutual Discriminative Knowledge Transfer | Jan 21, 2022 | Knowledge Distillation, Transfer Learning
Attention-based Knowledge Distillation in Multi-attention Tasks: The Impact of a DCT-driven Loss | May 4, 2022 | Descriptive, Knowledge Distillation
Implicit Word Reordering with Knowledge Distillation for Cross-Lingual Dependency Parsing | Feb 24, 2025 | Cross-Lingual Transfer, Dependency Parsing
Impossible Triangle: What's Next for Pre-trained Language Models? | Apr 13, 2022 | Data Augmentation, Few-Shot Learning
Improved Cross-Lingual Transfer Learning For Automatic Speech Translation | Jun 1, 2023 | Automatic Speech Translation, Cross-Lingual Transfer
Improved Customer Transaction Classification using Semi-Supervised Knowledge Distillation | Feb 15, 2021 | Classification, General Classification
Improved implicit diffusion model with knowledge distillation to estimate the spatial distribution density of carbon stock in remote sensing imagery | Nov 27, 2024 | Knowledge Distillation
Improved knowledge distillation by utilizing backward pass knowledge in neural networks | Jan 27, 2023 | Knowledge Distillation, Model Compression
Improved Knowledge Distillation for Pre-trained Language Models via Knowledge Selection | Feb 1, 2023 | Knowledge Distillation
Improved Knowledge Distillation via Adversarial Collaboration | Nov 29, 2021 | Knowledge Distillation
Improved Methods for Model Pruning and Knowledge Distillation | May 20, 2025 | Knowledge Distillation
Improved Synthetic Training for Reading Comprehension | Oct 24, 2020 | Knowledge Distillation, Machine Reading Comprehension