- GOLD: Generalized Knowledge Distillation via Out-of-Distribution-Guided Language Data Generation (Mar 28, 2024). Tags: Data-free Knowledge Distillation, Knowledge Distillation
- GOVERN: Gradient Orientation Vote Ensemble for Multi-Teacher Reinforced Distillation (May 6, 2024). Tags: Knowledge Distillation, Question Answering
- Gradient Adversarial Training of Neural Networks (Jun 21, 2018). Tags: BIG-bench Machine Learning, Binary Classification
- Gradient-Guided Knowledge Distillation for Object Detectors (Mar 7, 2023). Tags: Knowledge Distillation, Object
- Gradient Reweighting: Towards Imbalanced Class-Incremental Learning (Feb 28, 2024). Tags: Class-Incremental Learning
- Granite Embedding Models (Feb 27, 2025). Tags: Information Retrieval, Knowledge Distillation
- Graph-Adaptive Pruning for Efficient Inference of Convolutional Neural Networks (Nov 21, 2018). Tags: Knowledge Distillation, Model Compression
- Graph-Based Cross-Domain Knowledge Distillation for Cross-Dataset Text-to-Image Person Retrieval (Jan 25, 2025). Tags: Domain Adaptation, Knowledge Distillation
- Graph Representation Learning via Multi-task Knowledge Distillation (Nov 11, 2019). Tags: Graph Representation Learning, Knowledge Distillation
- AKE-GNN: Effective Graph Learning with Adaptive Knowledge Exchange (Jun 10, 2021). Tags: Classification, Graph Classification
- GripRank: Bridging the Gap between Retrieval and Generation via the Generative Knowledge Improved Passage Ranking (May 29, 2023). Tags: Answer Generation, Dialogue Generation
- Ground-V: Teaching VLMs to Ground Complex Instructions in Pixels (May 20, 2025). Tags: Instruction Following, Knowledge Distillation
- Group channel pruning and spatial attention distilling for object detection (Jun 2, 2023). Tags: Knowledge Distillation, Model Compression
- Group Distributionally Robust Knowledge Distillation (Nov 1, 2023). Tags: Knowledge Distillation
- Grouped Knowledge Distillation for Deep Face Recognition (Apr 10, 2023). Tags: Face Recognition, Knowledge Distillation
- Group-Mix SAM: Lightweight Solution for Industrial Assembly Line Applications (Mar 15, 2024). Tags: Knowledge Distillation
- Growing Deep Neural Network Considering with Similarity between Neurons (Aug 23, 2024). Tags: Decision Making, Knowledge Distillation
- GTCOM Neural Machine Translation Systems for WMT19 (Aug 1, 2019). Tags: Knowledge Distillation, Language Modeling
- Guided Deep Metric Learning (Jun 4, 2022). Tags: Few-Shot Learning, Knowledge Distillation
- Guiding CTC Posterior Spike Timings for Improved Posterior Fusion and Knowledge Distillation (Apr 17, 2019). Tags: Automatic Speech Recognition (ASR)
- Guiding Teacher Forcing with Seer Forcing for Neural Machine Translation (Jun 12, 2021). Tags: Decoder, Knowledge Distillation
- GVP: Generative Volumetric Primitives (Mar 31, 2023). Tags: Image Generation, Knowledge Distillation
- Handling Long-tailed Feature Distribution in AdderNets (Dec 1, 2021). Tags: Knowledge Distillation
- Hands-on Guidance for Distilling Object Detectors (Mar 26, 2021). Tags: Knowledge Distillation, Object
- HanjaBridge: Resolving Semantic Ambiguity in Korean LLMs via Hanja-Augmented Pre-Training (Jul 15, 2025). Tags: Cross-Lingual Transfer, Knowledge Distillation
- Hard Gate Knowledge Distillation -- Leverage Calibration for Robust and Reliable Language Model (Oct 22, 2022). Tags: Knowledge Distillation, Language Modeling
- HARD: Hard Augmentations for Robust Distillation (May 24, 2023). Tags: Data Augmentation, Domain Generalization
- Harmonizing knowledge Transfer in Neural Network with Unified Distillation (Sep 27, 2024). Tags: Knowledge Distillation, Transfer Learning
- Harnessing Increased Client Participation with Cohort-Parallel Federated Learning (May 24, 2024). Tags: Federated Learning, Image Classification
- Spectral Maps for Learning on Subgraphs (May 30, 2022). Tags: Graph Learning, Knowledge Distillation
- hdl2v: A Code Translation Dataset for Enhanced LLM Verilog Generation (Jun 5, 2025). Tags: Code Generation, Code Translation
- Headache to Overstock? Promoting Long-tail Items through Debiased Product Bundling (Nov 28, 2024). Tags: Knowledge Distillation, Navigate
- Head-Tail-Aware KL Divergence in Knowledge Distillation for Spiking Neural Networks (Apr 29, 2025). Tags: Knowledge Distillation, Transfer Learning
- Hearing Lips: Improving Lip Reading by Distilling Speech Recognizers (Nov 26, 2019). Tags: Knowledge Distillation, Lipreading
- HEAT: Hardware-Efficient Automatic Tensor Decomposition for Transformer Compression (Nov 30, 2022). Tags: Efficient Exploration, Knowledge Distillation
- HeteFedRec: Federated Recommender Systems with Model Heterogeneity (Jul 24, 2023). Tags: Knowledge Distillation, Model
- Heterogeneity-aware Personalized Federated Learning via Adaptive Dual-Agent Reinforcement Learning (Jan 28, 2025). Tags: Federated Learning, Knowledge Distillation
- Heterogeneous-Branch Collaborative Learning for Dialogue Generation (Mar 21, 2023). Tags: Attribute, Dialogue Generation
- Heterogeneous Continual Learning (Jun 14, 2023). Tags: Continual Learning, Knowledge Distillation
- Heterogeneous Federated Knowledge Graph Embedding Learning and Unlearning (Feb 4, 2023). Tags: Federated Learning, Graph Embedding
- Heterogeneous Federated Learning Using Knowledge Codistillation (Oct 4, 2023). Tags: Federated Learning, Image Classification
- Heterogeneous Generative Knowledge Distillation with Masked Image Modeling (Sep 18, 2023). Tags: Image Classification
- HFedCKD: Toward Robust Heterogeneous Federated Learning via Data-free Knowledge Distillation and Two-way Contrast (Mar 9, 2025). Tags: Data-free Knowledge Distillation, Federated Learning
- Hierarchical Knowledge Distillation for Dialogue Sequence Labeling (Nov 22, 2021). Tags: Knowledge Distillation, Scene Segmentation
- Hierarchical Knowledge Distillation on Text Graph for Data-limited Attribute Inference (Jan 10, 2024). Tags: Attribute, Few-Shot Learning
- Hierarchical Selective Classification (May 19, 2024). Tags: Classification, Knowledge Distillation
- Hierarchical Transformer-based Large-Context End-to-end ASR with Large-Context Knowledge Distillation (Feb 16, 2021). Tags: Automatic Speech Recognition (ASR)
- High-dimensional Analysis of Knowledge Distillation: Weak-to-Strong Generalization and Scaling Laws (Oct 24, 2024). Tags: Knowledge Distillation, Regression
- High-Fidelity Pseudo-label Generation by Large Language Models for Training Robust Radiology Report Classifiers (May 3, 2025). Tags: Diagnostic, Knowledge Distillation
- Highly Constrained Coded Aperture Imaging Systems Design Via a Knowledge Distillation Approach (Jun 25, 2024). Tags: Image Reconstruction, Knowledge Distillation