Knowledge Distillation of Domain-adapted LLMs for Question-Answering in Telecom · Apr 28, 2025 · Domain Adaptation, Knowledge Distillation
Knowledge Distillation of LLM for Automatic Scoring of Science Education Assessments · Dec 26, 2023 · Knowledge Distillation, Mathematical Reasoning
Knowledge Distillation of Transformer-based Language Models Revisited · Jun 29, 2022 · GPU, Knowledge Distillation
Knowledge Distillation on Graphs: A Survey · Feb 1, 2023 · Knowledge Distillation, Model Compression
Knowledge Distillation on Spatial-Temporal Graph Convolutional Network for Traffic Prediction · Jan 22, 2024 · Graph Neural Network, Knowledge Distillation
Knowledge Distillation to Ensemble Global and Interpretable Prototype-Based Mammogram Classification Models · Sep 26, 2022 · Diversity, Knowledge Distillation
Knowledge Distillation Transfer Sets and their Impact on Downstream NLU Tasks · Oct 10, 2022 · Domain Classification, Intent Classification
Knowledge Distillation Under Ideal Joint Classifier Assumption · Apr 19, 2023 · Domain Adaptation, Knowledge Distillation
Knowledge Distillation Using Frontier Open-source LLMs: Generalizability and the Role of Synthetic Data · Oct 24, 2024 · Knowledge Distillation, Natural Language Understanding
Knowledge distillation using unlabeled mismatched images · Mar 21, 2017 · General Classification, Image Classification
Knowledge distillation via adaptive instance normalization · Mar 9, 2020 · Knowledge Distillation, Model Compression
Knowledge Distillation via Instance-level Sequence Learning · Jun 21, 2021 · General Knowledge, Knowledge Distillation
Knowledge Distillation via Query Selection for Detection Transformer · Sep 10, 2024 · Knowledge Distillation, Object Detection
Knowledge distillation via softmax regression representation learning · Jan 1, 2021 · Knowledge Distillation, Model Compression
Knowledge Distillation via Token-level Relationship Graph · Jun 20, 2023 · Knowledge Distillation, Transfer Learning
Knowledge Distillation via Weighted Ensemble of Teaching Assistants · Jun 23, 2022 · Ensemble Learning, Knowledge Distillation
Knowledge Distillation vs. Pretraining from Scratch under a Fixed (Computation) Budget · Apr 30, 2024 · Knowledge Distillation, Language Modeling
Knowledge distillation with a class-aware loss for endoscopic disease detection · Jul 19, 2022 · Diagnostic, Knowledge Distillation
Knowledge Distillation with Adapted Weight · Jan 6, 2025 · 4k, Fairness
Knowledge Distillation with Adaptive Asymmetric Label Sharpening for Semi-supervised Fracture Detection in Chest X-rays · Dec 30, 2020 · Fracture Detection, Knowledge Distillation
Knowledge Distillation with BERT for Image Tag-Based Privacy Prediction · Sep 1, 2021 · Knowledge Distillation, TAG
Knowledge distillation with error-correcting transfer learning for wind power prediction · Apr 1, 2022 · Knowledge Distillation, Transfer Learning
Knowledge Distillation with Feature Maps for Image Classification · Dec 3, 2018 · Classification, General Classification
Knowledge Distillation with Multi-granularity Mixture of Priors for Image Super-Resolution · Apr 3, 2024 · Image Super-Resolution, Knowledge Distillation
Knowledge Distillation with Noisy Labels for Natural Language Understanding · Sep 21, 2021 · Knowledge Distillation, Natural Language Understanding
Representative Teacher Keys for Knowledge Distillation Model Compression Based on Attention Mechanism for Image Classification · Jun 26, 2022 · GPU, Image Classification
Knowledge distillation with Segment Anything (SAM) model for Planetary Geological Mapping · May 12, 2023 · Decoder, Image Segmentation
Knowledge Distillation with Training Wheels · Feb 24, 2025 · Knowledge Distillation, Language Modeling
Knowledge-Distilled Graph Neural Networks for Personalized Epileptic Seizure Detection · Apr 3, 2023 · Channel Selection, EEG
EA-KD: Entropy-based Adaptive Knowledge Distillation · Nov 22, 2023 · Image Classification
Knowledge Consistency between Neural Networks and Beyond · Aug 5, 2019 · Knowledge Distillation
Knowledge Migration Framework for Smart Contract Vulnerability Detection · Dec 15, 2024 · Data-free Knowledge Distillation, Knowledge Distillation
Knowledge Representing: Efficient, Sparse Representation of Prior Knowledge for Knowledge Distillation · Nov 13, 2019 · Image Classification, Knowledge Distillation
Knowledge-Spreader: Learning Semi-Supervised Facial Action Dynamics by Consistifying Knowledge Granularity · Jan 1, 2023 · Knowledge Distillation
Knowledge Squeezed Adversarial Network Compression · Apr 10, 2019 · Knowledge Distillation, Transfer Learning
Knowledge Transfer with Visual Prompt in multi-modal Dialogue Understanding and Generation · Oct 1, 2022 · Dialogue Understanding, Knowledge Distillation
KnowRU: Knowledge Reusing via Knowledge Distillation in Multi-agent Reinforcement Learning · Mar 27, 2021 · Deep Reinforcement Learning, Knowledge Distillation
KnowSR: Knowledge Sharing among Homogeneous Agents in Multi-agent Reinforcement Learning · May 25, 2021 · Deep Reinforcement Learning, Knowledge Distillation
Know your tools well: Better and faster QA with synthetic examples · Oct 16, 2021 · Diversity, Knowledge Distillation
KOALA: Empirical Lessons Toward Memory-Efficient and Fast Diffusion Models for Text-to-Image Synthesis · Dec 7, 2023 · Denoising, Image Generation
KoGNER: A Novel Framework for Knowledge Graph Distillation on Biomedical Named Entity Recognition · Mar 19, 2025 · Knowledge Distillation, Knowledge Graphs
KroneckerBERT: Learning Kronecker Decomposition for Pre-trained Language Models via Knowledge Distillation · Sep 13, 2021 · Knowledge Distillation, Language Modeling
KroneckerBERT: Significant Compression of Pre-trained Language Models Through Kronecker Decomposition and Knowledge Distillation · Jul 1, 2022 · Knowledge Distillation, Language Modeling
Kronecker Decomposition for GPT Compression · Oct 15, 2021 · Knowledge Distillation, Language Modeling
KTAN: Knowledge Transfer Adversarial Network · Oct 18, 2018 · Image Classification
Label Assignment Distillation for Object Detection · Sep 16, 2021 · Knowledge Distillation, Object
Label Augmentation via Time-based Knowledge Distillation for Financial Anomaly Detection · Jan 5, 2021 · Anomaly Detection, Knowledge Distillation
Label-Context-Dependent Internal Language Model Estimation for CTC · Jun 6, 2025 · Knowledge Distillation, Language Modeling
Label Denoising with Large Ensembles of Heterogeneous Neural Networks · Sep 12, 2018 · Data Augmentation, Denoising
Label driven Knowledge Distillation for Federated Learning with non-IID Data · Sep 29, 2022 · Federated Learning, Knowledge Distillation