Just KIDDIN: Knowledge Infusion and Distillation for Detection of INdecent Memes | Nov 19, 2024 | Knowledge Distillation, Knowledge Graphs
K-AID: Enhancing Pre-trained Language Models with Domain Knowledge for Question Answering | Sep 22, 2021 | CPU, Knowledge Distillation
KAT-V1: Kwai-AutoThink Technical Report | Jul 11, 2025 | Knowledge Distillation, Large Language Model
KD^2M: An unifying framework for feature knowledge distillation | Apr 2, 2025 | Knowledge Distillation
KDC-MAE: Knowledge Distilled Contrastive Mask Auto-Encoder | Nov 19, 2024 | Contrastive Learning, Knowledge Distillation
KDCTime: Knowledge Distillation with Calibration on InceptionTime for Time-series Classification | Dec 4, 2021 | Knowledge Distillation, Time Series
KD-DETR: Knowledge Distillation for Detection Transformer with Consistent Distillation Points Sampling | Jan 1, 2024 | General Knowledge, Knowledge Distillation
KD-DLGAN: Data Limited Image Generation via Knowledge Distillation | Mar 30, 2023 | Diversity, Image Generation
KDExplainer: A Task-oriented Attention Model for Explaining Knowledge Distillation | May 10, 2021 | Knowledge Distillation, Mixture-of-Experts
KD-FixMatch: Knowledge Distillation Siamese Neural Networks | Sep 11, 2023 | Knowledge Distillation
KDGAN: Knowledge Distillation with Generative Adversarial Networks | Dec 1, 2018 | Knowledge Distillation, Multi-Label Learning
KDH-MLTC: Knowledge Distillation for Healthcare Multi-Label Text Classification | May 12, 2025 | Classification, Hyperparameter Optimization
KDk: A Defense Mechanism Against Label Inference Attacks in Vertical Federated Learning | Apr 18, 2024 | Federated Learning, Knowledge Distillation
KDLSQ-BERT: A Quantized Bert Combining Knowledge Distillation with Learned Step Size Quantization | Jan 15, 2021 | Knowledge Distillation, Language Modelling
KDRL: Post-Training Reasoning LLMs via Unified Knowledge Distillation and Reinforcement Learning | Jun 2, 2025 | Knowledge Distillation, Large Language Model
KDSM: An uplift modeling framework based on knowledge distillation and sample matching | Mar 6, 2023 | counterfactual, Knowledge Distillation
KDSTM: Neural Semi-supervised Topic Modeling with Knowledge Distillation | Jul 4, 2023 | Classification, Knowledge Distillation
KD-VLP: Improving End-to-End Vision-and-Language Pretraining with Object Knowledge Distillation | Jan 16, 2022 | cross-modal alignment, Knowledge Distillation
Keep Decoding Parallel with Effective Knowledge Distillation from Language Models to End-to-end Speech Recognisers | Jan 22, 2024 | Automatic Speech Recognition (ASR)
Kendall's τ Coefficient for Logits Distillation | Sep 26, 2024 | Knowledge Distillation
Kernel Based Progressive Distillation for Adder Neural Networks | Sep 28, 2020 | Knowledge Distillation
Kernel Methods in Hyperbolic Spaces | Jan 1, 2021 | Few-Shot Learning, image-classification
KEYword based Sampling (KEYS) for Large Language Models | May 30, 2023 | Knowledge Distillation, Language Modeling
KGEx: Explaining Knowledge Graph Embeddings via Subgraph Sampling and Knowledge Distillation | Oct 2, 2023 | Knowledge Distillation, Knowledge Graph Embeddings
Enhancing CLIP Conceptual Embedding through Knowledge Distillation | Dec 4, 2024 | Contrastive Learning, Knowledge Distillation
KnFu: Effective Knowledge Fusion | Mar 18, 2024 | Federated Learning, Knowledge Distillation
KNIFE: Distilling Reasoning Knowledge From Free-Text Rationales | Dec 19, 2022 | Knowledge Distillation, Language Modelling
Knowledge Adaptation for Efficient Semantic Segmentation | Mar 12, 2019 | Knowledge Distillation, Segmentation
Knowledge Adaptation: Teaching to Adapt | Feb 7, 2017 | Domain Adaptation, Knowledge Distillation
Knowledge as Priors: Cross-Modal Knowledge Generalization for Datasets without Superior Knowledge | Apr 1, 2020 | 3D Hand Pose Estimation, Hand Pose Estimation
Knowledge Concentration: Learning 100K Object Classifiers in a Single CNN | Nov 21, 2017 | General Classification, image-classification
Knowledge Cross-Distillation for Membership Privacy | Nov 2, 2021 | Inference Attack, Knowledge Distillation
Knowledge Distillation and Data Selection for Semi-Supervised Learning in CTC Acoustic Models | Aug 10, 2020 | Knowledge Distillation, speech-recognition
Knowledge Distillation and Dataset Distillation of Large Language Models: Emerging Trends, Challenges, and Future Directions | Apr 20, 2025 | Dataset Distillation, Diversity
Knowledge Distillation and Enhanced Subdomain Adaptation Using Graph Convolutional Network for Resource-Constrained Bearing Fault Diagnosis | Jan 13, 2025 | Diagnostic, Fault Diagnosis
Knowledge Distillation Applied to Optical Channel Equalization: Solving the Parallelization Problem of Recurrent Connection | Dec 8, 2022 | Knowledge Distillation
Knowledge Distillation ≈ Label Smoothing: Fact or Fallacy? | Jan 30, 2023 | Knowledge Distillation, text-classification
Knowledge Distillation as Self-Supervised Learning | Jan 17, 2022 | Knowledge Distillation, Self-Supervised Learning
Knowledge Distillation: A Survey | Jun 9, 2020 | Knowledge Distillation, Model Compression
Knowledge Distillation: Bad Models Can Be Good Role Models | Mar 28, 2022 | Knowledge Distillation, Learning Theory
Knowledge Distillation based Contextual Relevance Matching for E-commerce Product Search | Oct 4, 2022 | Knowledge Distillation
Knowledge Distillation based Ensemble Learning for Neural Machine Translation | Jan 1, 2021 | Ensemble Learning, Knowledge Distillation
Knowledge Distillation-based Information Sharing for Online Process Monitoring in Decentralized Manufacturing System | Feb 8, 2023 | Knowledge Distillation
Knowledge Distillation Based Semantic Communications For Multiple Users | Nov 23, 2023 | Decoder, Knowledge Distillation
Knowledge Distillation Beyond Model Compression | Jul 3, 2020 | Knowledge Distillation, model
Knowledge Distillation Circumvents Nonlinearity for Optical Convolutional Neural Networks | Feb 26, 2021 | Computational Efficiency, Knowledge Distillation
Knowledge Distillation-Empowered Digital Twin for Anomaly Detection | Sep 8, 2023 | Anomaly Detection, Knowledge Distillation
Knowledge Distillation for 6D Pose Estimation by Aligning Distributions of Local Predictions | May 30, 2022 | 6D Pose Estimation, 6D Pose Estimation using RGB
Knowledge Distillation for Action Anticipation via Label Smoothing | Apr 16, 2020 | Action Anticipation, Autonomous Driving
Knowledge Distillation for Adaptive MRI Prostate Segmentation Based on Limit-Trained Multi-Teacher Models | Mar 16, 2023 | Knowledge Distillation, MRI segmentation