Adapting While Learning: Grounding LLMs for Scientific Problems with Intelligent Tool Usage Adaptation | Nov 1, 2024 | Epidemiology, Knowledge Distillation
Adaptive Affinity-Based Generalization For MRI Imaging Segmentation Across Resource-Limited Settings | Apr 3, 2024 | Data Integration, Knowledge Distillation
Adaptive Beam Search to Enhance On-device Abstractive Summarization | Dec 22, 2021 | Abstractive Text Summarization, Knowledge Distillation
Adaptive Deep Iris Feature Extractor at Arbitrary Resolutions | Jul 11, 2024 | Iris Recognition, Knowledge Distillation
Adaptive Explicit Knowledge Transfer for Knowledge Distillation | Sep 3, 2024 | Knowledge Distillation, Transfer Learning
Adaptive Group Robust Ensemble Knowledge Distillation | Nov 22, 2024 | Knowledge Distillation
Adaptive Instance Distillation for Object Detection in Autonomous Driving | Jan 26, 2022 | Autonomous Driving, Knowledge Distillation
Adaptive Knowledge Distillation between Text and Speech Pre-trained Models | Mar 7, 2023 | Knowledge Distillation, Spoken Language Understanding
Adaptive Knowledge Distillation for Classification of Hand Images using Explainable Vision Transformers | Aug 20, 2024 | Knowledge Distillation
Adaptive Label Smoothing with Self-Knowledge | Sep 29, 2021 | Knowledge Distillation, Machine Translation
Adaptive Label Smoothing with Self-Knowledge in Natural Language Generation | Oct 22, 2022 | Knowledge Distillation, Text Generation
Adaptively Integrated Knowledge Distillation and Prediction Uncertainty for Continual Learning | Jan 18, 2023 | Continual Learning, Knowledge Distillation
Adaptive Multiplane Image Generation from a Single Internet Picture | Nov 26, 2020 | Depth Estimation, Image Generation
Adaptive Regularization of Labels | Aug 15, 2019 | Data Augmentation, Knowledge Distillation
Add a SideNet to your MainNet | Jul 14, 2020 | General Classification, Knowledge Distillation
Addressing Bias Through Ensemble Learning and Regularized Fine-Tuning | Feb 1, 2024 | Ensemble Learning, Knowledge Distillation
A Deep Hierarchical Feature Sparse Framework for Occluded Person Re-Identification | Jan 15, 2024 | Data Augmentation, Knowledge Distillation
A deep Natural Language Inference predictor without language-specific training data | Sep 6, 2023 | Aspect-Based Sentiment Analysis, Knowledge Distillation
A Deep Reinforcement Learning Framework for Rapid Diagnosis of Whole Slide Pathological Images | May 5, 2022 | Deep Reinforcement Learning, Knowledge Distillation
A Dimensional Structure based Knowledge Distillation Method for Cross-Modal Learning | Jun 28, 2023 | Knowledge Distillation
ADINet: Attribute Driven Incremental Network for Retinal Image Classification | Jun 1, 2020 | Attribute Classification
A distillation based approach for the diagnosis of diseases | Aug 7, 2021 | Knowledge Distillation
ADMP: An Adversarial Double Masks Based Pruning Framework For Unsupervised Cross-Domain Compression | Jun 7, 2020 | Domain Adaptation, Knowledge Distillation
ADROIT: A Self-Supervised Framework for Learning Robust Representations for Active Learning | Mar 10, 2025 | Active Learning, Knowledge Distillation
ADU: Adaptive Detection of Unknown Categories in Black-Box Domain Adaptation | Jan 1, 2025 | Domain Adaptation, Knowledge Distillation
ADU-Depth: Attention-based Distillation with Uncertainty Modeling for Depth Estimation | Sep 26, 2023 | 3D geometry, Depth Estimation
Advancing Deep Learning through Probability Engineering: A Pragmatic Paradigm for Modern AI | Mar 19, 2025 | Deep Learning, Federated Learning
Advancing Medical Radiograph Representation Learning: A Hybrid Pre-training Paradigm with Multilevel Semantic Granularity | Oct 1, 2024 | Decoder, Knowledge Distillation
Adversarial-Based Knowledge Distillation for Multi-Model Ensemble and Noisy Data Refinement | Aug 22, 2019 | Knowledge Distillation, Missing Labels
Adversarial Curriculum Graph-Free Knowledge Distillation for Graph Neural Networks | Apr 1, 2025 | Data-free Knowledge Distillation, Knowledge Distillation
Adversarial Feature Alignment: Avoid Catastrophic Forgetting in Incremental Task Lifelong Learning | Oct 24, 2019 | Continual Learning, image-classification
Adversarial Finetuning with Latent Representation Constraint to Mitigate Accuracy-Robustness Tradeoff | Aug 31, 2023 | Knowledge Distillation
Adversarially Robust and Explainable Model Compression with On-Device Personalization for Text Classification | Jan 10, 2021 | Adversarial Robustness, General Classification
Adversarial Prompt Distillation for Vision-Language Models | Nov 22, 2024 | Adversarial Robustness, Autonomous Driving
Adversarial Robustness of Distilled and Pruned Deep Learning-based Wireless Classifiers | Apr 11, 2024 | Adversarial Robustness, Knowledge Distillation
Adversarial Self-Supervised Data-Free Distillation for Text Classification | Oct 10, 2020 | Classification, General Classification
Adversarial Sparse Teacher: Defense Against Distillation-Based Model Stealing Attacks Using Adversarial Examples | Mar 8, 2024 | Knowledge Distillation
Adverse Weather Optical Flow: Cumulative Homogeneous-Heterogeneous Adaptation | Sep 25, 2024 | Domain Adaptation, Knowledge Distillation
A dynamic interactive learning framework for automated 3D medical image segmentation | Dec 11, 2023 | Image Registration, Image Segmentation
A Flexible Multi-Task Model for BERT Serving | Nov 16, 2021 | Knowledge Distillation, model
Discovery of novel antimicrobial peptides with notable antibacterial potency by a LLM-based foundation model | Jul 17, 2024 | Knowledge Distillation, scientific discovery
A Framework for Double-Blind Federated Adaptation of Foundation Models | Feb 3, 2025 | Federated Learning, image-classification
AfroXLMR-Comet: Multilingual Knowledge Distillation with Attention Matching for Low-Resource languages | Feb 25, 2025 | Knowledge Distillation, Language Modeling
After-Stroke Arm Paresis Detection using Kinematic Data | Nov 3, 2023 | Action Classification, Knowledge Distillation
A Generalized and Robust Method Towards Practical Gaze Estimation on Smart Phone | Oct 16, 2019 | Gaze Estimation, Knowledge Distillation
Generalized Supervised Contrastive Learning | Jun 1, 2022 | Contrastive Learning, Knowledge Distillation
A General Multiple Data Augmentation Based Framework for Training Deep Neural Networks | May 29, 2022 | Data Augmentation, image-classification
A Generative Framework for Personalized Learning and Estimation: Theory, Algorithms, and Privacy | Jul 5, 2022 | Federated Learning, Knowledge Distillation
AgentDistill: Training-Free Agent Distillation with Generalizable MCP Boxes | Jun 17, 2025 | Knowledge Distillation, Transfer Learning
Agglomerating Large Vision Encoders via Distillation for VFSS Segmentation | Apr 3, 2025 | Image Segmentation, Knowledge Distillation