Towards Lifelong Few-Shot Customization of Text-to-Image Diffusion (Nov 8, 2024) - Data-free Knowledge Distillation, Knowledge Distillation
Towards LogiGLUE: A Brief Survey and A Benchmark for Analyzing Logical Reasoning Capabilities of Language Models (Oct 2, 2023) - Knowledge Distillation, Language Modelling
Towards Long-Tailed Recognition for Graph Classification via Collaborative Experts (Aug 31, 2023) - Contrastive Learning, Graph Classification
Towards Making Deep Transfer Learning Never Hurt (Nov 18, 2019) - Knowledge Distillation
Towards Model Agnostic Federated Learning Using Knowledge Distillation (Oct 28, 2021) - Federated Learning, Knowledge Distillation
Towards Non-task-specific Distillation of BERT via Sentence Representation Approximation (Apr 7, 2020) - Knowledge Distillation, Sentence
Towards On-Board Panoptic Segmentation of Multispectral Satellite Images (Apr 5, 2022) - Knowledge Distillation, Panoptic Segmentation
Towards Optimal Trade-offs in Knowledge Distillation for CNNs and Vision Transformers at the Edge (Jun 25, 2024) - Knowledge Distillation
Towards Oracle Knowledge Distillation with Neural Architecture Search (Nov 29, 2019) - Image Classification
Towards Personalized Federated Learning via Comprehensive Knowledge Distillation (Nov 6, 2024) - Federated Learning, Knowledge Distillation
Towards Robust Classification with Image Quality Assessment (Apr 14, 2020) - Classification, General Classification
Towards Satellite Non-IID Imagery: A Spectral Clustering-Assisted Federated Learning Approach (Oct 17, 2024) - Earth Observation, Federated Learning
Towards Scalable and Generalizable Earth Observation Data Mining via Foundation Model Composition (Jun 25, 2025) - Earth Observation, Knowledge Distillation
Towards Scalable & Efficient Interaction-Aware Planning in Autonomous Vehicles using Knowledge Distillation (Apr 2, 2024) - Autonomous Vehicles, Decision Making
Towards Streaming Egocentric Action Anticipation (Oct 11, 2021) - Action Anticipation, Knowledge Distillation
SOCRATES: Text-based Human Search and Approach using a Robot Dog (Feb 10, 2023) - Knowledge Distillation
Towards Unconstrained 2D Pose Estimation of the Human Spine (Apr 10, 2025) - 2D Pose Estimation, Active Learning
Towards Understanding Ensemble, Knowledge Distillation and Self-Distillation in Deep Learning (Dec 17, 2020) - Deep Learning, Knowledge Distillation
Towards Understanding Knowledge Distillation (May 27, 2021) - Knowledge Distillation, Transfer Learning
Do we need Label Regularization to Fine-tune Pre-trained Language Models? (May 25, 2022) - Knowledge Distillation, Model Compression
Towards Unsupervised Crowd Counting via Regression-Detection Bi-knowledge Transfer (Aug 12, 2020) - Crowd Counting, Knowledge Distillation
Towards Vector Optimization on Low-Dimensional Vector Symbolic Architecture (Feb 19, 2025) - Knowledge Distillation
Towards Zero-Shot Knowledge Distillation for Natural Language Processing (Dec 31, 2020) - Knowledge Distillation, Model Compression
Toxicity Detection can be Sensitive to the Conversational Context (Nov 19, 2021) - Data Augmentation, Knowledge Distillation
Trace-of-Thought Prompting: Investigating Prompt-Based Knowledge Distillation Through Question Decomposition (Apr 29, 2025) - GSM8K, Knowledge Distillation
Training an LLM-as-a-Judge Model: Pipeline, Insights, and Practical Lessons (Feb 5, 2025) - Instruction Following, Knowledge Distillation
Training Domain Draft Models for Speculative Decoding: Best Practices and Insights (Mar 10, 2025) - Knowledge Distillation
Training Self-localization Models for Unseen Unfamiliar Places via Teacher-to-Student Data-Free Knowledge Transfer (Mar 13, 2024) - Continual Learning, Image Retrieval
Training Shallow and Thin Networks for Acceleration via Knowledge Distillation with Conditional Adversarial Networks (Sep 2, 2017) - General Classification, Knowledge Distillation
Adversarial Speaker Distillation for Countermeasure Model on Automatic Speaker Verification (Mar 31, 2022) - Knowledge Distillation, Speaker Verification
TransFair: Transferring Fairness from Ocular Disease Classification to Progression Prediction (Nov 24, 2024) - Classification, Fairness
Transferable Deployment of Semantic Edge Inference Systems via Unsupervised Domain Adaption (Apr 16, 2025) - Decoder, Domain Adaptation
Transfer Learning with Pre-trained Conditional Generative Models (Apr 27, 2022) - Knowledge Distillation, Transfer Learning
Transferring Knowledge from Structure-aware Self-attention Language Model to Sequence-to-Sequence Semantic Parsing (Jan 16, 2022) - Code Generation, Knowledge Distillation
Transferring Learning Trajectories of Neural Networks (May 23, 2023) - Knowledge Distillation
Transformer-based ASR Incorporating Time-reduction Layer and Fine-tuning with Self-Knowledge Distillation (Mar 17, 2021) - Automatic Speech Recognition (ASR)
Transformer-Based Fault-Tolerant Control for Fixed-Wing UAVs Using Knowledge Distillation and In-Context Adaptation (Nov 5, 2024) - Fault Detection, In-Context Learning
Transforming In-Vehicle Network Intrusion Detection: VAE-based Knowledge Distillation Meets Explainable AI (Oct 11, 2024) - Autonomous Vehicles, Intrusion Detection
TransformMix: Learning Transformation and Mixing Strategies from Data (Mar 19, 2024) - Data Augmentation, Knowledge Distillation
Translate-Distill: Learning Cross-Language Dense Retrieval by Translation and Distillation (Jan 9, 2024) - Information Retrieval, Knowledge Distillation
Tree Knowledge Distillation for Compressing Transformer-Based Language Models (Jan 16, 2022) - Knowledge Distillation
Tree-Like Decision Distillation (Jun 19, 2021) - Decision Making, Knowledge Distillation
TriDeNT: Triple Deep Network Training for Privileged Knowledge Distillation in Histopathology (Dec 4, 2023) - Knowledge Distillation
Trigger is Not Sufficient: Exploiting Frame-aware Knowledge for Implicit Event Argument Extraction (Aug 1, 2021) - Event Argument Extraction, Knowledge Distillation
TRILLsson: Distilled Universal Paralinguistic Speech Representations (Mar 1, 2022) - Emotion Recognition, Knowledge Distillation
Tripartite Weight-Space Ensemble for Few-Shot Class-Incremental Learning (Jan 1, 2025) - Class Incremental Learning
TripLe: Revisiting Pretrained Model Reuse and Progressive Learning for Efficient Vision Transformer Scaling and Searching (Jan 1, 2023) - Knowledge Distillation, Neural Architecture Search
Triplet Knowledge Distillation (May 25, 2023) - Face Recognition, Image Classification
Triple-View Knowledge Distillation for Semi-Supervised Semantic Segmentation (Sep 22, 2023) - Decoder, Feature Importance
— Unverified 0