Trace-of-Thought Prompting: Investigating Prompt-Based Knowledge Distillation Through Question Decomposition · Apr 29, 2025 · GSM8K, Knowledge Distillation
Training an LLM-as-a-Judge Model: Pipeline, Insights, and Practical Lessons · Feb 5, 2025 · Instruction Following, Knowledge Distillation
Training Domain Draft Models for Speculative Decoding: Best Practices and Insights · Mar 10, 2025 · Knowledge Distillation
Training Self-localization Models for Unseen Unfamiliar Places via Teacher-to-Student Data-Free Knowledge Transfer · Mar 13, 2024 · Continual Learning, Image Retrieval
Training Shallow and Thin Networks for Acceleration via Knowledge Distillation with Conditional Adversarial Networks · Sep 2, 2017 · General Classification, Knowledge Distillation
Adversarial Speaker Distillation for Countermeasure Model on Automatic Speaker Verification · Mar 31, 2022 · Knowledge Distillation, Speaker Verification
TransFair: Transferring Fairness from Ocular Disease Classification to Progression Prediction · Nov 24, 2024 · Classification, Fairness
Transferable Deployment of Semantic Edge Inference Systems via Unsupervised Domain Adaption · Apr 16, 2025 · Decoder, Domain Adaptation
Transfer Learning with Pre-trained Conditional Generative Models · Apr 27, 2022 · Knowledge Distillation, Transfer Learning
Transferring Knowledge from Structure-aware Self-attention Language Model to Sequence-to-Sequence Semantic Parsing · Jan 16, 2022 · Code Generation, Knowledge Distillation
Transferring Learning Trajectories of Neural Networks · May 23, 2023 · Knowledge Distillation
Transformer-based ASR Incorporating Time-reduction Layer and Fine-tuning with Self-Knowledge Distillation · Mar 17, 2021 · Automatic Speech Recognition (ASR)
Transformer-Based Fault-Tolerant Control for Fixed-Wing UAVs Using Knowledge Distillation and In-Context Adaptation · Nov 5, 2024 · Fault Detection, In-Context Learning
Transforming In-Vehicle Network Intrusion Detection: VAE-based Knowledge Distillation Meets Explainable AI · Oct 11, 2024 · Autonomous Vehicles, Intrusion Detection
TransformMix: Learning Transformation and Mixing Strategies from Data · Mar 19, 2024 · Data Augmentation, Knowledge Distillation
Translate-Distill: Learning Cross-Language Dense Retrieval by Translation and Distillation · Jan 9, 2024 · Information Retrieval, Knowledge Distillation
Tree Knowledge Distillation for Compressing Transformer-Based Language Models · Jan 16, 2022 · Knowledge Distillation
Tree-Like Decision Distillation · Jun 19, 2021 · Decision Making, Knowledge Distillation
TriDeNT: Triple Deep Network Training for Privileged Knowledge Distillation in Histopathology · Dec 4, 2023 · Knowledge Distillation
Trigger is Not Sufficient: Exploiting Frame-aware Knowledge for Implicit Event Argument Extraction · Aug 1, 2021 · Event Argument Extraction, Knowledge Distillation
TRILLsson: Distilled Universal Paralinguistic Speech Representations · Mar 1, 2022 · Emotion Recognition, Knowledge Distillation
Tripartite Weight-Space Ensemble for Few-Shot Class-Incremental Learning · Jan 1, 2025 · Class-Incremental Learning
TripLe: Revisiting Pretrained Model Reuse and Progressive Learning for Efficient Vision Transformer Scaling and Searching · Jan 1, 2023 · Knowledge Distillation, Neural Architecture Search
Triplet Knowledge Distillation · May 25, 2023 · Face Recognition, Image Classification
Triple-View Knowledge Distillation for Semi-Supervised Semantic Segmentation · Sep 22, 2023 · Decoder, Feature Importance
TrustAL: Trustworthy Active Learning using Knowledge Distillation · Jan 26, 2022 · Active Learning, Diversity
TSAK: Two-Stage Semantic-Aware Knowledge Distillation for Efficient Wearable Modality and Model Optimization in Manufacturing Lines · Aug 26, 2024 · Activity Recognition, Human Activity Recognition
TS-HTFA: Advancing Time Series Forecasting via Hierarchical Text-Free Alignment with Large Language Models · Sep 23, 2024 · Contrastive Learning, Cross-Modal Alignment
TT-MPD: Test Time Model Pruning and Distillation · Dec 10, 2024 · Knowledge Distillation, Model
TTT-KD: Test-Time Training for 3D Semantic Segmentation through Knowledge Distillation from Foundation Models · Mar 18, 2024 · 3D Semantic Segmentation, Knowledge Distillation
Turbo2K: Towards Ultra-Efficient and High-Quality 2K Video Synthesis · Apr 20, 2025 · 2K, Knowledge Distillation
TutorNet: Towards Flexible Knowledge Distillation for End-to-End Speech Recognition · Aug 3, 2020 · Knowledge Distillation, Model Compression
Twin Network Augmentation: A Novel Training Strategy for Improved Spiking Neural Networks and Efficient Weight Quantization · Sep 24, 2024 · Knowledge Distillation, Quantization
Two-in-one Knowledge Distillation for Efficient Facial Forgery Detection · Feb 21, 2023 · Knowledge Distillation, Vocal Bursts Valence Prediction
Two-Pass End-to-End ASR Model Compression · Jan 8, 2022 · Decoder, Knowledge Distillation
Two-Stage Multi-task Self-Supervised Learning for Medical Image Segmentation · Feb 11, 2024 · Auxiliary Learning, Image Segmentation
Two-Step Knowledge Distillation for Tiny Speech Enhancement · Sep 15, 2023 · Knowledge Distillation, Model Compression
UB-FineNet: Urban Building Fine-grained Classification Network for Open-access Satellite Images · Mar 4, 2024 · Classification, Denoising
Multi-trial Neural Architecture Search with Lottery Tickets · Mar 8, 2022 · Knowledge Distillation, Neural Architecture Search
UIFormer: A Unified Transformer-based Framework for Incremental Few-Shot Object Detection and Instance Segmentation · Nov 13, 2024 · Decoder, Few-Shot Object Detection
UKD: Debiasing Conversion Rate Estimation via Uncertainty-regularized Knowledge Distillation · Jan 20, 2022 · Knowledge Distillation, Selection Bias
U-Know-DiffPAN: An Uncertainty-aware Knowledge Distillation Diffusion Framework with Details Enhancement for PAN-Sharpening · Dec 9, 2024 · Knowledge Distillation
Ultrafast Video Attention Prediction with Coupled Knowledge Distillation · Apr 9, 2019 · CPU, GPU
Uncertainty-Aware Cross-Modal Knowledge Distillation with Prototype Learning for Multimodal Brain-Computer Interfaces · Jul 17, 2025 · EEG, Knowledge Distillation
Uncertainty-Aware Knowledge Distillation for Compact and Efficient 6DoF Pose Estimation · Mar 17, 2025 · Autonomous Navigation, Knowledge Distillation
Uncertainty-Aware Multi-Expert Knowledge Distillation for Imbalanced Disease Grading · May 1, 2025 · Knowledge Distillation, Transfer Learning
Uncertainty-Aware Multi-Shot Knowledge Distillation for Image-Based Object Re-Identification · Jan 15, 2020 · Knowledge Distillation, Object
Uncertainty-Guided Never-Ending Learning to Drive · Jan 1, 2024 · Autonomous Driving, Continual Learning
Understanding Adversarial Attacks on Autoencoders · Jan 1, 2021 · Compressive Sensing, Knowledge Distillation