TrustAL: Trustworthy Active Learning using Knowledge Distillation (Jan 26, 2022) - Active Learning, Diversity
TSAK: Two-Stage Semantic-Aware Knowledge Distillation for Efficient Wearable Modality and Model Optimization in Manufacturing Lines (Aug 26, 2024) - Activity Recognition, Human Activity Recognition
TS-HTFA: Advancing Time Series Forecasting via Hierarchical Text-Free Alignment with Large Language Models (Sep 23, 2024) - Contrastive Learning, Cross-Modal Alignment
TT-MPD: Test Time Model Pruning and Distillation (Dec 10, 2024) - Knowledge Distillation, Model
TTT-KD: Test-Time Training for 3D Semantic Segmentation through Knowledge Distillation from Foundation Models (Mar 18, 2024) - 3D Semantic Segmentation, Knowledge Distillation
Turbo2K: Towards Ultra-Efficient and High-Quality 2K Video Synthesis (Apr 20, 2025) - 2K, Knowledge Distillation
TutorNet: Towards Flexible Knowledge Distillation for End-to-End Speech Recognition (Aug 3, 2020) - Knowledge Distillation, Model Compression
Twin Network Augmentation: A Novel Training Strategy for Improved Spiking Neural Networks and Efficient Weight Quantization (Sep 24, 2024) - Knowledge Distillation, Quantization
Two-in-one Knowledge Distillation for Efficient Facial Forgery Detection (Feb 21, 2023) - Knowledge Distillation, Vocal Bursts Valence Prediction
Two-Pass End-to-End ASR Model Compression (Jan 8, 2022) - Decoder, Knowledge Distillation
Two-Stage Multi-task Self-Supervised Learning for Medical Image Segmentation (Feb 11, 2024) - Auxiliary Learning, Image Segmentation
Two-Step Knowledge Distillation for Tiny Speech Enhancement (Sep 15, 2023) - Knowledge Distillation, Model Compression
UB-FineNet: Urban Building Fine-grained Classification Network for Open-access Satellite Images (Mar 4, 2024) - Classification, Denoising
Multi-trial Neural Architecture Search with Lottery Tickets (Mar 8, 2022) - Knowledge Distillation, Neural Architecture Search
UIFormer: A Unified Transformer-based Framework for Incremental Few-Shot Object Detection and Instance Segmentation (Nov 13, 2024) - Decoder, Few-Shot Object Detection
UKD: Debiasing Conversion Rate Estimation via Uncertainty-regularized Knowledge Distillation (Jan 20, 2022) - Knowledge Distillation, Selection Bias
U-Know-DiffPAN: An Uncertainty-aware Knowledge Distillation Diffusion Framework with Details Enhancement for PAN-Sharpening (Dec 9, 2024) - Knowledge Distillation
Ultrafast Video Attention Prediction with Coupled Knowledge Distillation (Apr 9, 2019) - CPU, GPU
Uncertainty-Aware Cross-Modal Knowledge Distillation with Prototype Learning for Multimodal Brain-Computer Interfaces (Jul 17, 2025) - EEG, Knowledge Distillation
Uncertainty-Aware Knowledge Distillation for Compact and Efficient 6DoF Pose Estimation (Mar 17, 2025) - Autonomous Navigation, Knowledge Distillation
Uncertainty-Aware Multi-Expert Knowledge Distillation for Imbalanced Disease Grading (May 1, 2025) - Knowledge Distillation, Transfer Learning
Uncertainty-Aware Multi-Shot Knowledge Distillation for Image-Based Object Re-Identification (Jan 15, 2020) - Knowledge Distillation, Object
Uncertainty-Guided Never-Ending Learning to Drive (Jan 1, 2024) - Autonomous Driving, Continual Learning
Understanding Adversarial Attacks on Autoencoders (Jan 1, 2021) - Compressive Sensing, Knowledge Distillation
Understanding and Improving Knowledge Distillation (Feb 10, 2020) - Knowledge Distillation, Model Compression
Understanding and Improving Lexical Choice in Non-Autoregressive Translation (Dec 29, 2020) - Knowledge Distillation, Translation
Understanding Knowledge Distillation (Jan 1, 2021) - Knowledge Distillation
Understanding Knowledge Distillation in Non-autoregressive Machine Translation (Nov 7, 2019) - Knowledge Distillation, Machine Translation
Understanding the Effect of Data Augmentation on Knowledge Distillation (May 21, 2023) - Data Augmentation, Knowledge Distillation
Understanding the Gains from Repeated Self-Distillation (Jul 5, 2024) - Knowledge Distillation, Regression
Understanding the Overfitting of the Episodic Meta-training (Jun 29, 2023) - Knowledge Distillation
Understanding the Success of Knowledge Distillation -- A Data Augmentation Perspective (Sep 29, 2021) - Active Learning, Data Augmentation
UNDO: Understanding Distillation as Optimization (Apr 3, 2025) - Knowledge Distillation
UniCompress: Enhancing Multi-Data Medical Image Compression with Knowledge Distillation (May 27, 2024) - Image Compression, Knowledge Distillation
UNIDEAL: Curriculum Knowledge Distillation Federated Learning (Sep 16, 2023) - Federated Learning, Knowledge Distillation
Unified and Effective Ensemble Knowledge Distillation (Apr 1, 2022) - Knowledge Distillation, Transfer Learning
Unified Anomaly Detection methods on Edge Device using Knowledge Distillation and Quantization (Jul 3, 2024) - Anomaly Detection, CPU
Unified Attacks to Large Language Model Watermarks: Spoofing and Scrubbing in Unauthorized Knowledge Distillation (Apr 24, 2025) - Knowledge Distillation, Language Modeling
Unified Locomotion Transformer with Simultaneous Sim-to-Real Transfer for Quadrupeds (Mar 12, 2025) - Deep Reinforcement Learning, Knowledge Distillation
UniKD: Universal Knowledge Distillation for Mimicking Homogeneous or Heterogeneous Object Detectors (Jan 1, 2023) - Knowledge Distillation
Unimodal-driven Distillation in Multimodal Emotion Recognition with Dynamic Fusion (Mar 31, 2025) - Emotion Recognition, Knowledge Distillation
UniMS: A Unified Framework for Multimodal Summarization with Knowledge Distillation (Sep 13, 2021) - Abstractive Text Summarization, Decoder
Uni-Retriever: Towards Learning The Unified Embedding Based Retriever in Bing Sponsored Search (Feb 13, 2022) - Contrastive Learning, Knowledge Distillation
Dual-mode ASR: Unify and Improve Streaming ASR with Full-context Modeling (Oct 12, 2020) - Automatic Speech Recognition (ASR)
Universal-KD: Attention-based Output-Grounded Intermediate Layer Knowledge Distillation (Nov 1, 2021) - Knowledge Distillation
Unlabeled Data Deployment for Classification of Diabetic Retinopathy Images Using Knowledge Transfer (Feb 9, 2020) - General Classification, Knowledge Distillation
Unlearning Clients, Features and Samples in Vertical Federated Learning (Jan 23, 2025) - Federated Learning, Inference Attack
Unlearning via Sparse Representations (Nov 26, 2023) - Knowledge Distillation
Unleashing the Potential of Mamba: Boosting a LiDAR 3D Sparse Detector by Using Cross-Model Knowledge Distillation (Sep 17, 2024) - 3D Object Detection, Autonomous Driving
Unlimited Knowledge Distillation for Action Recognition in the Dark (Aug 18, 2023) - Action Recognition, GPU
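Nearly every entry above builds on the same core mechanism: a student network trained to match a teacher's temperature-softened output distribution, usually blended with the ordinary hard-label loss. As a minimal sketch (pure Python, no framework; function names and default values are illustrative, following the classic Hinton-style formulation rather than any specific paper listed here):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax: higher T flattens the distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label,
                      temperature=4.0, alpha=0.5):
    """Soft-target KD loss: alpha * T^2 * KL(teacher || student) at
    temperature T, plus (1 - alpha) * hard cross-entropy at T = 1.
    The T^2 factor keeps soft-target gradients comparable across T."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    kl = sum(p * math.log(p / q) for p, q in zip(p_teacher, p_student))
    hard_ce = -math.log(softmax(student_logits)[true_label])
    return alpha * (temperature ** 2) * kl + (1 - alpha) * hard_ce
```

Most of the papers in the list vary this recipe along one axis: what is matched (logits, intermediate features, attention maps), how the teacher is chosen (ensembles, multi-expert, self-distillation), or how the soft targets are weighted (e.g. by uncertainty).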