Targeted Forgetting of Image Subgroups in CLIP Models | Jan 1, 2025 | Knowledge Distillation, Unsupervised Pre-training
TAS: Distilling Arbitrary Teacher and Student via a Hybrid Assistant | Oct 16, 2024 | Knowledge Distillation, Transfer Learning
Task-Attentive Transformer Architecture for Continual Learning of Vision-and-Language Tasks Using Knowledge Distillation | Mar 25, 2023 | Continual Learning, Knowledge Distillation
Task-Balanced Distillation for Object Detection | Aug 5, 2022 | Classification, Knowledge Distillation
TASKED: Transformer-based Adversarial learning for human activity recognition using wearable sensors via Self-KnowledgE Distillation | Sep 14, 2022 | Activity Recognition, Human Activity Recognition
Task Integration Distillation for Object Detectors | Apr 2, 2024 | Knowledge Distillation, Object
Task-Specific Knowledge Distillation from the Vision Foundation Model for Enhanced Medical Image Segmentation | Mar 10, 2025 | Image Segmentation, Knowledge Distillation
Teacher's pet: understanding and mitigating biases in distillation | Jun 19, 2021 | Image Classification
Teacher-Student Architecture for Knowledge Learning: A Survey | Oct 28, 2022 | Knowledge Distillation, Multi-Task Learning
Teacher-Student Architecture for Knowledge Distillation: A Survey | Aug 8, 2023 | Knowledge Distillation, Regression
Teacher-Student chain for efficient semi-supervised histology image classification | Mar 17, 2020 | Classification, General Classification
Teacher-Student Knowledge Distillation for Radar Perception on Embedded Accelerators | Mar 14, 2023 | Knowledge Distillation, Object Detection
Distilled Siamese Networks for Visual Tracking | Jul 24, 2019 | Knowledge Distillation, Object Tracking
Teacher-Student Training and Triplet Loss for Facial Expression Recognition under Occlusion | Aug 3, 2020 | Facial Expression Recognition (FER)
Teacher-Student Training and Triplet Loss to Reduce the Effect of Drastic Face Occlusion | Nov 20, 2021 | Age Estimation, Facial Expression Recognition
Teacher-Student Training for Robust Tacotron-based TTS | Nov 7, 2019 | Decoder, Knowledge Distillation
Teaching-Assistant-in-the-Loop: Improving Knowledge Distillation from Imperfect Teacher Models in Low-Budget Scenarios | Jun 8, 2024 | Knowledge Distillation
"Teaching Independent Parts Separately" (TIPSy-GAN): Improving Accuracy and Stability in Unsupervised Adversarial 2D to 3D Pose Estimation | May 12, 2022 | 3D Human Pose Estimation, 3D Pose Estimation
Teaching MLP More Graph Information: A Three-stage Multitask Knowledge Distillation Framework | Mar 2, 2024 | Knowledge Distillation
Teaching pathology foundation models to accurately predict gene expression with parameter efficient knowledge transfer | Apr 9, 2025 | Knowledge Distillation, Parameter-Efficient Fine-Tuning
Teaching Small Language Models to Reason | Dec 16, 2022 | GSM8K, Knowledge Distillation
Teaching with Uncertainty: Unleashing the Potential of Knowledge Distillation in Object Detection | Jun 11, 2024 | Knowledge Distillation, Object Detection
Teach me with a Whisper: Enhancing Large Language Models for Analyzing Spoken Transcripts using Speech Embeddings | Nov 13, 2023 | Knowledge Distillation, Language Modeling
Teach model to answer questions after comprehending the document | Jul 18, 2023 | Knowledge Distillation, Machine Reading Comprehension
Technical Report for ICCV 2021 Challenge SSLAD-Track3B: Transformers Are Better Continual Learners | Jan 13, 2022 | Continual Learning, Knowledge Distillation
Technical Report of Team GraphMIRAcles in the WikiKG90M-LSC Track of OGB-LSC @ KDD Cup 2021 | Jul 12, 2021 | Knowledge Distillation, Knowledge Graphs
Technical report on Conversational Question Answering | Sep 24, 2019 | Conversational Question Answering, Data Augmentation
Temporal Knowledge Distillation for On-device Audio Classification | Oct 27, 2021 | Audio Classification, Classification
Temporal Knowledge Distillation for Time-Sensitive Financial Services Applications | Dec 28, 2023 | Anomaly Detection, Fraud Detection
Temporal reasoning for timeline summarisation in social media | Dec 30, 2024 | Knowledge Distillation, Timeline Summarization
Temporal Separation with Entropy Regularization for Knowledge Distillation in Spiking Neural Networks | Mar 5, 2025 | Computational Efficiency, Knowledge Distillation
TenTrans Large-Scale Multilingual Machine Translation System for WMT21 | Nov 1, 2021 | Knowledge Distillation, Machine Translation
TernaryLLM: Ternarized Large Language Model | Jun 11, 2024 | Knowledge Distillation, Language Modeling
Test-Time Adaptation Toward Personalized Speech Enhancement: Zero-Shot Learning with Knowledge Distillation | May 8, 2021 | Denoising, Knowledge Distillation
Text is Text, No Matter What: Unifying Text Recognition using Knowledge Distillation | Jul 26, 2021 | Handwriting Recognition, HTR
The Best of Both Worlds: Accurate Global and Personalized Models through Federated Learning with Data-Free Hyper-Knowledge Distillation | Jan 21, 2023 | Federated Learning, Knowledge Distillation
The economic trade-offs of large language models: A case study | Jun 8, 2023 | Knowledge Distillation, Prompt Engineering
The Estimation of Continual Causal Effect for Dataset Shifting Streams | Apr 29, 2025 | Counterfactual, Knowledge Distillation
The Graph's Apprentice: Teaching an LLM Low Level Knowledge for Circuit Quality Estimation | Oct 30, 2024 | Knowledge Distillation
The LMU Munich System for the WMT 2021 Large-Scale Multilingual Machine Translation Shared Task | Nov 1, 2021 | Data Augmentation, Knowledge Distillation
The Microsoft Toolkit of Multi-Task Deep Neural Networks for Natural Language Understanding | Feb 19, 2020 | Knowledge Distillation, Multi-Task Learning
The Mininglamp Machine Translation System for WMT21 | Nov 1, 2021 | Knowledge Distillation, Machine Translation
The NiuTrans Machine Translation Systems for WMT19 | Aug 1, 2019 | Knowledge Distillation, Machine Translation
The NiuTrans Machine Translation Systems for WMT21 | Sep 22, 2021 | Knowledge Distillation, Machine Translation
The NiuTrans Machine Translation Systems for WMT20 | Nov 1, 2020 | Knowledge Distillation, Machine Translation
The NiuTrans System for the WMT 2021 Efficiency Task | Nov 1, 2021 | GPU, Knowledge Distillation
The NLP Cookbook: Modern Recipes for Transformer based Deep Learning Architectures | Mar 23, 2021 | Information Retrieval, Knowledge Distillation
Theoretical Guarantees for LT-TTD: A Unified Transformer-based Architecture for Two-Level Ranking Systems | May 7, 2025 | Computational Efficiency, Knowledge Distillation
The Privileged Students: On the Value of Initialization in Multilingual Knowledge Distillation | Jun 24, 2024 | Knowledge Distillation
The RoyalFlush System for the WMT 2022 Efficiency Task | Dec 3, 2022 | Decoder, GPU