Students taught by multimodal teachers are superior action recognizers (Oct 9, 2022) - Action Recognition, Knowledge Distillation
Students Who Study Together Learn Better: On the Importance of Collective Knowledge Distillation for Domain Transfer in Fact Verification (Nov 1, 2021) - Fact Verification, Knowledge Distillation
Study of Encoder-Decoder Architectures for Code-Mix Search Query Translation (Aug 7, 2022) - Data Augmentation, Decoder
Style over Substance: Distilled Language Models Reason Via Stylistic Replication (Apr 2, 2025) - Knowledge Distillation
Sub-Band Knowledge Distillation Framework for Speech Enhancement (May 29, 2020) - Knowledge Distillation, Speech Enhancement
Subclass Knowledge Distillation with Known Subclass Labels (Jul 17, 2022) - Binary Classification, Knowledge Distillation
Sub-Graph Learning for Spatiotemporal Forecasting via Knowledge Distillation (Nov 17, 2022) - Diversity, Graph Learning
SUGAR: Pre-training 3D Visual Representations for Robotics (Apr 1, 2024) - 3D Instance Segmentation, 3D Object Recognition
Supervised Graph Contrastive Pretraining for Text Classification (Dec 21, 2021) - Classification, Contrastive Learning
Supervision Complexity and its Role in Knowledge Distillation (Jan 28, 2023) - Image Classification
Supporting Cross-language Cross-project Bug Localization Using Pre-trained Language Models (Jul 3, 2024) - Contrastive Learning, CPU
Knowledge Distillation in Federated Edge Learning: A Survey (Jan 14, 2023) - Knowledge Distillation, Survey
Survey on Knowledge Distillation for Large Language Models: Methods, Evaluation, and Application (Jul 2, 2024) - Knowledge Distillation, Survey
Swing Distillation: A Privacy-Preserving Knowledge Distillation Framework (Dec 16, 2022) - Knowledge Distillation, Model Compression
SWITCH: Studying with Teacher for Knowledge Distillation of Large Language Models (Oct 25, 2024) - Instruction Following, Knowledge Distillation
Synergic Adversarial Label Learning for Grading Retinal Diseases via Knowledge Distillation and Multi-task Learning (Mar 24, 2020) - Classification, General Classification
Synergistic Effects of Knowledge Distillation and Structured Pruning for Self-Supervised Speech Models (Feb 9, 2025) - Knowledge Distillation, Model Compression
Syntactic Structure Distillation Pretraining For Bidirectional Encoders (May 27, 2020) - Knowledge Distillation, Language Modeling
Synthetic Image Learning: Preserving Performance and Preventing Membership Inference Attacks (Jul 22, 2024) - Knowledge Distillation
Synthetic Unknown Class Learning for Learning Unknowns (Nov 15, 2021) - Diversity, Knowledge Distillation
TAID: Temporally Adaptive Interpolated Distillation for Efficient Knowledge Transfer in Language Models (Jan 28, 2025) - Knowledge Distillation, Model Compression
Tailored Federated Learning: Leveraging Direction Regulation & Knowledge Distillation (Sep 29, 2024) - Federated Learning, Knowledge Distillation
Take a Prior from Other Tasks for Severe Blur Removal (Feb 14, 2023) - Deblurring, Image Deblurring
TalkingMachines: Real-Time Audio-Driven FaceTime-Style Video via Autoregressive Diffusion Models (Jun 3, 2025) - Decoder, Knowledge Distillation
Talking Models: Distill Pre-trained Knowledge to Downstream Models via Interactive Communication (Oct 4, 2023) - Decoder, Knowledge Distillation
Target-driven Self-Distillation for Partial Observed Trajectories Forecasting (Jan 28, 2025) - Autonomous Driving, Knowledge Distillation
Targeted Forgetting of Image Subgroups in CLIP Models (Jan 1, 2025) - Knowledge Distillation, Unsupervised Pre-training
TAS: Distilling Arbitrary Teacher and Student via a Hybrid Assistant (Oct 16, 2024) - Knowledge Distillation, Transfer Learning
Task-Attentive Transformer Architecture for Continual Learning of Vision-and-Language Tasks Using Knowledge Distillation (Mar 25, 2023) - Continual Learning, Knowledge Distillation
Task-Balanced Distillation for Object Detection (Aug 5, 2022) - Classification, Knowledge Distillation
TASKED: Transformer-based Adversarial learning for human activity recognition using wearable sensors via Self-KnowledgE Distillation (Sep 14, 2022) - Activity Recognition, Human Activity Recognition
Task Integration Distillation for Object Detectors (Apr 2, 2024) - Knowledge Distillation, Object
Task-Specific Knowledge Distillation from the Vision Foundation Model for Enhanced Medical Image Segmentation (Mar 10, 2025) - Image Segmentation, Knowledge Distillation
Teacher's pet: understanding and mitigating biases in distillation (Jun 19, 2021) - Image Classification
Teacher-Student Architecture for Knowledge Learning: A Survey (Oct 28, 2022) - Knowledge Distillation, Multi-Task Learning
Teacher-Student Architecture for Knowledge Distillation: A Survey (Aug 8, 2023) - Knowledge Distillation, Regression
Teacher-Student chain for efficient semi-supervised histology image classification (Mar 17, 2020) - Classification, General Classification
Teacher-Student Knowledge Distillation for Radar Perception on Embedded Accelerators (Mar 14, 2023) - Knowledge Distillation, Object Detection
Distilled Siamese Networks for Visual Tracking (Jul 24, 2019) - Knowledge Distillation, Object Tracking
Teacher-Student Training and Triplet Loss for Facial Expression Recognition under Occlusion (Aug 3, 2020) - Facial Expression Recognition (FER)
Teacher-Student Training and Triplet Loss to Reduce the Effect of Drastic Face Occlusion (Nov 20, 2021) - Age Estimation, Facial Expression Recognition
Teacher-Student Training for Robust Tacotron-based TTS (Nov 7, 2019) - Decoder, Knowledge Distillation
Teaching-Assistant-in-the-Loop: Improving Knowledge Distillation from Imperfect Teacher Models in Low-Budget Scenarios (Jun 8, 2024) - Knowledge Distillation
"Teaching Independent Parts Separately" (TIPSy-GAN): Improving Accuracy and Stability in Unsupervised Adversarial 2D to 3D Pose Estimation (May 12, 2022) - 3D Human Pose Estimation, 3D Pose Estimation
Teaching MLP More Graph Information: A Three-stage Multitask Knowledge Distillation Framework (Mar 2, 2024) - Knowledge Distillation
Teaching pathology foundation models to accurately predict gene expression with parameter efficient knowledge transfer (Apr 9, 2025) - Knowledge Distillation, Parameter-Efficient Fine-Tuning
Teaching Small Language Models to Reason (Dec 16, 2022) - GSM8K, Knowledge Distillation
Teaching with Uncertainty: Unleashing the Potential of Knowledge Distillation in Object Detection (Jun 11, 2024) - Knowledge Distillation, Object Detection
Teach me with a Whisper: Enhancing Large Language Models for Analyzing Spoken Transcripts using Speech Embeddings (Nov 13, 2023) - Knowledge Distillation, Language Modeling
Teach model to answer questions after comprehending the document (Jul 18, 2023) - Knowledge Distillation, Machine Reading Comprehension