Toward Student-Oriented Teacher Network Training For Knowledge Distillation (Jun 14, 2022) [Data Augmentation, Knowledge Distillation]
Source and Target Bidirectional Knowledge Distillation for End-to-end Speech Translation (Apr 13, 2021) [Automatic Speech Recognition (ASR)]
Source-Target Unified Knowledge Distillation for Memory-Efficient Federated Domain Adaptation on Edge Devices (Sep 29, 2021) [Domain Adaptation, Knowledge Distillation]
Space-Time Distillation for Video Super-Resolution (Jun 19, 2021) [Knowledge Distillation, Super-Resolution]
Sparse Progressive Distillation: Resolving Overfitting under Pretrain-and-Finetune Paradigm (Oct 15, 2021) [Knowledge Distillation]
Spatial Knowledge Distillation to aid Visual Reasoning (Dec 10, 2018) [Diagnostic, Knowledge Distillation]
Spatial Likelihood Voting with Self-Knowledge Distillation for Weakly Supervised Object Detection (Apr 14, 2022) [Knowledge Distillation, Multiple Instance Learning]
Spatio-Temporal Attention Mechanism and Knowledge Distillation for Lip Reading (Aug 7, 2021) [Audio-Visual Speech Recognition, Knowledge Distillation]
Spatio-Temporal Graph for Video Captioning with Knowledge Distillation (Mar 31, 2020) [Knowledge Distillation, Object]
Spatiotemporal Knowledge Distillation for Efficient Estimation of Aerial Video Saliency (Apr 10, 2019) [GPU, Knowledge Distillation]
Speculative Knowledge Distillation: Bridging the Teacher-Student Gap Through Interleaved Sampling (Oct 15, 2024) [Instruction Following, Knowledge Distillation]
Speech Emotion: Investigating Model Representations, Multi-Task Learning and Knowledge Distillation (Jul 2, 2022) [Knowledge Distillation, Multi-Task Learning]
Speech Emotion Recognition with Distilled Prosodic and Linguistic Affect Representations (Sep 9, 2023) [Emotion Recognition, Knowledge Distillation]
Speech Translation with Foundation Models and Optimal Transport: UPC at IWSLT23 (Jun 2, 2023) [Knowledge Distillation, Machine Translation]
Spiking CenterNet: A Distillation-boosted Spiking Neural Network for Object Detection (Feb 2, 2024) [Decoder, Knowledge Distillation]
Spirit Distillation: A Model Compression Method with Multi-domain Knowledge Transfer (Apr 29, 2021) [General Knowledge, Knowledge Distillation]
Spirit Distillation: Precise Real-time Semantic Segmentation of Road Scenes with Insufficient Data (Mar 25, 2021) [Autonomous Driving, Few-Shot Learning]
Split Knowledge Distillation for Large Models in IoT: Architecture, Challenges, and Solutions (Dec 17, 2024) [Knowledge Distillation, Management]
Squeezing nnU-Nets with Knowledge Distillation for On-Board Cloud Detection (Jun 16, 2023) [Cloud Detection, Knowledge Distillation]
SRIL: Selective Regularization for Class-Incremental Learning (May 9, 2023) [Class-Incremental Learning]
SSKD: Self-Supervised Knowledge Distillation for Cross Domain Adaptive Person Re-Identification (Sep 13, 2020) [Clustering, Domain Adaptive Person Re-Identification]
SSMTL++: Revisiting Self-Supervised Multi-Task Learning for Video Anomaly Detection (Jul 16, 2022) [Anomaly Detection, Knowledge Distillation]
SSR: Enhancing Depth Perception in Vision-Language Models via Rationale-Guided Spatial Reasoning (May 18, 2025) [Knowledge Distillation, Spatial Reasoning]
Stacked Acoustic-and-Textual Encoding: Integrating the Pre-trained Models into Speech Translation Encoders (May 12, 2021) [Automatic Speech Recognition (ASR)]
Static Word Embeddings for Sentence Semantic Representation (Jun 5, 2025) [Contrastive Learning, Knowledge Distillation]
Stealing Neural Networks via Timing Side Channels (Dec 31, 2018) [Knowledge Distillation, Reinforcement Learning]
Step Out and Seek Around: On Warm-Start Training with Incremental Data (Jun 6, 2024) [Autonomous Driving, Knowledge Distillation]
Stereo-Knowledge Distillation from dpMV to Dual Pixels for Light Field Video Reconstruction (May 20, 2024) [Autonomous Driving, Knowledge Distillation]
Stereo-Matching Knowledge Distilled Monocular Depth Estimation Filtered by Multiple Disparity Consistency (Jan 22, 2024) [Depth Estimation, Knowledge Distillation]
STEVE Series: Step-by-Step Construction of Agent Systems in Minecraft (Jun 17, 2024) [Knowledge Distillation, Language Modeling]
Stingy Teacher: Sparse Logits Suffice to Fail Knowledge Distillation (Sep 29, 2021) [Knowledge Distillation]
Stochastic Precision Ensemble: Self-Knowledge Distillation for Quantized Deep Neural Networks (Sep 30, 2020) [Image Classification]
Strategic Fusion Optimizes Transformer Compression (Jan 5, 2025) [Knowledge Distillation, Model Compression]
Streaming egocentric action anticipation: An evaluation scheme and approach (Jun 29, 2023) [Action Anticipation, Knowledge Distillation]
Streaming Transformer ASR with Blockwise Synchronous Inference (Jun 25, 2020) [Automatic Speech Recognition (ASR)]
Structural and Statistical Texture Knowledge Distillation for Semantic Segmentation (May 6, 2023) [Knowledge Distillation, Quantization]
Structural Knowledge Distillation for Object Detection (Nov 23, 2022) [Feature Importance, Knowledge Distillation]
Structural Teacher-Student Normality Learning for Multi-Class Anomaly Detection and Localization (Feb 27, 2024) [Anomaly Detection, Knowledge Distillation]
Structure Aware Incremental Learning with Personalized Imitation Weights for Recommender Systems (May 2, 2023) [Incremental Learning, Knowledge Distillation]
Structure-Centric Robust Monocular Depth Estimation via Knowledge Distillation (Oct 9, 2024) [Depth Estimation, Knowledge Distillation]
Structured Knowledge Distillation Towards Efficient and Compact Multi-View 3D Detection (Nov 14, 2022) [Knowledge Distillation]
Structured Pruning of Neural Networks with Budget-Aware Regularization (Nov 23, 2018) [Knowledge Distillation]
StructVPR: Distill Structural Knowledge with Weighting Samples for Visual Place Recognition (Dec 2, 2022) [Image Retrieval, Knowledge Distillation]
Student as an Inherent Denoiser of Noisy Teacher (Dec 15, 2023) [Knowledge Distillation, Language Modeling]
Student Customized Knowledge Distillation: Bridging the Gap Between Student and Teacher (Jan 1, 2021) [Image Classification]
Student-friendly Knowledge Distillation (May 18, 2023) [Knowledge Distillation]
Student Network Learning via Evolutionary Knowledge Distillation (Mar 23, 2021) [Knowledge Distillation, Transfer Learning]
Student-Oriented Teacher Knowledge Refinement for Knowledge Distillation (Sep 27, 2024) [Knowledge Distillation, Transfer Learning]
Students Parrot Their Teachers: Membership Inference on Model Distillation (Mar 6, 2023) [Knowledge Distillation]