Stealing Neural Networks via Timing Side Channels (Dec 31, 2018): Knowledge Distillation, Reinforcement Learning
Step Out and Seek Around: On Warm-Start Training with Incremental Data (Jun 6, 2024): Autonomous Driving, Knowledge Distillation
Stereo-Knowledge Distillation from dpMV to Dual Pixels for Light Field Video Reconstruction (May 20, 2024): Autonomous Driving, Knowledge Distillation
Stereo-Matching Knowledge Distilled Monocular Depth Estimation Filtered by Multiple Disparity Consistency (Jan 22, 2024): Depth Estimation, Knowledge Distillation
STEVE Series: Step-by-Step Construction of Agent Systems in Minecraft (Jun 17, 2024): Knowledge Distillation, Language Modeling
Stingy Teacher: Sparse Logits Suffice to Fail Knowledge Distillation (Sep 29, 2021): Knowledge Distillation
Stochastic Precision Ensemble: Self-Knowledge Distillation for Quantized Deep Neural Networks (Sep 30, 2020): Image Classification
Strategic Fusion Optimizes Transformer Compression (Jan 5, 2025): Knowledge Distillation, Model Compression
Streaming egocentric action anticipation: An evaluation scheme and approach (Jun 29, 2023): Action Anticipation, Knowledge Distillation
Streaming Transformer ASR with Blockwise Synchronous Inference (Jun 25, 2020): Automatic Speech Recognition (ASR)
Structural and Statistical Texture Knowledge Distillation for Semantic Segmentation (May 6, 2023): Knowledge Distillation, Quantization
Structural Knowledge Distillation for Object Detection (Nov 23, 2022): Feature Importance, Knowledge Distillation
Structural Teacher-Student Normality Learning for Multi-Class Anomaly Detection and Localization (Feb 27, 2024): Anomaly Detection, Knowledge Distillation
Structure Aware Incremental Learning with Personalized Imitation Weights for Recommender Systems (May 2, 2023): Incremental Learning, Knowledge Distillation
Structure-Centric Robust Monocular Depth Estimation via Knowledge Distillation (Oct 9, 2024): Depth Estimation, Knowledge Distillation
Structured Knowledge Distillation Towards Efficient and Compact Multi-View 3D Detection (Nov 14, 2022): Knowledge Distillation
Structured Pruning of Neural Networks with Budget-Aware Regularization (Nov 23, 2018): Knowledge Distillation
StructVPR: Distill Structural Knowledge with Weighting Samples for Visual Place Recognition (Dec 2, 2022): Image Retrieval, Knowledge Distillation
Student as an Inherent Denoiser of Noisy Teacher (Dec 15, 2023): Knowledge Distillation, Language Modeling
Student Customized Knowledge Distillation: Bridging the Gap Between Student and Teacher (Jan 1, 2021): Image Classification
Student-friendly Knowledge Distillation (May 18, 2023): Knowledge Distillation
Student Network Learning via Evolutionary Knowledge Distillation (Mar 23, 2021): Knowledge Distillation, Transfer Learning
Student-Oriented Teacher Knowledge Refinement for Knowledge Distillation (Sep 27, 2024): Knowledge Distillation, Transfer Learning
Students Parrot Their Teachers: Membership Inference on Model Distillation (Mar 6, 2023): Knowledge Distillation
Students taught by multimodal teachers are superior action recognizers (Oct 9, 2022): Action Recognition, Knowledge Distillation
Students Who Study Together Learn Better: On the Importance of Collective Knowledge Distillation for Domain Transfer in Fact Verification (Nov 1, 2021): Fact Verification, Knowledge Distillation
Study of Encoder-Decoder Architectures for Code-Mix Search Query Translation (Aug 7, 2022): Data Augmentation, Decoder
Style over Substance: Distilled Language Models Reason Via Stylistic Replication (Apr 2, 2025): Knowledge Distillation
Sub-Band Knowledge Distillation Framework for Speech Enhancement (May 29, 2020): Knowledge Distillation, Speech Enhancement
Subclass Knowledge Distillation with Known Subclass Labels (Jul 17, 2022): Binary Classification, Knowledge Distillation
Sub-Graph Learning for Spatiotemporal Forecasting via Knowledge Distillation (Nov 17, 2022): Diversity, Graph Learning
SUGAR: Pre-training 3D Visual Representations for Robotics (Apr 1, 2024): 3D Instance Segmentation, 3D Object Recognition
Supervised Graph Contrastive Pretraining for Text Classification (Dec 21, 2021): Classification, Contrastive Learning
Supervision Complexity and its Role in Knowledge Distillation (Jan 28, 2023): Image Classification
Supporting Cross-language Cross-project Bug Localization Using Pre-trained Language Models (Jul 3, 2024): Contrastive Learning, CPU
Knowledge Distillation in Federated Edge Learning: A Survey (Jan 14, 2023): Knowledge Distillation, Survey
Survey on Knowledge Distillation for Large Language Models: Methods, Evaluation, and Application (Jul 2, 2024): Knowledge Distillation, Survey
Swing Distillation: A Privacy-Preserving Knowledge Distillation Framework (Dec 16, 2022): Knowledge Distillation, Model Compression
SWITCH: Studying with Teacher for Knowledge Distillation of Large Language Models (Oct 25, 2024): Instruction Following, Knowledge Distillation
Synergic Adversarial Label Learning for Grading Retinal Diseases via Knowledge Distillation and Multi-task Learning (Mar 24, 2020): Classification, General Classification
Synergistic Effects of Knowledge Distillation and Structured Pruning for Self-Supervised Speech Models (Feb 9, 2025): Knowledge Distillation, Model Compression
Syntactic Structure Distillation Pretraining For Bidirectional Encoders (May 27, 2020): Knowledge Distillation, Language Modeling
Synthetic Image Learning: Preserving Performance and Preventing Membership Inference Attacks (Jul 22, 2024): Knowledge Distillation
Synthetic Unknown Class Learning for Learning Unknowns (Nov 15, 2021): Diversity, Knowledge Distillation
TAID: Temporally Adaptive Interpolated Distillation for Efficient Knowledge Transfer in Language Models (Jan 28, 2025): Knowledge Distillation, Model Compression
Tailored Federated Learning: Leveraging Direction Regulation & Knowledge Distillation (Sep 29, 2024): Federated Learning, Knowledge Distillation
Take a Prior from Other Tasks for Severe Blur Removal (Feb 14, 2023): Deblurring, Image Deblurring
TalkingMachines: Real-Time Audio-Driven FaceTime-Style Video via Autoregressive Diffusion Models (Jun 3, 2025): Decoder, Knowledge Distillation
Talking Models: Distill Pre-trained Knowledge to Downstream Models via Interactive Communication (Oct 4, 2023): Decoder, Knowledge Distillation
Target-driven Self-Distillation for Partial Observed Trajectories Forecasting (Jan 28, 2025): Autonomous Driving, Knowledge Distillation