Simultaneous Reward Distillation and Preference Learning: Get You a Language Model Who Can Do Both (Oct 11, 2024) · Knowledge Distillation, Language Modeling
Single image calibration using knowledge distillation approaches (Dec 5, 2022) · Camera Calibration, Incremental Learning
Single Snapshot Distillation for Phase Coded Mask Design in Phase Retrieval (May 23, 2025) · Global Optimization, Knowledge Distillation
Single-stage TTS with Masked Audio Token Modeling and Semantic Knowledge Distillation (Sep 17, 2024) · Knowledge Distillation, Speech Synthesis
SKDBERT: Compressing BERT via Stochastic Knowledge Distillation (Nov 26, 2022) · Knowledge Distillation, Language Modeling
Sketch Down the FLOPs: Towards Efficient Networks for Human Sketch (May 29, 2025) · Image Retrieval, Knowledge Distillation
SKILL: Similarity-aware Knowledge distILLation for Speech Self-Supervised Learning (Feb 26, 2024) · Knowledge Distillation, Self-Supervised Learning
SLaM: Student-Label Mixing for Distillation with Unlabeled Examples (Feb 8, 2023) · Knowledge Distillation
SlimSeg: Slimmable Semantic Segmentation with Boundary Supervision (Jul 13, 2022) · Knowledge Distillation, Segmentation
Smaller, Weaker, Yet Better: Training LLM Reasoners via Compute-Optimal Sampling (Aug 29, 2024) · Diversity, Knowledge Distillation
Small Language Models are Equation Reasoners (Sep 19, 2024) · Arithmetic Reasoning, Knowledge Distillation
Small Object Detection: A Comprehensive Survey on Challenges, Techniques and Real-World Applications (Mar 26, 2025) · Articles, Data Augmentation
Small Vision-Language Models: A Survey on Compact Architectures and Techniques (Mar 9, 2025) · Computational Efficiency, Knowledge Distillation
Smart Inference for Multidigit Convolutional Neural Network based Barcode Decoding (Apr 14, 2020) · Knowledge Distillation
SMOC-Net: Leveraging Camera Pose for Self-Supervised Monocular Object Pose Estimation (Jan 1, 2023) · 6D Pose Estimation using RGB, Knowledge Distillation
Smoothing Out Hallucinations: Mitigating LLM Hallucination with Smoothed Knowledge Distillation (Feb 16, 2025) · Hallucination, Knowledge Distillation
SnapGen: Taming High-Resolution Text-to-Image Models for Mobile Devices with Efficient Architectures and Training (Dec 12, 2024) · Knowledge Distillation, Text-to-Image Generation
SNN-PAR: Energy Efficient Pedestrian Attribute Recognition via Spiking Neural Networks (Oct 10, 2024) · Attribute, Knowledge Distillation
SoccerKDNet: A Knowledge Distillation Framework for Action Recognition in Soccer Videos (Jul 15, 2023) · Action Recognition, Knowledge Distillation
Soft Knowledge Distillation with Multi-Dimensional Cross-Net Attention for Image Restoration Models Compression (Jan 16, 2025) · Contrastive Learning, Deblurring
Soft Prompt Decoding for Multilingual Dense Retrieval (May 15, 2023) · Cross-Lingual Information Retrieval, Information Retrieval
Solvable Model for Inheriting the Regularization through Knowledge Distillation (Dec 1, 2020) · Knowledge Distillation, Transfer Learning
SonoSAMTrack -- Segment and Track Anything on Ultrasound Images (Oct 25, 2023) · Knowledge Distillation
Sorbet: A Neuromorphic Hardware-Compatible Transformer-Based Spiking Language Model (Sep 4, 2024) · Knowledge Distillation, Language Modeling
Toward Student-Oriented Teacher Network Training For Knowledge Distillation (Jun 14, 2022) · Data Augmentation, Knowledge Distillation
Source and Target Bidirectional Knowledge Distillation for End-to-end Speech Translation (Apr 13, 2021) · Automatic Speech Recognition (ASR)
Source-Target Unified Knowledge Distillation for Memory-Efficient Federated Domain Adaptation on Edge Devices (Sep 29, 2021) · Domain Adaptation, Knowledge Distillation
Space-Time Distillation for Video Super-Resolution (Jun 19, 2021) · Knowledge Distillation, Super-Resolution
Sparse Progressive Distillation: Resolving Overfitting under Pretrain-and-Finetune Paradigm (Oct 15, 2021) · Knowledge Distillation
Spatial Knowledge Distillation to aid Visual Reasoning (Dec 10, 2018) · Diagnostic, Knowledge Distillation
Spatial Likelihood Voting with Self-Knowledge Distillation for Weakly Supervised Object Detection (Apr 14, 2022) · Knowledge Distillation, Multiple Instance Learning
Spatio-Temporal Attention Mechanism and Knowledge Distillation for Lip Reading (Aug 7, 2021) · Audio-Visual Speech Recognition, Knowledge Distillation
Spatio-Temporal Graph for Video Captioning with Knowledge Distillation (Mar 31, 2020) · Knowledge Distillation, Object
Spatiotemporal Knowledge Distillation for Efficient Estimation of Aerial Video Saliency (Apr 10, 2019) · GPU, Knowledge Distillation
Speculative Knowledge Distillation: Bridging the Teacher-Student Gap Through Interleaved Sampling (Oct 15, 2024) · Instruction Following, Knowledge Distillation
Speech Emotion: Investigating Model Representations, Multi-Task Learning and Knowledge Distillation (Jul 2, 2022) · Knowledge Distillation, Multi-Task Learning
Speech Emotion Recognition with Distilled Prosodic and Linguistic Affect Representations (Sep 9, 2023) · Emotion Recognition, Knowledge Distillation
Speech Translation with Foundation Models and Optimal Transport: UPC at IWSLT23 (Jun 2, 2023) · Knowledge Distillation, Machine Translation
Spiking CenterNet: A Distillation-boosted Spiking Neural Network for Object Detection (Feb 2, 2024) · Decoder, Knowledge Distillation
Spirit Distillation: A Model Compression Method with Multi-domain Knowledge Transfer (Apr 29, 2021) · General Knowledge, Knowledge Distillation
Spirit Distillation: Precise Real-time Semantic Segmentation of Road Scenes with Insufficient Data (Mar 25, 2021) · Autonomous Driving, Few-Shot Learning
Split Knowledge Distillation for Large Models in IoT: Architecture, Challenges, and Solutions (Dec 17, 2024) · Knowledge Distillation, Management
Squeezing nnU-Nets with Knowledge Distillation for On-Board Cloud Detection (Jun 16, 2023) · Cloud Detection, Knowledge Distillation
SRIL: Selective Regularization for Class-Incremental Learning (May 9, 2023) · Class-Incremental Learning
SSKD: Self-Supervised Knowledge Distillation for Cross Domain Adaptive Person Re-Identification (Sep 13, 2020) · Clustering, Domain Adaptive Person Re-Identification
SSMTL++: Revisiting Self-Supervised Multi-Task Learning for Video Anomaly Detection (Jul 16, 2022) · Anomaly Detection, Knowledge Distillation
SSR: Enhancing Depth Perception in Vision-Language Models via Rationale-Guided Spatial Reasoning (May 18, 2025) · Knowledge Distillation, Spatial Reasoning
Stacked Acoustic-and-Textual Encoding: Integrating the Pre-trained Models into Speech Translation Encoders (May 12, 2021) · Automatic Speech Recognition (ASR)
Static Word Embeddings for Sentence Semantic Representation (Jun 5, 2025) · Contrastive Learning, Knowledge Distillation