Semi-UFormer: Semi-supervised Uncertainty-aware Transformer for Image Dehazing (Oct 28, 2022) · Image Dehazing, Knowledge Distillation
Sentence Embeddings by Ensemble Distillation (Apr 14, 2021) · Knowledge Distillation, Semantic Textual Similarity
Sentence-Level or Token-Level? A Comprehensive Study on Knowledge Distillation (Apr 23, 2024) · Knowledge Distillation, Machine Translation
Sentiment Interpretable Logic Tensor Network for Aspect-Term Sentiment Analysis (Oct 1, 2022) · Computational Efficiency, Knowledge Distillation
SepALM: Audio Language Models Are Error Correctors for Robust Speech Separation (May 6, 2025) · Automatic Speech Recognition, Automatic Speech Recognition (ASR)
Separating Novel Features for Logical Anomaly Detection: A Straightforward yet Effective Approach (Jul 25, 2024) · Anomaly Detection, Knowledge Distillation
SeqPATE: Differentially Private Text Generation via Knowledge Distillation (Sep 29, 2021) · Knowledge Distillation, Sentence
Sequence-Level Knowledge Distillation for Model Compression of Attention-based Sequence-to-Sequence Speech Recognition (Nov 12, 2018) · Knowledge Distillation, Model Compression
Sequence-Level Knowledge Distillation for Class-Incremental End-to-End Spoken Language Understanding (May 23, 2023) · Continual Learning, Decoder
Sequential Editing for Lifelong Training of Speech Recognition Models (Jun 25, 2024) · Automatic Speech Recognition, Automatic Speech Recognition (ASR)
Sewer Image Super-Resolution with Depth Priors and Its Lightweight Network (Jul 27, 2024) · Computational Efficiency, Image Super-Resolution
SFedKD: Sequential Federated Learning with Discrepancy-Aware Multi-Teacher Knowledge Distillation (Jul 11, 2025) · Federated Learning, Knowledge Distillation
Shape-Net: Room Layout Estimation from Panoramic Images Robust to Occlusion using Knowledge Distillation with 3D Shapes as Additional Inputs (Apr 25, 2023) · 3D geometry, 3D Reconstruction
Shared Growth of Graph Neural Networks via Prompted Free-direction Knowledge Distillation (Jul 2, 2023) · Knowledge Distillation, Prompt Learning
Shoggoth: Towards Efficient Edge-Cloud Collaborative Real-Time Video Inference via Adaptive Online Learning (Jun 27, 2023) · Knowledge Distillation
Siamese Sleep Transformer For Robust Sleep Stage Scoring With Self-knowledge Distillation and Selective Batch Sampling (Dec 12, 2022) · Knowledge Distillation, Self-Knowledge Distillation
SIGN: Spatial-information Incorporated Generative Network for Generalized Zero-shot Semantic Segmentation (Aug 27, 2021) · Knowledge Distillation, Segmentation
Similarity of Neural Architectures using Adversarial Attack Transferability (Oct 20, 2022) · Adversarial Attack, Diversity
Similarity-Preserving Knowledge Distillation (Jul 23, 2019) · Knowledge Distillation, Neural Network Compression
Similarity Transfer for Knowledge Distillation (Mar 18, 2021) · Knowledge Distillation
Simple Regularisation for Uncertainty-Aware Knowledge Distillation (May 19, 2022) · BIG-bench Machine Learning, Diversity
Simple Unsupervised Knowledge Distillation With Space Similarity (Sep 20, 2024) · Knowledge Distillation, Self-Supervised Learning
Simplification Is All You Need against Out-of-Distribution Overconfidence (Jan 1, 2025) · All, Attribute
Simplifying CLIP: Unleashing the Power of Large-Scale Models on Consumer-level Computers (Nov 22, 2024) · Data Augmentation, GPU
Sim-to-Real Transfer in Deep Reinforcement Learning for Robotics: a Survey (Sep 24, 2020) · Deep Reinforcement Learning, Domain Adaptation
SimulSpeech: End-to-End Simultaneous Speech to Text Translation (Jul 1, 2020) · Automatic Speech Recognition, Automatic Speech Recognition (ASR)
Simultaneous Reward Distillation and Preference Learning: Get You a Language Model Who Can Do Both (Oct 11, 2024) · Knowledge Distillation, Language Modeling
Single image calibration using knowledge distillation approaches (Dec 5, 2022) · Camera Calibration, Incremental Learning
Single Snapshot Distillation for Phase Coded Mask Design in Phase Retrieval (May 23, 2025) · global-optimization, Knowledge Distillation
Single-stage TTS with Masked Audio Token Modeling and Semantic Knowledge Distillation (Sep 17, 2024) · Knowledge Distillation, Speech Synthesis
SKDBERT: Compressing BERT via Stochastic Knowledge Distillation (Nov 26, 2022) · Knowledge Distillation, Language Modeling
Sketch Down the FLOPs: Towards Efficient Networks for Human Sketch (May 29, 2025) · Image Retrieval, Knowledge Distillation
SKILL: Similarity-aware Knowledge distILLation for Speech Self-Supervised Learning (Feb 26, 2024) · Knowledge Distillation, Self-Supervised Learning
SLaM: Student-Label Mixing for Distillation with Unlabeled Examples (Feb 8, 2023) · Knowledge Distillation
SlimSeg: Slimmable Semantic Segmentation with Boundary Supervision (Jul 13, 2022) · Knowledge Distillation, Segmentation
Smaller, Weaker, Yet Better: Training LLM Reasoners via Compute-Optimal Sampling (Aug 29, 2024) · Diversity, Knowledge Distillation
Small Language Models are Equation Reasoners (Sep 19, 2024) · Arithmetic Reasoning, Knowledge Distillation
Small Object Detection: A Comprehensive Survey on Challenges, Techniques and Real-World Applications (Mar 26, 2025) · Articles, Data Augmentation
Small Vision-Language Models: A Survey on Compact Architectures and Techniques (Mar 9, 2025) · Computational Efficiency, Knowledge Distillation
Smart Inference for Multidigit Convolutional Neural Network based Barcode Decoding (Apr 14, 2020) · Knowledge Distillation
SMOC-Net: Leveraging Camera Pose for Self-Supervised Monocular Object Pose Estimation (Jan 1, 2023) · 6D Pose Estimation using RGB, Knowledge Distillation
Smoothing Out Hallucinations: Mitigating LLM Hallucination with Smoothed Knowledge Distillation (Feb 16, 2025) · Hallucination, Knowledge Distillation
SnapGen: Taming High-Resolution Text-to-Image Models for Mobile Devices with Efficient Architectures and Training (Dec 12, 2024) · Knowledge Distillation, Text-to-Image Generation
SNN-PAR: Energy Efficient Pedestrian Attribute Recognition via Spiking Neural Networks (Oct 10, 2024) · Attribute, Knowledge Distillation
SoccerKDNet: A Knowledge Distillation Framework for Action Recognition in Soccer Videos (Jul 15, 2023) · Action Recognition, Knowledge Distillation
Soft Knowledge Distillation with Multi-Dimensional Cross-Net Attention for Image Restoration Models Compression (Jan 16, 2025) · Contrastive Learning, Deblurring
Soft Prompt Decoding for Multilingual Dense Retrieval (May 15, 2023) · Cross-Lingual Information Retrieval, Information Retrieval
Solvable Model for Inheriting the Regularization through Knowledge Distillation (Dec 1, 2020) · Knowledge Distillation, Transfer Learning
SonoSAMTrack -- Segment and Track Anything on Ultrasound Images (Oct 25, 2023) · Knowledge Distillation
Sorbet: A Neuromorphic Hardware-Compatible Transformer-Based Spiking Language Model (Sep 4, 2024) · Knowledge Distillation, Language Modeling