Scalable Syntax-Aware Language Models Using Knowledge Distillation | Jun 14, 2019 | Knowledge Distillation, Language Modeling
Scale-Equivalent Distillation for Semi-Supervised Object Detection | Mar 23, 2022 | Knowledge Distillation, Object
ScaleKD: Distilling Scale-Aware Knowledge in Small Object Detector | Jan 1, 2023 | Knowledge Distillation, object-detection
ScaleOT: Privacy-utility-scalable Offsite-tuning with Dynamic LayerReplace and Selective Rank Compression | Dec 13, 2024 | Knowledge Distillation, Privacy Preserving
Scaling Fair Learning to Hundreds of Intersectional Groups | Sep 29, 2021 | Attribute, Fairness
Scaling Large Vision-Language Models for Enhanced Multimodal Comprehension In Biomedical Image Analysis | Jan 26, 2025 | Articles, Hallucination
Scaling Laws for Data-Efficient Visual Transfer Learning | Apr 17, 2025 | Knowledge Distillation, Transfer Learning
Scaling of Search and Learning: A Roadmap to Reproduce o1 from Reinforcement Learning Perspective | Dec 18, 2024 | Knowledge Distillation
SCARF: Scalable Continual Learning Framework for Memory-efficient Multiple Neural Radiance Fields | Sep 6, 2024 | Continual Learning, Knowledge Distillation
Scavenging Hyena: Distilling Transformers into Long Convolution Models | Jan 31, 2024 | Knowledge Distillation
Scene-adaptive and Region-aware Multi-modal Prompt for Open Vocabulary Object Detection | Jan 1, 2024 | Knowledge Distillation, object-detection
Scene-adaptive Knowledge Distillation for Sequential Recommendation via Differentiable Architecture Search | Jul 15, 2021 | Knowledge Distillation, Neural Architecture Search
Scene-aware Human Pose Generation using Transformer | Aug 4, 2023 | Knowledge Distillation, Scene Understanding
Scene Graph Aided Radiology Report Generation | Mar 8, 2024 | Decoder, Knowledge Distillation
Scheduled Knowledge Acquisition on Lightweight Vector Symbolic Architectures for Brain-Computer Interfaces | Mar 18, 2024 | Feature Engineering, Knowledge Distillation
Sci-CoT: Leveraging Large Language Models for Enhanced Knowledge Distillation in Small Models for Scientific QA | Aug 9, 2023 | ARC, Knowledge Distillation
SCLIFD: Supervised Contrastive Knowledge Distillation for Incremental Fault Diagnosis under Limited Fault Data | Feb 12, 2023 | Class Incremental Learning
SDBERT: SparseDistilBERT, a faster and smaller BERT model | Jul 28, 2022 | Knowledge Distillation
SDDGR: Stable Diffusion-based Deep Generative Replay for Class Incremental Object Detection | Feb 27, 2024 | Class Incremental Learning
SDQ: Stochastic Differentiable Quantization with Mixed Precision | Jun 9, 2022 | Knowledge Distillation, Neural Architecture Search
Search for Better Students to Learn Distilled Knowledge | Jan 30, 2020 | Knowledge Distillation, Model Compression
Searching for COMETINHO: The Little Metric That Could | Jun 1, 2022 | Computational Efficiency, Knowledge Distillation
Search to Distill: Pearls are Everywhere but not the Eyes | Nov 20, 2019 | Ensemble Learning, Face Recognition
SeCoKD: Aligning Large Language Models for In-Context Learning with Fewer Shots | Jun 20, 2024 | In-Context Learning, Knowledge Distillation
Secost: Sequential co-supervision for large scale weakly labeled audio event detection | Oct 25, 2019 | Event Detection, Knowledge Distillation
Secure Your Ride: Real-time Matching Success Rate Prediction for Passenger-Driver Pairs | Sep 14, 2021 | Decision Making, Knowledge Distillation
SEDD-PCC: A Single Encoder-Dual Decoder Framework For End-To-End Learned Point Cloud Compression | May 22, 2025 | Attribute, Decoder
Segment Any RGB-Thermal Model with Language-aided Distillation | May 4, 2025 | Instance Segmentation, Knowledge Distillation
SEKI: Self-Evolution and Knowledge Inspiration based Neural Architecture Search via Large Language Models | Feb 27, 2025 | GPU, Knowledge Distillation
Select and Distill: Selective Dual-Teacher Knowledge Transfer for Continual Learning on Vision-Language Models | Mar 14, 2024 | Continual Learning, Knowledge Distillation
Selecting Related Knowledge via Efficient Channel Attention for Online Continual Learning | Sep 9, 2022 | Continual Learning, Knowledge Distillation
SelectiveKD: A semi-supervised framework for cancer detection in DBT through Knowledge Distillation and Pseudo-labeling | Sep 25, 2024 | Cancer Classification, Knowledge Distillation
Selective Knowledge Distillation for Non-Autoregressive Neural Machine Translation | Mar 31, 2023 | Knowledge Distillation, Machine Translation
Self-Cooperation Knowledge Distillation for Novel Class Discovery | Jul 2, 2024 | Knowledge Distillation, Novel Class Discovery
Self-Distillation Amplifies Regularization in Hilbert Space | Feb 13, 2020 | Knowledge Distillation, L2 Regularization
Self-Distillation Learning Based on Temporal-Spatial Consistency for Spiking Neural Networks | Jun 12, 2024 | Knowledge Distillation
Self-Distillation Mixup Training for Non-autoregressive Neural Machine Translation | Dec 22, 2021 | Knowledge Distillation, Machine Translation
Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification | Apr 27, 2021 | Classification, General Classification
Self-Distilled Pruning Of Neural Networks | Sep 29, 2021 | Knowledge Distillation, Language Modeling
Self-Distilled Pruning of Neural Networks | Nov 16, 2021 | Knowledge Distillation, Language Modeling
Self-Evolution Knowledge Distillation for LLM-based Machine Translation | Dec 19, 2024 | Knowledge Distillation, Machine Translation
A New Training Framework for Deep Neural Network | Mar 12, 2021 | Knowledge Distillation
Self-Knowledge Distillation Adversarial Attack | Sep 25, 2019 | Adversarial Attack, Knowledge Distillation
Self-Knowledge Distillation based Self-Supervised Learning for Covid-19 Detection from Chest X-Ray Images | Jun 7, 2022 | Knowledge Distillation, Self-Knowledge Distillation
Self-Knowledge Distillation for Learning Ambiguity | Jun 14, 2024 | Knowledge Distillation, Natural Language Understanding
Self-Knowledge Distillation for Surgical Phase Recognition | Jun 15, 2023 | Decoder, Knowledge Distillation
Self-Knowledge Distillation in Natural Language Processing | Aug 2, 2019 | Deep Learning, Knowledge Distillation
Self-Knowledge Distillation via Dropout | Aug 11, 2022 | Adversarial Robustness, image-classification
Self-Referenced Deep Learning | Nov 19, 2018 | Deep Learning, Knowledge Distillation
Self Regulated Learning Mechanism for Data Efficient Knowledge Distillation | Feb 14, 2021 | Knowledge Distillation, Transfer Learning