Real-time Spatio-temporal Action Localization via Learning Motion Representation. Nov 30, 2020. Tags: Action Classification, Action Localization.
ReasoningRank: Teaching Student Models to Rank through Reasoning-Based Knowledge Distillation. Oct 7, 2024. Tags: Decision Making, Information Retrieval.
Rebalancing Multi-Label Class-Incremental Learning. Aug 22, 2024. Tags: Class-Incremental Learning.
Recalling The Forgotten Class Memberships: Unlearned Models Can Be Noisy Labelers to Leak Privacy. Jun 24, 2025. Tags: Knowledge Distillation, Learning with Noisy Labels.
Recent Advances in Direct Speech-to-text Translation. Jun 20, 2023. Tags: Data Augmentation, Decoder.
Recent Advances of Continual Learning in Computer Vision: An Overview. Sep 23, 2021. Tags: Continual Learning, Knowledge Distillation.
Membership Privacy for Machine Learning Models Through Knowledge Transfer. Jun 15, 2019. Tags: BIG-bench Machine Learning, General Classification.
Reconstructing Perceived Images from Brain Activity by Visually-guided Cognitive Representation and Adversarial Learning. Jun 27, 2019. Tags: Generative Adversarial Network, Image Reconstruction.
Rectified Decision Trees: Exploring the Landscape of Interpretable and Effective Machine Learning. Aug 21, 2020. Tags: BIG-bench Machine Learning, Knowledge Distillation.
Rectified Decision Trees: Towards Interpretability, Compression and Empirical Soundness. Mar 14, 2019. Tags: Knowledge Distillation.
Rectifying the Data Bias in Knowledge Distillation. Oct 11, 2021. Tags: Face Recognition, Face Verification.
Recurrent knowledge distillation. May 18, 2018. Tags: Knowledge Distillation.
Recurrent Stacking of Layers in Neural Networks: An Application to Neural Machine Translation. Jun 18, 2021. Tags: Knowledge Distillation, Machine Translation.
Redistributing Low-Frequency Words: Making the Most of Monolingual Data in Non-Autoregressive Translation. Nov 16, 2021. Tags: Knowledge Distillation, Translation.
Reducing the gap between streaming and non-streaming Transducer-based ASR by adaptive two-stage knowledge distillation. Jun 27, 2023. Tags: Knowledge Distillation, Speech Recognition.
Reducing the Teacher-Student Gap via Adaptive Temperatures. Sep 29, 2021. Tags: Knowledge Distillation.
RefBERT: Compressing BERT by Referencing to Pre-computed Representations. Jun 11, 2021. Tags: Knowledge Distillation.
Referee: Reference-Free Sentence Summarization with Sharper Controllability through Symbolic Knowledge Distillation. Oct 25, 2022. Tags: Knowledge Distillation, Sentence.
Refine and Distill: Exploiting Cycle-Inconsistency and Knowledge Distillation for Unsupervised Monocular Depth Estimation. Mar 11, 2019. Tags: Depth Estimation, Depth Prediction.
Region-aware Knowledge Distillation for Efficient Image-to-Image Translation. May 25, 2022. Tags: Contrastive Learning, Image Classification.
Regression Bugs Are In Your Model! Measuring, Reducing and Analyzing Regressions In NLP Model Updates. May 7, 2021. Tags: Knowledge Distillation, Model.
Reinforced Iterative Knowledge Distillation for Cross-Lingual Named Entity Recognition. Jun 1, 2021. Tags: Cross-Lingual NER, Knowledge Distillation.
Reinforced Multi-Teacher Selection for Knowledge Distillation. Dec 11, 2020. Tags: GPU, Knowledge Distillation.
Relational Subsets Knowledge Distillation for Long-tailed Retinal Diseases Recognition. Apr 22, 2021. Tags: Knowledge Distillation.
Relation Modeling and Distillation for Learning with Noisy Labels. May 30, 2024. Tags: Contrastive Learning, Knowledge Distillation.
Relaxed Recursive Transformers: Effective Parameter Sharing with Layer-wise LoRA. Oct 28, 2024. Tags: Knowledge Distillation.
Remembering Transformer for Continual Learning. Apr 11, 2024. Tags: Continual Learning, Knowledge Distillation.
Remining Hard Negatives for Generative Pseudo Labeled Domain Adaptation. Jan 24, 2025. Tags: Domain Adaptation, Information Retrieval.
Remote Sensing Image Classification with Decoupled Knowledge Distillation. May 25, 2025. Tags: Classification, Image Classification.
Removing Rain Streaks via Task Transfer Learning. Aug 28, 2022. Tags: Knowledge Distillation, Rain Removal.
Representation Consolidation from Multiple Expert Teachers. Sep 29, 2021. Tags: Knowledge Distillation.
Representation Disparity-aware Distillation for 3D Object Detection. Aug 20, 2023. Tags: 3D Object Detection, Knowledge Distillation.
Representation Transfer by Optimal Transport. Jul 13, 2020. Tags: Knowledge Distillation, Model Compression.
Research on Multilingual News Clustering Based on Cross-Language Word Embeddings. May 30, 2023. Tags: Clustering, Knowledge Distillation.
Research on the Online Update Method for Retrieval-Augmented Generation (RAG) Model with Incremental Learning. Jan 13, 2025. Tags: Incremental Learning, Knowledge Distillation.
Residual Knowledge Distillation. Feb 21, 2020. Tags: Knowledge Distillation, Model Compression.
ResKD: Residual-Guided Knowledge Distillation. Jun 8, 2020. Tags: Knowledge Distillation.
Resolution-Based Distillation for Efficient Histology Image Classification. Jan 11, 2021. Tags: Classification, Computational Efficiency.
Resource-Efficient Beam Prediction in mmWave Communications with Multimodal Realistic Simulation Framework. Apr 7, 2025. Tags: Autonomous Driving, Beam Prediction.
REFT: Resource-Efficient Federated Training Framework for Heterogeneous and Resource-Constrained Environments. Aug 25, 2023. Tags: Federated Learning, Image Classification.
Respecting Transfer Gap in Knowledge Distillation. Oct 23, 2022. Tags: Knowledge Distillation.
Response-based Distillation for Incremental Object Detection. Oct 26, 2021. Tags: Incremental Learning, Knowledge Distillation.
Rethinking Attention: Exploring Shallow Feed-Forward Neural Networks as an Alternative to Attention Layers in Transformers. Nov 17, 2023. Tags: Knowledge Distillation.
Rethinking Attention Mechanism in Time Series Classification. Jul 14, 2022. Tags: Classification, Knowledge Distillation.
Rethinking Feature-Based Knowledge Distillation for Face Recognition. Jan 1, 2023. Tags: Face Recognition, GPU.
Rethinking Invariance Regularization in Adversarial Training to Improve Robustness-Accuracy Trade-off. Feb 22, 2024. Tags: Adversarial Defense, Knowledge Distillation.
Rethinking Knowledge Distillation via Cross-Entropy. Aug 22, 2022. Tags: Knowledge Distillation.
Rethinking Knowledge in Distillation: An In-context Sample Retrieval Perspective. Jan 13, 2025. Tags: Knowledge Distillation, Retrieval.
Rethinking Position Bias Modeling with Knowledge Distillation for CTR Prediction. Apr 1, 2022. Tags: Click-Through Rate Prediction, Knowledge Distillation.
Rethinking Soft Labels for Knowledge Distillation: A Bias–Variance Tradeoff Perspective. Jan 1, 2021. Tags: Knowledge Distillation.