- Multi-Armed Bandits with Local Differential Privacy (Jul 6, 2020). Multi-Armed Bandits
- Multi-Armed Bandits With Machine Learning-Generated Surrogate Rewards (Jun 20, 2025). Decision Making Under Uncertainty, Multi-Armed Bandits
- Multi-Armed Bandits with Metric Movement Costs (Oct 24, 2017). Multi-Armed Bandits
- Multi-Armed Bandits with Self-Information Rewards (Sep 6, 2022). Multi-Armed Bandits
- Multi-Fidelity Multi-Armed Bandits Revisited (Jun 13, 2023). Multi-Armed Bandits
- Multilinguality in LLM-Designed Reward Functions for Restless Bandits: Effects on Task Performance and Fairness (Jan 20, 2025). Fairness, Multi-Armed Bandits
- Multinomial Logit Contextual Bandits: Provable Optimality and Practicality (Mar 25, 2021). Multi-Armed Bandits
- Multi-Objective Generalized Linear Bandits (May 30, 2019). Multi-Armed Bandits
- Multi-Player Approaches for Dueling Bandits (May 25, 2024). Multi-Armed Bandits
- Multi-Player Bandits: A Trekking Approach (Sep 17, 2018). Multi-Armed Bandits
- Multi-Player Bandits Revisited (Nov 7, 2017). Multi-Armed Bandits
- Multi-Player Bandits Robust to Adversarial Collisions (Nov 15, 2022). Multi-Armed Bandits
- Multiplayer Information Asymmetric Contextual Bandits (Mar 11, 2025). Multi-Armed Bandits
- Multi-player Multi-armed Bandits with Collision-Dependent Reward Distributions (Jun 25, 2021). Multi-Armed Bandits
- Multi-Player Multi-Armed Bandits with Finite Shareable Resources Arms: Learning Algorithms & Applications (Apr 28, 2022). Edge-computing, Multi-Armed Bandits
- Decentralized Heterogeneous Multi-Player Multi-Armed Bandits with Non-Zero Rewards on Collisions (Oct 21, 2019). Multi-Armed Bandits
- Multiple-Play Stochastic Bandits with Shareable Finite-Capacity Arms (Jun 17, 2022). Multi-Armed Bandits
- Multiplier Bootstrap-based Exploration (Feb 3, 2023). Multi-Armed Bandits
- MultiScale Contextual Bandits for Long Term Objectives (Mar 22, 2025). Multi-Armed Bandits, Recommendation Systems
- Multi-Statistic Approximate Bayesian Computation with Multi-Armed Bandits (May 22, 2018). Feature Engineering, Multi-Armed Bandits
- Multi-Task Learning for Contextual Bandits (May 24, 2017). Multi-Armed Bandits, Multi-Task Learning
- Multi-User MABs with User Dependent Rewards for Uncoordinated Spectrum Access (Oct 21, 2019). Multi-Armed Bandits
- Multi-User Multi-Armed Bandits for Uncoordinated Spectrum Access (Jul 2, 2018). Multi-Armed Bandits
- Navigating the Rashomon Effect: How Personalization Can Help Adjust Interpretable Machine Learning Models to Individual Users (May 11, 2025). Additive Models, Interpretable Machine Learning
- Nearest Neighbor Search Under Uncertainty (Mar 8, 2021). Multi-Armed Bandits, Representation Learning
- Nearly Minimax-Optimal Regret for Linearly Parameterized Bandits (Mar 30, 2019). Multi-Armed Bandits
- Nearly Optimal Algorithms for Linear Contextual Bandits with Adversarial Corruptions (May 13, 2022). Multi-Armed Bandits
- Nearly-Optimal Bandit Learning in Stackelberg Games with Side Information (Jan 31, 2025). Multi-Armed Bandits
- Towards a Sharp Analysis of Offline Policy Learning for f-Divergence-Regularized Contextual Bandits (Feb 9, 2025). Multi-Armed Bandits
- Nearly Optimal Sampling Algorithms for Combinatorial Pure Exploration (Jun 4, 2017). Multi-Armed Bandits
- Nearly-tight Approximation Guarantees for the Improving Multi-Armed Bandits Problem (Apr 1, 2024). Multi-Armed Bandits
- Nearly Tight Bounds for Cross-Learning Contextual Bandits with Graphical Feedback (Feb 7, 2025). Multi-Armed Bandits
- Nearly Tight Bounds for Exploration in Streaming Multi-armed Bandits with Known Optimality Gap (Feb 3, 2025). Multi-Armed Bandits
- Near Optimal Best Arm Identification for Clustered Bandits (May 15, 2025). Clustering, Computational Efficiency
- Near-Optimal Private Learning in Linear Contextual Bandits (Feb 18, 2025). Multi-Armed Bandits
- Networked Restless Multi-Armed Bandits for Mobile Interventions (Jan 28, 2022). Multi-Armed Bandits
- Networked Stochastic Multi-Armed Bandits with Combinatorial Strategies (Mar 20, 2015). Multi-Armed Bandits
- Neural Bandit with Arm Group Graph (Jun 8, 2022). Multi-Armed Bandits
- Neural Collaborative Filtering Bandits via Meta Learning (Jan 31, 2022). Collaborative Filtering, Decision Making
- Neural Contextual Bandits Based Dynamic Sensor Selection for Low-Power Body-Area Networks (May 24, 2022). Anomaly Detection, Multi-Armed Bandits
- Neural Contextual Bandits for Personalized Recommendation (Dec 21, 2023). Multi-Armed Bandits, Recommendation Systems
- Neural Contextual Bandits Under Delayed Feedback Constraints (Apr 16, 2025). Multi-Armed Bandits, Recommendation Systems
- Reward-Biased Maximum Likelihood Estimation for Neural Contextual Bandits (Mar 8, 2022). Multi-Armed Bandits
- Neural Contextual Bandits with Deep Representation and Shallow Exploration (Dec 3, 2020). Multi-Armed Bandits, Representation Learning
- Neural Network Retraining for Model Serving (Apr 29, 2020). model, Multi-Armed Bandits
- Neural Risk-sensitive Satisficing in Contextual Bandits (Jan 15, 2025). Multi-Armed Bandits, Recommendation Systems
- NeuralUCB: Contextual Bandits with Neural Network-Based Exploration (Sep 25, 2019). Efficient Exploration, Multi-Armed Bandits
- No DBA? No regret! Multi-armed bandits for index tuning of analytical and HTAP workloads with provable guarantees (Aug 23, 2021). Decision Making, Decision Making Under Uncertainty
- Nonlinear Sequential Accepts and Rejects for Identification of Top Arms in Stochastic Bandits (Jul 9, 2017). Multi-Armed Bandits
- Nonparametric Contextual Bandits in an Unknown Metric Space (Aug 3, 2019). Multi-Armed Bandits