Efficient Resource Allocation with Fairness Constraints in Restless Multi-Armed Bandits (Jun 8, 2022) · Decision Making, Fairness
Efficient Training of Multi-task Combinatorial Neural Solver with Multi-armed Bandits (May 10, 2023) · Combinatorial Optimization, Decoder
Empathic Responding for Digital Interpersonal Emotion Regulation via Content Recommendation (Aug 5, 2024) · Multi-Armed Bandits
ε-Neural Thompson Sampling of Deep Brain Stimulation for Parkinson Disease Treatment (Mar 11, 2024) · Multi-Armed Bandits, Reinforcement Learning (RL)
Ensemble Active Learning by Contextual Bandits for AI Incubation in Manufacturing (Oct 10, 2023) · Active Learning, Decision Making
Episodic Multi-armed Bandits (Aug 4, 2015) · Multi-Armed Bandits, Reinforcement Learning
Epsilon-Best-Arm Identification in Pay-Per-Reward Multi-Armed Bandits (Dec 1, 2019) · Multi-Armed Bandits
Equipping Experts/Bandits with Long-term Memory (May 30, 2019) · Multi-Armed Bandits
Estimating Optimal Policy Value in General Linear Contextual Bandits (Feb 19, 2023) · Model Selection, Multi-Armed Bandits
Estimation Considerations in Contextual Bandits (Nov 19, 2017) · Causal Inference, Econometrics
From Predictions to Decisions: The Importance of Joint Predictive Distributions (Jul 20, 2021) · Multi-Armed Bandits, Thompson Sampling
Evolution of Information in Interactive Decision Making: A Case Study for Multi-Armed Bandits (Mar 1, 2025) · Decision Making, Multi-Armed Bandits
EVOLvE: Evaluating and Optimizing LLMs For Exploration (Oct 8, 2024) · Decision Making Under Uncertainty, Multi-Armed Bandits
Expanding on Repeated Consumer Search Using Multi-Armed Bandits and Secretaries (Dec 22, 2020) · Multi-Armed Bandits
Expected Improvement-based Contextual Bandits (Sep 29, 2021) · Bayesian Optimization, Multi-Armed Bandits
Explicit Best Arm Identification in Linear Bandits Using No-Regret Learners (Jun 13, 2020) · Multi-Armed Bandits
Exploration, Exploitation, and Engagement in Multi-Armed Bandits with Abandonment (May 26, 2022) · Multi-Armed Bandits, Q-Learning
Exploration Potential (Sep 16, 2016) · Multi-Armed Bandits, Reinforcement Learning
Exploration Through Bias: Revisiting Biased Maximum Likelihood Estimation in Stochastic Multi-Armed Bandits (Jan 1, 2020) · Multi-Armed Bandits
Exploration vs Exploitation vs Safety: Risk-averse Multi-Armed Bandits (Jan 6, 2014) · Energy Management, Management
Exploration with Limited Memory: Streaming Algorithms for Coin Tossing, Noisy Comparisons, and Multi-Armed Bandits (Apr 9, 2020) · Multi-Armed Bandits
Exponentiated Gradient LINUCB for Contextual Multi-Armed Bandits (May 10, 2013) · Multi-Armed Bandits
Exposure-Aware Recommendation using Contextual Bandits (Sep 4, 2022) · Multi-Armed Bandits, Recommendation Systems
Fair Algorithms for Infinite and Contextual Bandits (Oct 29, 2016) · Fairness, Multi-Armed Bandits
Fair Algorithms for Multi-Agent Multi-Armed Bandits (Jul 13, 2020) · Fairness, Multi-Armed Bandits
Bandit Learning with Delayed Impact of Actions (Feb 24, 2020) · Fairness, Multi-Armed Bandits
Fair Contextual Multi-Armed Bandits: Theory and Experiments (Dec 13, 2019) · Decision Making, Fairness
Fair Exploration via Axiomatic Bargaining (Jun 4, 2021) · Fairness, Multi-Armed Bandits
Fairness and Privacy Guarantees in Federated Contextual Bandits (Feb 5, 2024) · Fairness, Federated Learning
Fairness and Welfare Quantification for Regret in Multi-Armed Bandits (May 27, 2022) · Fairness, Multi-Armed Bandits
Fairness for Workers Who Pull the Arms: An Index Based Policy for Allocation of Restless Bandit Tasks (Mar 1, 2023) · Fairness, Multi-Armed Bandits
Fairness in Learning: Classic and Contextual Bandits (May 23, 2016) · Fairness, Multi-Armed Bandits
Fairness of Exposure in Stochastic Bandits (Mar 3, 2021) · Fairness, Multi-Armed Bandits
Falsification of Multiple Requirements for Cyber-Physical Systems Using Online Generative Adversarial Networks and Multi-Armed Bandits (May 23, 2022) · Multi-Armed Bandits
Fast and Sample Efficient Multi-Task Representation Learning in Stochastic Contextual Bandits (Oct 2, 2024) · Multi-Armed Bandits, Multi-Task Learning
Faster Maximum Inner Product Search in High Dimensions (Dec 14, 2022) · Multi-Armed Bandits, Recommendation Systems
Faster Q-Learning Algorithms for Restless Bandits (Sep 6, 2024) · Multi-Armed Bandits, Q-Learning
Fast UCB-type algorithms for stochastic bandits with heavy and super heavy symmetric noise (Feb 10, 2024) · Multi-Armed Bandits
Federated Combinatorial Multi-Agent Multi-Armed Bandits (May 9, 2024) · Combinatorial Optimization, Data Summarization
Federated Linear Bandits with Finite Adversarial Actions (Nov 2, 2023) · Multi-Armed Bandits
Federated Linear Contextual Bandits (Oct 27, 2021) · Multi-Armed Bandits
Federated Linear Contextual Bandits with Heterogeneous Clients (Feb 29, 2024) · Federated Learning
Federated Linear Contextual Bandits with User-level Differential Privacy (Jun 8, 2023) · Decision Making, Multi-Armed Bandits
Federated Multi-Armed Bandits Under Byzantine Attacks (May 9, 2022) · Data Poisoning, Decision Making
Federated Online Sparse Decision Making (Feb 27, 2022) · Decision Making, Multi-Armed Bandits
Federated Learning for Heterogeneous Bandits with Unobserved Contexts (Mar 29, 2023) · Federated Learning, Multi-Armed Bandits
FedMABA: Towards Fair Federated Learning through Multi-Armed Bandits Allocation (Oct 26, 2024) · Fairness, Federated Learning
Feel-Good Thompson Sampling for Contextual Bandits and Reinforcement Learning (Oct 2, 2021) · Multi-Armed Bandits, Regression
Feel-Good Thompson Sampling for Contextual Dueling Bandits (Apr 9, 2024) · Decision Making, Multi-Armed Bandits
Field Study in Deploying Restless Multi-Armed Bandits: Assisting Non-Profits in Improving Maternal and Child Health (Sep 16, 2021) · Multi-Armed Bandits