
First-order methods for stochastic and finite-sum convex optimization with deterministic constraints

2025-06-25

Zhaosong Lu, Yifeng Xiao


Abstract

In this paper, we study a class of stochastic and finite-sum convex optimization problems with deterministic constraints. Existing methods typically aim to find an ε-expectedly feasible stochastic optimal solution, in which the expected constraint violation and expected optimality gap are both within a prescribed tolerance ε. However, in many practical applications, constraints must be nearly satisfied with certainty, rendering such solutions potentially unsuitable due to the risk of substantial violations. To address this issue, we propose stochastic first-order methods for finding an ε-surely feasible stochastic optimal (ε-SFSO) solution, where the constraint violation is deterministically bounded by ε and the expected optimality gap is at most ε. Our methods apply an accelerated stochastic gradient (ASG) scheme or a modified variance-reduced ASG scheme only once to a sequence of quadratic penalty subproblems with appropriately chosen penalty parameters. We establish first-order oracle complexity bounds for the proposed methods in computing an ε-SFSO solution. As a byproduct, we also derive first-order oracle complexity results for the sample average approximation (SAA) method in computing an ε-SFSO solution of the stochastic optimization problem, using our proposed methods to solve the sample average problem.
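As a rough illustration (not taken from the paper itself), a quadratic penalty subproblem for a stochastic objective E_ξ[f(x, ξ)] with deterministic constraints of the assumed form g(x) ≤ 0 and h(x) = 0 can be written as follows, where ρ > 0 is the penalty parameter and [·]_+ is the componentwise positive part; the paper's exact subproblem and its schedule of penalty parameters may differ.

\min_{x \in X} \; \Phi_\rho(x) \;:=\; \mathbb{E}_{\xi}\!\big[f(x,\xi)\big] \;+\; \frac{\rho}{2}\Big( \big\|[g(x)]_+\big\|^2 + \|h(x)\|^2 \Big)

In this kind of sketch, increasing ρ along a sequence of such subproblems is the standard way a quadratic penalty scheme tightens feasibility, which is the regime that the deterministic constraint-violation bound in the ε-SFSO guarantee targets.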
