
Global Optimization with A Power-Transformed Objective and Gaussian Smoothing

2024-12-06 · Code Available

Chen Xu


Abstract

We propose a novel method that solves global optimization problems in two steps: (1) perform an (exponential) power-N transformation of the not-necessarily differentiable objective function f to obtain f_N, and (2) optimize the Gaussian-smoothed f_N with stochastic approximations. Under mild conditions on f, for any δ > 0, we prove that with a sufficiently large power N_δ, this method converges to a solution in the δ-neighborhood of f's global optimum point. The convergence rate is O(d²σ⁴ε⁻²), which is faster than both the standard and single-loop homotopy methods if σ is pre-selected to be in (0,1). In most of the experiments performed, our method produces better solutions than other algorithms that also apply smoothing techniques.
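To make the two steps concrete, here is a minimal sketch of the idea on a 1-D toy problem. It assumes the exponential form of the power-N transform, f_N(x) = exp(N·f(x)), a two-point Monte-Carlo gradient estimator for the Gaussian-smoothed surrogate, and a sign-normalized step for stability; the test function, hyperparameters, and these estimator choices are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Illustrative multimodal objective (not from the paper): global
    # maximum at x = 2 (height ~1), local maximum at x = -2 (height ~0.5).
    return np.exp(-(x - 2.0) ** 2) + 0.5 * np.exp(-(x + 2.0) ** 2)

def f_N(x, N=5.0):
    # Step (1): exponential power-N transform. Amplifies the global peak
    # relative to the local one (exp(N * 1) vs exp(N * 0.5)).
    return np.exp(N * f(x))

# Step (2): stochastic approximation on the Gaussian-smoothed surrogate
# F(x) = E_u[f_N(x + sigma * u)], u ~ N(0, 1), using an antithetic
# two-point gradient estimator averaged over a batch of samples.
sigma, lr, batch = 2.0, 0.05, 200
x = -3.0  # start inside the basin of the *local* maximum

for t in range(400):
    u = rng.standard_normal(batch)
    g = np.mean((f_N(x + sigma * u) - f_N(x - sigma * u)) * u / (2 * sigma))
    x += lr * np.sign(g)  # sign-normalized step (a practical stabilization)

print(x)  # typically ends in the basin of the global maximum near x = 2
```

The power transform is what lets smoothing find the global peak: after exponentiation the global maximum dominates the smoothed surrogate so strongly that the stochastic gradient escapes the local basin at x = -2 rather than settling there.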
