Global Convergence of Multiplicative Updates for the Matrix Mechanism: A Collaborative Proof with Gemini 3
Keith Rush
Abstract
We analyze a fixed-point iteration v ↦ ϕ(v) arising in the optimization of a regularized nuclear-norm objective with Hadamard product structure, posed in [denisov] in the context of an optimization problem over the space of algorithms in private machine learning. We prove that the iteration v^(k+1) = diag((D_{v^(k)}^{1/2} M D_{v^(k)}^{1/2})^{1/2}) converges monotonically to the unique global optimizer of the potential function J(v) = 2 Tr((D_v^{1/2} M D_v^{1/2})^{1/2}) - Σ_i v_i, closing a problem left open there. The bulk of the proof was provided by Gemini 3, subject to some corrections and interventions; Gemini 3 also sketched the initial version of this note. The note is therefore as much a commentary on the practical use of AI in mathematics as it is the closure of a small gap in the literature. Accordingly, we include a short narrative description of the prompting process and some resulting principles for working with AI to prove mathematics.
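As a concrete illustration (not drawn from the paper itself), the iteration v^(k+1) = diag((D_{v^(k)}^{1/2} M D_{v^(k)}^{1/2})^{1/2}) and the potential J can be sketched in NumPy. The test matrix M, the all-ones initialization, and the iteration count below are our own illustrative choices, assuming M is symmetric positive definite with positive diagonal:

```python
import numpy as np

def psd_sqrt(A):
    # Symmetric PSD square root via eigendecomposition.
    w, Q = np.linalg.eigh(A)
    return Q @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ Q.T

def phi(v, M):
    # One update: v <- diag((D_v^{1/2} M D_v^{1/2})^{1/2}),
    # where D_v is the diagonal matrix with entries v.
    s = np.sqrt(v)
    return np.diag(psd_sqrt(s[:, None] * M * s[None, :])).copy()

def J(v, M):
    # Potential: J(v) = 2 Tr((D_v^{1/2} M D_v^{1/2})^{1/2}) - sum_i v_i.
    s = np.sqrt(v)
    return 2.0 * np.trace(psd_sqrt(s[:, None] * M * s[None, :])) - v.sum()

# Illustrative symmetric positive-definite test matrix (our choice).
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
M = A @ A.T + 5.0 * np.eye(5)

v = np.ones(5)
vals = []
for _ in range(50):
    vals.append(J(v, M))
    v = phi(v, M)
```

Running this, J(v^(k)) is nondecreasing along the iterates and v approaches a fixed point of ϕ, consistent with the monotone-convergence statement above.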