Publications

Performance Estimation for Smooth and Strongly Convex Sets

Published in ArXiv Preprint, 2024

This work extends recent computer-assisted design and analysis techniques for first-order optimization over structured functions, known as performance estimation, to apply to structured sets. We prove “interpolation theorems” for smooth and strongly convex sets with Slater points and bounded diameter, showing that a wide range of extremal questions amounts to structured mathematical programs. Our theory provides finite-dimensional formulations of performance estimation problems for algorithms utilizing separating hyperplane oracles, linear optimization oracles, and/or projection oracles of smooth/strongly convex sets, and we demonstrate its applications.
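For context, the performance estimation framework being extended can be sketched in its classical, function-based form (an illustrative instance for gradient descent on \(L\)-smooth convex functions; the stepsizes \(h_k\) and initial-distance bound \(R\) are generic placeholders, not quantities from this paper):

\[
\begin{aligned}
\max_{f,\ x_0,\dots,x_N}\quad & f(x_N) - f(x_\star) \\
\text{s.t.}\quad & f \text{ is } L\text{-smooth and convex with minimizer } x_\star,\\
& x_{k+1} = x_k - h_k \nabla f(x_k), \quad k = 0,\dots,N-1,\\
& \|x_0 - x_\star\| \le R.
\end{aligned}
\]

Interpolation theorems replace the infinite-dimensional variable \(f\) with finitely many triples \((x_k, \nabla f(x_k), f(x_k))\) subject to interpolation constraints, turning the problem into a finite-dimensional program; this paper establishes the analogous reduction when the structured object is a smooth or strongly convex set queried through separating hyperplane, linear optimization, or projection oracles.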

Recommended citation: Alan Luner, Benjamin Grimmer. (2024). "Performance Estimation for Smooth and Strongly Convex Sets." ArXiv Preprint 2410.14811.
Download Paper

On Averaging and Extrapolation for Gradient Descent

Published in ArXiv Preprint, 2024

This work considers the effect of averaging and extrapolating the iterates of gradient descent in smooth convex optimization. We show that for several common stepsize sequences, averaging cannot improve gradient descent’s worst-case performance. In contrast, we prove a conceptually simple and computationally cheap extrapolation scheme strictly improves the worst-case convergence rate: when initialized at the origin, reporting \((1+1/\sqrt{16N\log(N)})x_N\) rather than \(x_N\) improves the best possible worst-case performance by the same amount as conducting \(O(\sqrt{N/\log(N)})\) more gradient steps.
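A minimal sketch of this scheme, assuming a gradient oracle for an \(L\)-smooth convex objective, a \(1/L\) stepsize, and the natural logarithm (the oracle, stepsize choice, and example objective below are illustrative assumptions, not specifics fixed by the abstract):

```python
import numpy as np

def extrapolated_gradient_descent(grad_f, L, N, dim):
    """Run N steps of gradient descent from the origin, then report the
    extrapolated point (1 + 1/sqrt(16*N*log(N))) * x_N.

    grad_f: gradient oracle of an L-smooth convex function (assumed given)
    L:      smoothness constant, used here for a 1/L stepsize (one common choice)
    N:      number of gradient steps (N >= 2 so that log(N) > 0)
    dim:    problem dimension
    """
    x = np.zeros(dim)  # initialized at the origin, as in the abstract
    for _ in range(N):
        x = x - (1.0 / L) * grad_f(x)  # plain gradient descent step
    gamma = 1.0 + 1.0 / np.sqrt(16 * N * np.log(N))  # extrapolation factor
    return gamma * x  # report the extrapolated iterate instead of x_N

# Hypothetical usage on f(x) = 0.5 * ||A x - b||^2, which is L-smooth
# with L equal to the largest eigenvalue of A^T A.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
L = np.linalg.eigvalsh(A.T @ A).max()
x_out = extrapolated_gradient_descent(lambda x: A.T @ (A @ x - b), L, N=100, dim=10)
```

The extrapolation costs a single scalar multiplication after the run, which is what makes the scheme computationally cheap relative to conducting \(O(\sqrt{N/\log(N)})\) additional gradient steps.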

Recommended citation: Alan Luner, Benjamin Grimmer. (2024). "On Averaging and Extrapolation for Gradient Descent." ArXiv Preprint 2402.12493.
Download Paper