# Optimized first-order methods for smooth convex minimization

@article{Kim2016OptimizedFM, title={Optimized first-order methods for smooth convex minimization}, author={Donghwan Kim and Jeffrey A. Fessler}, journal={Mathematical Programming}, year={2016}, volume={159}, pages={81-107} }

We introduce new optimized first-order methods for smooth unconstrained convex minimization. Drori and Teboulle (Math Program 145(1–2):451–482, 2014. doi:10.1007/s10107-013-0653-0) recently described a numerical method for computing the N-iteration optimal step coefficients in a class of first-order algorithms that includes gradient methods, heavy-ball methods (Polyak in USSR Comput Math Math Phys 4(5):1–17, 1964. doi:10.1016/0041-5553(64)90137-5), and Nesterov’s fast gradient methods (Nesterov…
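The optimized gradient method (OGM) of this paper combines a plain gradient step with a Nesterov-style momentum term plus an extra correction term, using a step-coefficient sequence with a special larger final step. Below is a minimal Python sketch of that iteration, assuming a gradient oracle and known Lipschitz constant; the quadratic test problem (matrix `A`, vector `b`, iteration count) is purely illustrative, not from the paper:

```python
import numpy as np

def ogm(grad, x0, L, N):
    """Sketch of the optimized gradient method (OGM) for a smooth convex f
    with L-Lipschitz gradient; grad is a gradient oracle, N the iteration count."""
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    theta = 1.0
    for i in range(N):
        y_next = x - grad(x) / L                      # plain gradient step
        if i < N - 1:
            theta_next = (1 + np.sqrt(1 + 4 * theta**2)) / 2
        else:
            # larger momentum coefficient on the final iteration
            theta_next = (1 + np.sqrt(1 + 8 * theta**2)) / 2
        # Nesterov-style momentum plus the extra OGM correction term
        x = (y_next
             + (theta - 1) / theta_next * (y_next - y)
             + theta / theta_next * (y_next - x))
        y, theta = y_next, theta_next
    return x

# Illustrative problem: f(x) = 0.5 * ||A x - b||^2, so L = ||A||_2^2.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
L = np.linalg.norm(A, 2) ** 2
x_star = np.linalg.solve(A, b)          # exact minimizer (A is invertible)
x_hat = ogm(lambda x: A.T @ (A @ x - b), np.zeros(2), L, 300)
```

Replacing the two momentum lines with only the Nesterov term recovers the fast gradient method; the extra correction is what yields OGM's roughly twofold-better worst-case bound.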

#### 114 Citations

Efficient first-order methods for convex minimization: a constructive approach

- Mathematics, Computer Science
- Math. Program.
- 2020

We describe a novel constructive technique for devising efficient first-order methods for a wide range of large-scale convex minimization settings, including smooth, non-smooth, and strongly convex…

An optimal gradient method for smooth (possibly strongly) convex minimization

- Computer Science
- ArXiv
- 2021

We present an optimal gradient method for smooth (possibly strongly) convex optimization. The method is optimal in the sense that its worst-case bound exactly matches the lower bound on the oracle…

Smooth strongly convex interpolation and exact worst-case performance of first-order methods

- Mathematics, Computer Science
- Math. Program.
- 2017

We show that the exact worst-case performance of fixed-step first-order methods for unconstrained optimization of smooth (possibly strongly) convex functions can be obtained by solving convex…

An optimized first-order method for image restoration

- Mathematics, Computer Science
- 2015 IEEE International Conference on Image Processing (ICIP)
- 2015

The paper discusses the convergence analysis of OGM, explores its fast convergence on an image restoration problem using a smoothed total variation (TV) regularizer, and investigates its extension to nonsmooth convex minimization for image restoration with l1-sparsity regularization.

Analysis of Optimization Algorithms via Sum-of-Squares

- Mathematics, Computer Science
- J. Optim. Theory Appl.
- 2021

A new framework for unifying and systematizing the performance analysis of first-order black-box optimization algorithms for unconstrained convex minimization is introduced. Based on sum-of-squares optimization, it yields a hierarchy of semidefinite programs (SDPs) that give increasingly better convergence bounds at higher levels of the hierarchy.

Part 2. Gradient and Subgradient Methods for Unconstrained Convex Optimization

- 2018

This note studies (sub)gradient methods for unconstrained convex optimization. Many parts of this note are based on the chapters [1, Chapter 4] [2, Chapter 3,5,8,10] [5, Chapter 9] [14, Chapter 2,3]…

Convex interpolation and performance estimation of first-order methods for convex optimization

- Computer Science, Mathematics
- 2017

This thesis formulates a generic optimization problem that seeks the worst-case scenarios of first-order methods in convex optimization, and transforms PEPs into solvable finite-dimensional semidefinite programs, from which one obtains worst-case guarantees and worst-case functions, along with the corresponding explicit proofs.

Adaptive Restart of the Optimized Gradient Method for Convex Optimization

- Mathematics, Computer Science
- J. Optim. Theory Appl.
- 2018

A new first-order method is derived that resembles the optimized gradient method for strongly convex quadratic problems with known function parameters, yielding a linear convergence rate that is faster than that of the analogous version of the fast gradient method.

A Unifying Framework of Accelerated First-Order Approach to Strongly Monotone Variational Inequalities

- Mathematics
- 2021

In this paper, we propose a unifying framework incorporating several momentum-related search directions for solving strongly monotone variational inequalities. The specific combinations of the search…

On the Convergence Analysis of the Optimized Gradient Method

- Mathematics, Medicine
- J. Optim. Theory Appl.
- 2017

This paper provides an analytic convergence bound for the primary sequence generated by the optimized gradient method, including the interesting fact that the method has two types of worst-case functions: a piecewise affine-quadratic function and a quadratic function.

#### References

Showing 1–10 of 30 references

Performance of first-order methods for smooth convex minimization: a novel approach

- Computer Science, Mathematics
- Math. Program.
- 2014

A novel approach for analyzing the worst-case performance of first-order black-box optimization methods is presented; it focuses on smooth unconstrained convex minimization over the Euclidean space and derives a new and tight analytical bound on performance.

Smooth strongly convex interpolation and exact worst-case performance of first-order methods

- Mathematics, Computer Science
- Math. Program.
- 2017

We show that the exact worst-case performance of fixed-step first-order methods for unconstrained optimization of smooth (possibly strongly) convex functions can be obtained by solving convex…

An optimized first-order method for image restoration

- Mathematics, Computer Science
- 2015 IEEE International Conference on Image Processing (ICIP)
- 2015

The paper discusses the convergence analysis of OGM, explores its fast convergence on an image restoration problem using a smoothed total variation (TV) regularizer, and investigates its extension to nonsmooth convex minimization for image restoration with l1-sparsity regularization.

Gradient methods for minimizing composite objective function

- Mathematics
- 2007

In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two convex terms: one is smooth and given by a black-box oracle, and…

Optimized first-order methods for smooth convex minimization

- Mathematics
- 2016

We introduce new optimized first-order methods for smooth unconstrained convex minimization. Drori and Teboulle (Math Program 145(1–2):451–482, 2014. doi:10.1007/s10107-013-0653-0) recently des...

Gradient methods for minimizing composite functions

- Mathematics, Computer Science
- Math. Program.
- 2013

In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two terms: one is smooth and given by a black-box oracle, and another is…

Some methods of speeding up the convergence of iteration methods

- Mathematics
- 1964

For the solution of the functional equation P(x) = 0, where P is an operator, usually linear, from B into B, and B is a Banach space, iteration methods are generally used. These consist…

A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems

- Mathematics, Computer Science
- SIAM J. Imaging Sci.
- 2009

A new fast iterative shrinkage-thresholding algorithm (FISTA) preserves the computational simplicity of ISTA but achieves a global rate of convergence that is proven to be significantly better, both theoretically and practically.
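FISTA pairs ISTA's proximal gradient step (soft-thresholding, for an l1 penalty) with a Nesterov-type momentum sequence. A minimal Python sketch for l1-regularized least squares follows; the matrix, data, and regularization weight below are illustrative, not from the cited paper:

```python
import numpy as np

def soft_threshold(v, tau):
    # proximal operator of tau * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fista(A, b, lam, N):
    """Sketch of FISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(N):
        # proximal gradient step at the extrapolated point y
        x_next = soft_threshold(y - A.T @ (A @ y - b) / L, lam / L)
        t_next = (1 + np.sqrt(1 + 4 * t**2)) / 2
        y = x_next + (t - 1) / t_next * (x_next - x)   # momentum extrapolation
        x, t = x_next, t_next
    return x

# Illustrative sparse regression instance.
A = np.array([[1.0, 0.2], [0.2, 1.0], [0.1, 0.3]])
b = np.array([1.0, -1.0, 0.5])
lam = 0.1
x_hat = fista(A, b, lam, 300)
F_hat = 0.5 * np.linalg.norm(A @ x_hat - b) ** 2 + lam * np.abs(x_hat).sum()
```

Dropping the momentum line (setting `y = x_next`) recovers plain ISTA with its O(1/k) rate, versus FISTA's O(1/k^2).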

Introductory Lectures on Convex Optimization - A Basic Course

- Computer Science
- Applied Optimization
- 2004

It was in the middle of the 1980s, when the seminal paper by Karmarkar opened a new epoch in nonlinear optimization, and it became more and more common that new methods were provided with a complexity analysis, which was considered a better justification of their efficiency than computational experiments.

Lectures on modern convex optimization - analysis, algorithms, and engineering applications

- Computer Science, Mathematics
- MPS-SIAM series on optimization
- 2001

The authors present the basic theory of state-of-the-art polynomial time interior point methods for linear, conic quadratic, and semidefinite programming as well as their numerous applications in engineering.