Superfast Second-Order Methods for Unconstrained Convex Optimization

Yurii Nesterov (University of Louvain)

29-Jun-2020, 13:00-14:00

Abstract: In this talk, we present new second-order methods with convergence rate $O(1/k^4)$, where $k$ is the iteration counter. This is faster than the existing lower bound for this type of scheme, which is $O(1/k^{7/2})$. Our progress can be explained by a finer specification of the problem class. The main idea of this approach consists in implementing an accelerated third-order scheme using a second-order oracle. At each iteration of our method, we solve a nontrivial auxiliary problem by a linearly convergent scheme based on the relative non-degeneracy condition. During this process, the Hessian of the objective function is computed once, and the gradient is computed $O(\ln \frac{1}{\epsilon})$ times, where $\epsilon$ is the desired accuracy of the solution of our problem.
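
Illustrative sketch (not the scheme from the talk or paper): the per-iteration oracle pattern described in the abstract is one Hessian evaluation per outer iteration, reused as a fixed metric, while a linearly convergent inner loop for the auxiliary problem only queries gradients. The Python sketch below replaces the accelerated third-order model and the relative-non-degeneracy-based inner solver with a plain preconditioned gradient loop on a toy objective; the oracles grad_f and hess_f, the regularizer mu, and all iteration counts are assumptions made purely for illustration.

import numpy as np

# Toy objective f(x) = sum(x_i^4) / 4 (assumed here for illustration only).
def grad_f(x):
    # hypothetical gradient oracle
    return x ** 3

def hess_f(x):
    # hypothetical Hessian oracle
    return np.diag(3.0 * x ** 2)

def second_order_sketch(x0, outer_iters=10, inner_iters=30, mu=1e-3):
    # Mirrors only the oracle-call pattern from the abstract:
    # one Hessian per outer iteration, many gradient calls in the inner loop.
    x = np.asarray(x0, dtype=float).copy()
    n = x.size
    for _ in range(outer_iters):
        H = hess_f(x)                  # Hessian computed once per iteration
        M = H + mu * np.eye(n)         # fixed, regularized metric for the inner scheme
        for _ in range(inner_iters):   # stand-in for the O(ln(1/eps)) gradient calls
            x = x - np.linalg.solve(M, grad_f(x))
    return x

print(second_order_sketch(np.ones(5)))  # approaches the minimizer at the origin

On this toy quartic objective the iterates approach the origin; only the call pattern (one Hessian, many gradients per outer iteration) is meant to mirror the abstract, not the method's actual model, acceleration, or convergence rate.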

optimization and control

Audience: general audience

(paper)

Comments: The address and password of the Zoom room of the seminar are sent by e-mail to the mailing list of the seminar one day before each talk


One World Optimization seminar

Series comments: Online seminar on optimization and related areas

The address and password of the Zoom room of the seminar are sent by e-mail to the mailing list of the seminar one day before each talk

Organizers: Sorin-Mihai Grad*, Radu Ioan Boț, Shoham Sabach, Mathias Staudigl
*contact for this listing
