Second-Order Methods for Nonconvex Optimization with Complexity Guarantees

Stephen Wright (University of Wisconsin)

28-Sep-2020, 13:30-14:30

Abstract: Widely used algorithms for smooth nonconvex optimization problems - unconstrained, bound-constrained, and general equality-constrained - can be modified slightly to ensure that approximate first- and second-order optimal points are found, with complexity guarantees that depend on the desired accuracy. We discuss methods constructed from Newton's method, conjugate gradients, randomized Lanczos, trust-region frameworks, log-barrier methods, and augmented Lagrangians. We derive upper bounds on various measures of complexity in terms of the tolerances required. Our methods use Hessian information only in the form of Hessian-vector products - an operation that does not require the Hessian itself to be evaluated or stored explicitly.
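The matrix-free Hessian access the abstract mentions can be illustrated with automatic differentiation: applying a forward-mode derivative to the gradient yields H(x) v without ever forming the n x n Hessian. The sketch below (not from the talk; the objective f is a hypothetical stand-in) shows this in JAX.

```python
import jax
import jax.numpy as jnp

def f(x):
    # Hypothetical smooth nonconvex objective, for illustration only.
    return jnp.sum(x**4) - jnp.sum(x**2) + x[0] * x[1]

def hvp(f, x, v):
    # Forward-mode derivative of the gradient in direction v gives H(x) @ v,
    # at roughly the cost of two gradient evaluations, with no explicit Hessian.
    return jax.jvp(jax.grad(f), (x,), (v,))[1]

x = jnp.array([1.0, -0.5, 2.0])
v = jnp.array([0.2, 1.0, 0.0])
print(hvp(f, x, v))  # matches jax.hessian(f)(x) @ v without storing the matrix
```

Such products are exactly the oracle that conjugate gradients and randomized Lanczos need, which is why the methods in the talk can rely on them alone.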

optimization and control

Audience: researchers in the discipline

Comments: The Zoom room address and password are sent by e-mail to the seminar's mailing list one day before each talk.


One World Optimization seminar

Series comments: Online seminar on optimization and related areas.

Organizers: Sorin-Mihai Grad*, Radu Ioan Boț, Shoham Sabach, Mathias Staudigl
*contact for this listing
