Smoothness in Nonsmooth Optimization

Adrian Lewis (ORIE Cornell)

21-Sep-2020, 13:30-14:30

Abstract: Fast black-box nonsmooth optimization, while theoretically out of reach in the worst case, has long been an intriguing goal in practice. Generic concrete nonsmooth objectives are "partly" smooth: their subdifferentials have locally smooth graphs with powerful constant-rank properties, often associated with hidden structure in the objective. One typical example is the proximal mapping for the matrix numerical radius, whose output is surprisingly often a "disk" matrix. Motivated by this expectation of partial smoothness, this talk describes a Newtonian black-box algorithm for general nonsmooth optimization. Local convergence is provably superlinear on a representative class of objectives, and early numerical experience is promising more generally.
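For context (an illustration, not drawn from the abstract): the numerical radius of a square matrix A is r(A) = max over unit vectors x of |x* A x|, and it admits the standard characterization r(A) = max over theta of the largest eigenvalue of (e^{i theta} A + e^{-i theta} A*)/2. A minimal Python sketch, assuming only NumPy, estimates r(A) by discretizing theta:

```python
import numpy as np

def numerical_radius(A, n_grid=2000):
    """Estimate the numerical radius r(A) = max_{||x||=1} |x* A x| using the
    standard characterization
        r(A) = max_theta lambda_max((e^{i theta} A + e^{-i theta} A*) / 2),
    with theta sampled on a uniform grid."""
    thetas = np.linspace(0.0, 2 * np.pi, n_grid, endpoint=False)
    r = 0.0
    for t in thetas:
        # Hermitian part of e^{i theta} A; its top eigenvalue is real.
        H = (np.exp(1j * t) * A + np.exp(-1j * t) * A.conj().T) / 2
        r = max(r, np.linalg.eigvalsh(H)[-1])
    return r

# Sanity check: for a normal matrix the numerical radius equals the spectral radius.
A = np.diag([1.0, -2.0, 0.5 + 0.5j])
print(numerical_radius(A))              # approximately 2.0
print(max(abs(np.linalg.eigvals(A))))   # exactly 2.0
```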

Joint work with Dima Drusvyatskiy, XY Han, Alex Ioffe, Jingwei Liang, Michael Overton, Tonghua Tian, Calvin Wylie

optimization and control

Audience: researchers in the discipline

Comments: The Zoom address and password for the seminar room are sent by e-mail to the seminar's mailing list one day before each talk.


One World Optimization Seminar

Series comments: Online seminar on optimization and related areas.


Organizers: Sorin-Mihai Grad*, Radu Ioan Boț, Shoham Sabach, Mathias Staudigl
*contact for this listing
