Generalized Newton Algorithms For Nonsmooth Systems With Applications To Lasso

Boris Mordukhovich (Wayne State University)

01-Feb-2021, 14:30-15:30

Abstract: We propose and develop several generalized Newton-type algorithms for solving nonsmooth optimization problems and subgradient systems, based on constructions and results of (mainly second-order) variational analysis and generalized differentiation. Solvability of these algorithms is proved in rather broad settings, and verifiable conditions for their local and global superlinear convergence are then obtained. Special attention is paid to problems of convex composite optimization, for which a generalized damped Newton algorithm exhibiting global superlinear convergence is designed. The efficiency of the latter algorithm is demonstrated by solving a class of Lasso problems, which are widely used in machine learning and statistics. For this class of nonsmooth optimization problems, we conduct numerical experiments and compare the obtained results with those achieved by other first-order and second-order methods.
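For readers unfamiliar with the problem class mentioned above, the following is a minimal sketch of a Lasso instance, min_x 0.5*||Ax - b||^2 + lam*||x||_1, solved by a standard first-order proximal-gradient method (ISTA) of the kind the abstract says the talk compares against. This is not the generalized damped Newton algorithm presented in the talk, and all data and names below are hypothetical illustrations.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_lasso(A, b, lam, num_iters=500):
    """Solve min_x 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient (ISTA).

    A standard first-order baseline for Lasso, NOT the generalized damped
    Newton algorithm discussed in the talk.
    """
    m, n = A.shape
    x = np.zeros(n)
    # Step size 1/L, where L = ||A||_2^2 is the Lipschitz constant of the
    # gradient of the smooth term 0.5*||Ax - b||^2.
    L = np.linalg.norm(A, 2) ** 2
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)                      # gradient of the smooth term
        x = soft_threshold(x - grad / L, lam / L)     # proximal (soft-thresholding) step
    return x

# Tiny synthetic example (hypothetical data).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 0.5]
b = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = ista_lasso(A, b, lam=0.1)
```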

This talk is based on recent joint works with P. D. Khanh (HCMUE), V. T. Phat (WSU), M. E. Sarabi (Miami Univ.), and D. B. Tran (WSU).

optimization and control

Audience: advanced learners

Comments: The address and password of the Zoom room of the seminar are sent by e-mail to the seminar's mailing list one day before each talk.


One World Optimization seminar

Series comments: Online seminar on optimization and related areas.


Organizers: Sorin-Mihai Grad*, Radu Ioan Boț, Shoham Sabach, Mathias Staudigl
*contact for this listing
