Bolstering Stochastic Gradient Descent with Model Building

Özgür Martin (MSGSÜ)

03-Dec-2021, 14:00-15:00

Abstract: The stochastic gradient descent method and its variants constitute the core optimization algorithms that achieve good convergence rates for solving machine learning problems. These rates are obtained especially when the algorithms are fine-tuned for the application at hand. Although this tuning process can incur large computational costs, recent work has shown that these costs can be reduced by line search methods that iteratively adjust the step size. In this talk, we will introduce an alternative approach to stochastic line search: a new algorithm based on forward-step model building. The model-building step incorporates second-order information, which allows adjusting not only the step size but also the search direction.

This is a joint work with S. I. Birbil, G. Onay, and F. Öztoprak.
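As a rough illustration of the stochastic line-search idea the abstract builds on (not the authors' model-building algorithm itself), here is a minimal sketch of SGD with Armijo backtracking on each mini-batch; the synthetic least-squares problem and all parameter names are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative synthetic least-squares problem: recover w_true from noisy data.
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.01 * rng.normal(size=n)

def loss(w, Xb, yb):
    r = Xb @ w - yb
    return 0.5 * np.mean(r * r)

def grad(w, Xb, yb):
    r = Xb @ w - yb
    return Xb.T @ r / len(yb)

def sgd_armijo(w, epochs=30, batch=20, t0=1.0, beta=0.5, c=1e-4):
    """SGD where each step size is chosen by backtracking until the
    Armijo sufficient-decrease condition holds on the current mini-batch."""
    for _ in range(epochs):
        idx = rng.permutation(n)
        for s in range(0, n, batch):
            b = idx[s:s + batch]
            g = grad(w, X[b], y[b])
            f0 = loss(w, X[b], y[b])
            t = t0
            # Shrink the trial step until sufficient decrease is achieved.
            while loss(w - t * g, X[b], y[b]) > f0 - c * t * (g @ g):
                t *= beta
                if t < 1e-8:
                    break
            w = w - t * g
    return w

w_hat = sgd_armijo(np.zeros(d))
```

Note that this baseline only rescales the step along the negative gradient; the model-building approach of the talk goes further by also modifying the search direction.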

Topics: machine learning, Mathematics

Audience: general audience


Mimar Sinan University Mathematics Seminars

Curator: İpek Tuvay
