BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Betty Shea (UBC Vancouver)
DTSTART:20241114T220000Z
DTEND:20241114T230000Z
DTSTAMP:20260513T193639Z
UID:SFUOR/43
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/SFUOR/43/">W
 hy Line-Search When You Can Plane-Search?</a>\nby Betty Shea (UBC Vancouve
 r) as part of PIMS-CORDS SFU Operations Research Seminar\n\nLecture held i
 n ASB 10908.\n\nAbstract\nThe practical performance of an optimization met
 hod depends on details such as using good step sizes. Strategies for setti
 ng step sizes are generally limited to hyperparameter tuning (for a fixed 
 step size)\, step size schedules and line searches. For many common machin
 e learning problems\, line optimization and subspace optimization find acc
 urate step sizes for asymptotically the same cost as using a fixed step si
 ze. In some cases\, line optimization may find step sizes that are ruled o
 ut by the standard Armijo condition. For optimization methods that use mul
 tiple search directions\, such as gradient descent with momentum\, using s
 ubspace optimization instead of fixed step size schedules allows for better
  adaptivity and potentially faster convergence. In the case of some neural
  networks\, subspace optimization allows the use of different step sizes f
 or different layers that could decrease the amount of training time needed
 \, as well as reduce the dependence on hyperparameter tuning.\n
LOCATION:https://researchseminars.org/talk/SFUOR/43/
END:VEVENT
END:VCALENDAR
