BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Tamara G. Kolda (Sandia National Laboratories)
DTSTART:20200701T140000Z
DTEND:20200701T150000Z
DTSTAMP:20260423T041507Z
UID:E-NLA/11
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/E-NLA/11/">P
 ractical Leverage-Based Sampling for Low-Rank Tensor Decomposition</a>\nby
  Tamara G. Kolda (Sandia National Laboratories) as part of E-NLA - Online 
 seminar series on numerical linear algebra\n\n\nAbstract\nConventional alg
 orithms for finding low-rank canonical polyadic (CP) tensor decompositions
  are unwieldy for large sparse tensors. The CP decomposition can be comput
 ed by solving a sequence of overdetermined least squares problems with special Kha
 tri-Rao structure. In this work\, we present an application of randomized 
 algorithms to fitting the CP decomposition of sparse tensors\, solving a s
 ignificantly smaller sampled least squares problem at each iteration with 
 probabilistic guarantees on the approximation errors. Prior work has shown
  that sketching is effective in the dense case\, but the prior approach ca
 nnot be applied to the sparse case because a fast Johnson-Lindenstrauss tr
 ansform (e.g.\, using a fast Fourier transform) must be applied in each mo
 de\, causing the sparse tensor to become dense. Instead\, we perform sketc
 hing through leverage score sampling\, crucially relying on the fact that 
 the structure of the Khatri-Rao product allows sampling from overestimates
  of the leverage scores without forming the full product or the correspond
 ing probabilities. Naïve application of leverage score sampling is ineffe
 ctive because we often have cases where a few scores are quite large\, so 
 we propose a novel hybrid of deterministic and random leverage-score sampl
 ing which consistently yields improved fits. Numerical results on real-wor
 ld large-scale tensors show the method is significantly faster than compet
 ing methods without sacrificing accuracy. This is joint work with Brett L
 arsen\, Stanford University.\n
LOCATION:https://researchseminars.org/talk/E-NLA/11/
END:VEVENT
END:VCALENDAR
