BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Yuh-Jye Lee (Academia Sinica)
DTSTART:20231025T060000Z
DTEND:20231025T070000Z
DTSTAMP:20260423T035816Z
UID:MAC8028/5
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/MAC8028/5/">
 Federated Learning for Sparse Principal Component Analysis</a>\nby Yuh-Jye
  Lee (Academia Sinica) as part of Trends in Mathematical Research\n\nLectu
 re held in NTNU Gongguan S101.\n\nAbstract\nIn the rapidly evolving realm
  of machine learning\, algorithm effectiveness often faces limitations due
  to data quality and availability. Traditional approaches grapple with
  data sharing due to legal and privacy concerns. The federated learning
  framework addresses this challenge. Federated learning is a decentralized
  approach where model training occurs on the client side\, preserving
  privacy by keeping data localized. Instead of sending raw data to a
  central server\, only model updates are exchanged\, enhancing data
  security. We apply this framework to Sparse Principal Component Analysis
  (SPCA) in this work. SPCA aims to attain sparse component loadings while
  maximizing data variance for improved interpretability. We introduce a
  least squares approximation to the original PCA\, adding an $\\ell_1$
  norm regularization term to enhance principal component sparsity\, aiding
  variable identification and interpretability. The problem is formulated
  as a consensus optimization problem and solved using the Alternating
  Direction Method of Multipliers (ADMM). Our extensive experiments involve
  both IID and non-IID random features across various data owners. Results
  on synthetic and public datasets affirm the efficacy of our federated
  SPCA approach.\n
LOCATION:https://researchseminars.org/talk/MAC8028/5/
END:VEVENT
END:VCALENDAR
