Provable Representation Learning
Simon Du (University of Washington)
Abstract: Representation learning has been widely used in many applications. In this talk, I will present our work, which uncovers when and why representation learning provably improves sample efficiency, from a statistical learning point of view. I will show that 1) the existence of a good representation shared among all tasks and 2) the diversity of the tasks are the key conditions that permit improved statistical efficiency via multi-task representation learning. Under these conditions, sample efficiency provably improves for function classes with certain complexity measures used as the representation. If time permits, I will also talk about leveraging these theoretical insights to improve practical performance.
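As a rough sketch of the multi-task setting the abstract describes (the notation below is illustrative and not necessarily the speaker's), each of $T$ source tasks is assumed to share one representation, and the task-specific heads must be diverse:

```latex
\[
  y_{t,i} = w_t^{\top}\,\phi(x_{t,i}) + \varepsilon_{t,i},
  \qquad t = 1,\dots,T,\quad i = 1,\dots,n,
\]
% Condition 1 (shared representation): a single map
% \phi : \mathbb{R}^d \to \mathbb{R}^k, with k \ll d, is common to all T tasks.
%
% Condition 2 (task diversity): the heads w_1,\dots,w_T \in \mathbb{R}^k
% span \mathbb{R}^k, e.g. \sigma_{\min}\bigl([\,w_1 \cdots w_T\,]\bigr) > 0,
% so the source tasks jointly identify \phi.
%
% Payoff: having learned \phi from the source tasks, a new target task only
% needs to fit a k-dimensional head, rather than a full d-dimensional model.
```

This linear-head form is one common instantiation; the abstract's "certain complexity measures" refers to more general function classes for the representation.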
Topics: data structures and algorithms; machine learning; mathematical physics; information theory; optimization and control; data analysis, statistics and probability
Audience: researchers in the field
(video)
Mathematics, Physics and Machine Learning (IST, Lisbon)
Series comments: To receive the series announcements, please register at:
mpml.tecnico.ulisboa.pt
mpml.tecnico.ulisboa.pt/registration
Zoom link: videoconf-colibri.zoom.us/j/91599759679
Organizers: Mário Figueiredo, Tiago Domingos, Francisco Melo, José Mourão*, Cláudia Nunes, Yasser Omar, Pedro Alexandre Santos, João Seixas, Cláudia Soares, João Xavier
*contact for this listing