SMRI Seminar Series: Machine learning for the working mathematician
Topics: machine learning, Mathematics
Institution: Sydney Mathematical Research Institute
Audience: Researchers in the topic
Seminar series time: Thursdays 04:00-06:00 (times shown in UTC)
Curator: SMRIAdmin (contact for this listing)
Thursdays 3-5pm (AEDT)
sites.google.com/view/mlwm-seminar-2022
Joel Gibson, Georg Gottwald, and Geordie Williamson are organising an SMRI seminar series called "Machine learning for the working mathematician", which looks at how machine learning can be (and has been) used to solve problems in mathematics. All lectures will be hybrid and recorded (the above website will contain details soon!).
We are aiming to spend the first half of the seminar introducing the audience to modern deep learning techniques, and the second half on talks by experts about applications to both pure and applied mathematics problems. (We are not trying to do research on machine learning itself, but rather to learn about it as a tool to leverage in our own research.) The seminar will run once a week, on Thursdays at 3pm-5pm (details on the website), starting in Week 1. We also hope to host tutorials every second week, so that participants can get their hands dirty with actual models.
Two nice examples of recent work that give the 'flavour' of the seminar are:
* Advancing mathematics by guiding human intuition with AI (https://www.nature.com/articles/s41586-021-04086-x), a collaboration with Google DeepMind which led to a surprising new conjecture in representation theory; and
* Constructions in combinatorics via neural networks (https://arxiv.org/abs/2104.14516), where the author uses ML strategies to come up with many counterexamples to conjectures in graph theory and other combinatorial problems (a schematic sketch of this style of search loop is given below).
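To give a flavour of the second example, here is a deliberately simplified sketch of the "learn a distribution, sample, keep the best, repeat" loop that such searches are built around. It uses a plain cross-entropy-style update over random graphs and a toy score (reward edges, penalise triangles); it is an illustration only, not the reinforcement-learning setup from the paper.

```python
# Toy cross-entropy-style search over graphs: learn a product-of-Bernoullis
# distribution over adjacency matrices, sample graphs, keep the highest-scoring
# ones, and move the distribution towards them.  The score below is a toy
# objective, NOT one of the conjectures studied in the paper.
import numpy as np

rng = np.random.default_rng(0)

N_VERTICES = 10
N_SLOTS = N_VERTICES * (N_VERTICES - 1) // 2   # potential edges (upper triangle)
iu = np.triu_indices(N_VERTICES, k=1)

def score(bits):
    """Toy objective: reward edges, penalise triangles."""
    adj = np.zeros((N_VERTICES, N_VERTICES), dtype=int)
    adj[iu] = bits
    adj += adj.T
    n_edges = int(bits.sum())
    n_triangles = int(np.trace(adj @ adj @ adj)) // 6
    return n_edges - 3 * n_triangles

p = np.full(N_SLOTS, 0.5)                      # Bernoulli parameter per potential edge
N_SAMPLES, ELITE_FRAC, LR = 200, 0.1, 0.3
best_bits, best_score = None, -np.inf

for step in range(100):
    samples = (rng.random((N_SAMPLES, N_SLOTS)) < p).astype(int)
    scores = np.array([score(s) for s in samples])
    if scores.max() > best_score:
        best_score, best_bits = scores.max(), samples[scores.argmax()]
    elite = samples[np.argsort(scores)[-int(ELITE_FRAC * N_SAMPLES):]]
    p = (1 - LR) * p + LR * elite.mean(axis=0)  # move the distribution towards the elite

print("best score found:", best_score)
```

In the paper this role is played by a neural network trained against scores derived from actual conjectures; the toy loop above only shows the overall shape of the search.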
Seminar Schedule

The seminar takes place during the first half of 2022, at the University of Sydney. The first few weeks will run on Thursdays at 3pm-5pm in Carslaw 273 (see a map), starting from the first week of the semester. There is no need to sign up, everyone is welcome! Come along in person or log on to Zoom: the password is the first 8 letters of the word BackPropagation (note the capitalisation). Each lecture will be recorded and made available online.
The workshop sessions run on Fridays, 3pm-4pm, in Carslaw 273 (the same room as the seminar), and give attendees some hands-on experience with applying machine learning to mathematical problems. They use Google Colab notebooks and are in-person only, but the notebooks will be made available online.
NEW! We have a discussion board, so we can keep chatting after the lectures and workshops. Sign up - the 8-letter invite code is the same as the Zoom password.
Week 1 Seminar: Thursday 24th February, Carslaw 273, 3pm - 5pm (world clock)
Geordie Williamson, Basics of Machine Learning: classic problems in machine learning, kernel methods, deep neural networks, supervised learning, and basic examples (Lecture notes, lecture recording).
Week 1 Workshop: Friday 25th February, Carslaw 273, 3pm - 4pm. Introduction to PyTorch (Notebook).
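For a taste of what this first workshop is about, here is a minimal PyTorch training loop (an illustrative sketch only, not the workshop notebook): a small fully connected network fitted to y = sin(x) by gradient descent.

```python
# Minimal PyTorch supervised-learning example (illustrative sketch, not the
# workshop notebook): fit a small fully connected network to y = sin(x).
import math
import torch
from torch import nn

torch.manual_seed(0)

# Toy dataset: 1-dimensional inputs and targets.
x = torch.linspace(-math.pi, math.pi, 256).unsqueeze(1)
y = torch.sin(x)

model = nn.Sequential(
    nn.Linear(1, 32), nn.ReLU(),
    nn.Linear(32, 32), nn.ReLU(),
    nn.Linear(32, 1),
)
optimiser = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

for epoch in range(500):
    optimiser.zero_grad()
    loss = loss_fn(model(x), y)   # forward pass
    loss.backward()               # backpropagation
    optimiser.step()              # gradient descent step
    if epoch % 100 == 0:
        print(f"epoch {epoch:4d}  loss {loss.item():.5f}")
```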
Week 2: Thursday 3rd March, Carslaw 273, 3pm - 5pm (world clock)
Joel Gibson, What can and can't neural networks do: Universal approximation theorem and convolutional neural networks (Lecture notes, lecture recording).
Lecture links: TensorFlow Playground, approximation by bumps/ridges, convolution filters.
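The "approximation by bumps/ridges" idea behind the universal approximation theorem can be spelled out in a few lines: a one-hidden-layer sigmoid network whose hand-chosen weights pair the units into narrow bumps, each weighted by a sample of the target function. The sketch below is illustrative only and is not code from the lecture.

```python
# Hand-built one-hidden-layer sigmoid network: pairs of steep sigmoids form
# narrow bumps, weighted by samples of the target function.  Illustrative
# sketch of the universal approximation idea, not code from the lecture.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

target = np.sin                              # function to approximate on [0, 2*pi]
a, b, n_bumps, sharpness = 0.0, 2 * np.pi, 40, 200.0

edges = np.linspace(a, b, n_bumps + 1)       # bump boundaries
centres = 0.5 * (edges[:-1] + edges[1:])     # bump centres

def network(x):
    """Sum of bumps: each bump is a difference of two steep sigmoids."""
    x = np.asarray(x)[..., None]
    bumps = sigmoid(sharpness * (x - edges[:-1])) - sigmoid(sharpness * (x - edges[1:]))
    return bumps @ target(centres)           # weight each bump by f(centre)

xs = np.linspace(a, b, 1000)
print("max error:", np.max(np.abs(network(xs) - target(xs))))
```

Increasing the number of bumps (and their sharpness) drives the error down, which is exactly the qualitative content of the universal approximation theorem.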
Week 2 Workshop: Friday 4th March, Carslaw 273, 3pm - 4pm. Predicting the Möbius function (Notebook).
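As a flavour of the kind of setup involved (a sketch only, not the actual workshop notebook), one can sieve the Möbius function and pose "predict μ(n) from the binary digits of n" as a three-class classification problem, which a network like the one in the Week 1 sketch could then be trained on.

```python
# Sketch of a Mobius-function dataset (not the actual workshop notebook):
# sieve mu(n) and build (binary digits of n) -> class of mu(n) pairs.
import numpy as np

def mobius_sieve(limit):
    """Linear sieve computing mu(n) for 1 <= n <= limit."""
    mu = [0, 1] + [1] * (limit - 1)
    is_composite = [False] * (limit + 1)
    primes = []
    for i in range(2, limit + 1):
        if not is_composite[i]:
            primes.append(i)
            mu[i] = -1
        for p in primes:
            if i * p > limit:
                break
            is_composite[i * p] = True
            if i % p == 0:
                mu[i * p] = 0       # p^2 divides i*p, so mu vanishes
                break
            mu[i * p] = -mu[i]      # one extra distinct prime factor
    return mu

LIMIT, N_BITS = 2**16 - 1, 16
mu = mobius_sieve(LIMIT)

ns = np.arange(1, LIMIT + 1)
features = (ns[:, None] >> np.arange(N_BITS)) & 1   # binary digits of n
labels = np.array(mu[1:]) + 1                       # mu in {-1, 0, 1} -> classes {0, 1, 2}
print(features.shape, labels.shape, np.bincount(labels))
```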
Week 3: Thursday 10th March, Carslaw 273, 3pm - 5pm (world clock)
Georg Gottwald, How to think about machine learning: borrowing from statistical mechanics, dynamical systems and numerical analysis to better understand deep learning. (Lecture notes, Lecture recording).
Week 3 Workshop: Friday 11th March, Carslaw 273, 3pm - 4pm. Continuation of Week 3 (Notebook).
Week 4: Thursday 17th March, Carslaw 273, 3pm - 5pm (world clock)
Joel Gibson and Georg Gottwald, Regularisation and Recurrent Neural Nets. (Lecture notes 1, Lecture notes 2, Lecture recording 1, Lecture recording 2).
Week 4 Workshop: Friday 18th March, Carslaw 273, 3pm - 4pm. (Notebook).
Week 5: Thursday 24th March, Carslaw 273, 3pm - 5pm (world clock)
Geordie Williamson, Geometric Deep Learning (Lecture notes, Lecture recording)
Week 5 Workshop: Friday 25th March, 3pm - 4pm, online. (Use the usual zoom link). (Notebook)
Week 6: Thursday 31st March, Carslaw 273, 3pm - 5pm (world clock)
Georg Gottwald, Geometric Deep Learning II, and Geordie Williamson, Saliency and Combinatorial Invariance (Lecture notes 1, Lecture notes 2, Lecture recording 1, Lecture recording 2)
Week 6 Workshop: Friday 1st April, 3pm - 4pm, Online! (Use the usual zoom link) (Notebook).
Week 7: Thursday 7th April, Online, 3pm - 5pm (world clock)
Adam Zsolt Wagner: A simple RL setup to find counterexamples to conjectures in mathematics
Related paper: Constructions in combinatorics via neural networks.
Week 8: Thursday 14th April, Online, 3pm - 5pm (world clock)
Bamdad Hosseini
(Thursday 21st April: Midsemester break, no seminar!)
Week 9: Thursday 28th April, Online, 3pm - 5pm (world clock)
Carlos Simpson, Learning proofs for the classification of nilpotent semigroups
Week 10: Thursday 5th May
Alex Davies, A technical history of AlphaZero
Week 11: Thursday 12th May (Note: 9am Sydney time, online) (world clock)
Daniel Halpern-Leistner, Learning selection strategies in Buchberger's algorithm
Week 12: Thursday 19th May
Lars Buesing, Searching for Formulas and Algorithms: Symbolic Regression and Program Induction
Week 13: Thursday 26th May, Online, 3pm - 5pm (world clock)
Qianxiao Li, Deep learning for sequence modelling.
Abstract: In this talk, we introduce some deep learning based approaches for modelling sequence to sequence relationships that are gaining popularity in many applied fields, such as time-series analysis, natural language processing, and data-driven science and engineering. We will also discuss some interesting mathematical issues underlying these methodologies, including approximation theory and optimization dynamics.
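As a concrete, deliberately tiny example of sequence-to-sequence modelling in this spirit (illustrative only, not material from the talk), the sketch below trains a small GRU in PyTorch to map an input sequence to its running mean.

```python
# Tiny sequence-to-sequence example (illustrative only, not from the talk):
# train a small GRU to map an input sequence to its running mean.
import torch
from torch import nn

torch.manual_seed(0)

def make_batch(batch_size=64, seq_len=20):
    x = torch.randn(batch_size, seq_len, 1)
    running_mean = x.cumsum(dim=1) / torch.arange(1, seq_len + 1).view(1, -1, 1)
    return x, running_mean

class SeqModel(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.rnn = nn.GRU(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        h, _ = self.rnn(x)           # hidden state at every time step
        return self.head(h)          # one output per time step

model = SeqModel()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(2000):
    x, y = make_batch()
    loss = nn.functional.mse_loss(model(x), y)
    optimiser.zero_grad()
    loss.backward()
    optimiser.step()
    if step % 500 == 0:
        print(f"step {step:4d}  loss {loss.item():.4f}")
```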
| Time (UTC) | Speaker | Title |
| --- | --- | --- |
| Thu, May 26, 05:00 | Qianxiao Li | Deep learning for sequence modelling |
| Thu, May 19, 06:00 | Lars Buesing | Searching for Formulas and Algorithms: Symbolic Regression and Program Induction |
| Wed, May 11, 23:00 | Daniel Halpern-Leistner | Learning selection strategies in Buchberger's algorithm |
| Thu, May 05, 06:00 | Alex Davies | A technical history of AlphaZero |
| Thu, Apr 07, 05:00 | Adam Zsolt Wagner | A simple RL setup to find counterexamples to conjectures in mathematics |
| Thu, Feb 24, 04:00 | Geordie Williamson | Basics of Machine Learning |