NSCS Seminar - Nonlinear, Statistical, Complexity, Soft Matter

Usual Time: Tuesday
Place: Zoom (links in the lecture details below)

We are happy to announce the beginning of the NSCS (Nonlinear/Statistical/Complex/Soft-Matter) webinar series.

To receive regular information about the webinars, please register at https://softmatterisrael.wixsite.com/nscs, where you can also find details about future talks.

Upcoming Lectures
No lecture is scheduled yet.
Previous Lectures
- Extreme Value Statistics: An overview and perspectives
  Speaker: Satya Majumdar
  Zoom link: https://huji.zoom.us/j/88092252867?pwd=aWpRa3RPVGorYjBWa2MwMllsaFJIdz09…

Abstract: Extreme value statistics (EVS) concerns the study of the statistics of the maximum or the minimum of a set of random variables. This is an important problem for any time series and has applications ranging from climate, finance, and sports all the way to the physics of disordered systems, where one is interested in the statistics of the ground-state energy. While the EVS of 'uncorrelated' variables is well understood, very few results are known for strongly correlated random variables. Only recently has this subject gained importance both in statistical physics and in probability theory. In this talk, I will give an overview and perspectives on this interdisciplinary and rapidly evolving area of research.
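
For readers unfamiliar with the classical setting mentioned in the abstract, the short Python sketch below (not part of the talk; the sample sizes and the choice of exponential variables are illustrative assumptions) draws maxima of n i.i.d. exponential random variables and checks that, after centering by log n, their distribution approaches the Gumbel law exp(-exp(-x)), the textbook result for uncorrelated variables.

```python
import numpy as np

rng = np.random.default_rng(0)

n_samples = 10_000  # number of independent maxima to collect (illustrative choice)
n = 1_000           # size of each i.i.d. sample (illustrative choice)

# Maxima of n i.i.d. Exp(1) variables, centered by log(n); the classical result for
# uncorrelated variables is M_n - log(n) -> Gumbel, with CDF exp(-exp(-x)).
maxima = rng.exponential(scale=1.0, size=(n_samples, n)).max(axis=1) - np.log(n)

# Compare the empirical CDF of the centered maxima with the limiting Gumbel CDF.
for x in (-1.0, 0.0, 1.0, 2.0):
    empirical = (maxima <= x).mean()
    gumbel = np.exp(-np.exp(-x))
    print(f"x = {x:+.1f}   empirical CDF = {empirical:.3f}   Gumbel CDF = {gumbel:.3f}")
```

The strongly correlated case discussed in the talk is precisely where such simple limiting laws no longer apply.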

- Landscape and training regimes in deep learning
  Speaker: Prof. Matthieu Wyart (EPFL)

Abstract: Deep learning algorithms are responsible for a technological revolution in a variety of tasks, yet understanding why they work remains a challenge. Puzzles include that (i) learning corresponds to minimizing a loss in high dimension, which is in general not convex and could well get stuck in bad minima, and (ii) the predictive power of deep learning increases with the number of fitting parameters, even in a regime where the data are perfectly fitted. I will review recent results on these questions based on analogies with physical systems and scaling arguments testable on real data. For classification, the landscape in deep learning displays a sharp “jamming” transition and becomes glassy as the number of parameters is lowered. This transition also occurs in the packing problem of non-spherical particles. In the over-parametrized regime, where the landscape has many flat directions, learning can operate in two regimes, “feature learning” and “lazy training”, depending on the scale of the initialisation. I will provide and test a quantitative explanation of why performance increases with the number of parameters in both regimes. I will discuss the relative merits of these regimes based on empirical evidence and simple models. If time permits, I will discuss empirical observations, based on a maximal-entropy model for diffeomorphisms, supporting the idea that stability toward smooth transformations is critical to the success of state-of-the-art architectures.

 

Zoom link: https://huji.zoom.us/j/89708006582?pwd=QjgvR2hPSitWOHh2eWI2VEVjZ0hOUT09 (Meeting ID: 897 0800 6582, Passcode: 536812)
Time: May 4, 2021, 4:00 PM Jerusalem time