Machine Learning Seminar

Gilad Yehudai


Weizmann Institute

On Lottery Tickets and Size Generalization in Graph Neural Networks

In this talk, I will survey two recent works, both of which have results related to the expressivity of neural networks, but in different architectures.

On size generalization in graph neural networks: Graph neural networks can process graphs of any size, so a natural question is whether they generalize across sizes, i.e., whether it is possible to train on small graphs and successfully predict on larger ones. We will show that the answer may depend on the local structure of the graph rather than its global structure, prove an expressivity result on how GNNs learn local structures, and use these insights to try to improve size generalization.

Proving the lottery ticket hypothesis: The recent prominent lottery ticket hypothesis (Frankle and Carbin, 2018) states that a randomly initialized network contains a small subnetwork that, when trained in isolation, can compete with the performance of the original network. An even stronger hypothesis (Ramanujan et al., 2019) states that a sufficiently over-parameterized neural network with random weights contains a subnetwork that, without any training, can compete with the full network after training. We prove the stronger hypothesis and compare different methods for neural network pruning.

Based on joint work with: Ohad Shamir, Haggai Maron, Eran Malach, Gal Chechik, Shai Shalev-Shwartz, Eli Meirom and Ethan Fetaya.

* Gilad Yehudai is a Ph.D. student at the Weizmann Institute under the supervision of Professor Ohad Shamir.

Zoom link:
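To make the pruning idea in the abstract concrete, here is a minimal sketch (not from the talk itself) of selecting a subnetwork from a randomly initialized layer by masking out all but the largest-magnitude weights; the function name `prune_by_magnitude` and the 25% keep fraction are illustrative assumptions, not the method proved in the paper.

```python
# Minimal sketch: select a subnetwork of a randomly initialized layer
# by keeping only the largest-magnitude weights (no training involved),
# in the spirit of the strong lottery ticket hypothesis.
import numpy as np

rng = np.random.default_rng(0)

def prune_by_magnitude(W, keep_frac):
    """Return a binary mask keeping the top `keep_frac` fraction of
    weights of W by absolute value, and the masked weight matrix."""
    k = int(np.ceil(keep_frac * W.size))
    threshold = np.sort(np.abs(W).ravel())[-k]   # k-th largest magnitude
    mask = (np.abs(W) >= threshold).astype(W.dtype)
    return mask, W * mask

W = rng.standard_normal((8, 8))                  # random, untrained weights
mask, W_pruned = prune_by_magnitude(W, keep_frac=0.25)
print(int(mask.sum()))                           # number of surviving weights
```

Magnitude-based masking is only one of several pruning criteria compared in this line of work; the point of the sketch is that the subnetwork is chosen from the random initialization rather than learned by gradient descent.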

Date: Sun 22 Nov 2020

Start Time: 11:30

End Time: 12:30

ZOOM Meeting | Electrical Eng. Building