Machine Learning Seminar

Noam Razin


Tel Aviv University

Implicit Regularization in Tensor Factorization

The mysterious ability of deep neural networks to generalize is believed to stem from an implicit regularization: a tendency of gradient-based optimization to fit training data with predictors of low "complexity." A major challenge in formalizing this intuition is that we lack measures of complexity that are both quantitative and capture the essence of data that admits generalization (images, audio, text, etc.). With an eye towards this challenge, I will present the first analysis of implicit regularization in tensor factorization, which is equivalent to a certain type of non-linear neural network. Through a dynamical characterization, I will establish an implicit regularization towards low tensor rank. Then, motivated by tensor rank capturing the implicit regularization of non-linear neural networks, I will suggest it as a measure of complexity, and show that it stays extremely low when fitting standard datasets. This gives rise to the possibility of tensor rank shedding light on both the implicit regularization of neural networks and the properties of real-world data that translate this implicit regularization into generalization. Based on joint work with Asaf Maman and Nadav Cohen.

Noam Razin is a PhD student at Tel Aviv University under the supervision of Prof. Nadav Cohen.

Zoom link:
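The phenomenon the abstract describes can be illustrated in a few lines of numpy: fit a low-rank target tensor with an over-parameterized CP (tensor) factorization via plain gradient descent from small initialization, and observe that the learned solution concentrates on very few components, i.e., has low effective tensor rank. This is only a rough sketch of the general idea, not the speaker's experimental setup; the dimensions, learning rate, and initialization scale below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, R = 4, 4  # tensor dimensions and number of CP components (illustrative)

# Rank-1 target tensor: T = 8 * v (x) v (x) v, with v a unit vector.
v = np.ones(n) / np.sqrt(n)
T = 8 * np.einsum("i,j,k->ijk", v, v, v)

# Over-parameterized CP factorization with small random initialization.
A, B, C = (0.1 * rng.standard_normal((n, R)) for _ in range(3))

lr = 0.01
for step in range(3000):
    pred = np.einsum("ir,jr,kr->ijk", A, B, C)  # sum of R rank-1 terms
    E = pred - T                                # residual tensor
    # Gradients of the squared Frobenius loss w.r.t. each factor matrix.
    gA = 2 * np.einsum("ijk,jr,kr->ir", E, B, C)
    gB = 2 * np.einsum("ijk,ir,kr->jr", E, A, C)
    gC = 2 * np.einsum("ijk,ir,jr->kr", E, A, B)
    A, B, C = A - lr * gA, B - lr * gB, C - lr * gC

loss = np.sum((np.einsum("ir,jr,kr->ijk", A, B, C) - T) ** 2)
# Per-component "strength": product of the three factor norms.
strength = (np.linalg.norm(A, axis=0)
            * np.linalg.norm(B, axis=0)
            * np.linalg.norm(C, axis=0))
print(loss, np.sort(strength)[::-1])
```

Although all R components could in principle share the fit, gradient descent from small initialization typically drives one component to dominate while the rest stay near zero, mirroring the low-tensor-rank bias discussed in the talk.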

Date: Wed 24 Mar 2021

Start Time: 10:30

End Time: 11:30

Zoom meeting | Electrical Eng. Building