Zohar Ringel

Symmetric Kernels with Non-Symmetric Data: A Data-Agnostic Learnability Bound

Jun 04, 2024

Wilsonian Renormalization of Neural Network Gaussian Processes

May 09, 2024

Towards Understanding Inductive Bias in Transformers: A View From Infinity

Feb 07, 2024

Droplets of Good Representations: Grokking as a First Order Phase Transition in Two Layer Networks

Oct 05, 2023

Speed Limits for Deep Learning

Jul 27, 2023

Spectral-Bias and Kernel-Task Alignment in Physically Informed Neural Networks

Jul 12, 2023

Separation of scales and a thermodynamic description of feature learning in some CNNs

Dec 31, 2021

A self consistent theory of Gaussian Processes captures feature learning effects in finite CNNs

Jun 08, 2021

Predicting the outputs of finite networks trained with noisy gradients

Apr 02, 2020

Learning Curves for Deep Neural Networks: A Gaussian Field Theory Perspective

Jun 12, 2019