Theodor Misiakiewicz

On the Complexity of Learning Sparse Functions with Statistical and Gradient Queries

Jul 08, 2024

Dimension-free deterministic equivalents for random feature regression

May 24, 2024

Asymptotics of Random Feature Regression Beyond the Linear Scaling Regime

Mar 13, 2024

A non-asymptotic theory of Kernel Ridge Regression: deterministic equivalents, test error, and GCV estimator

Mar 13, 2024

Six Lectures on Linearized Neural Networks

Aug 25, 2023

SGD learning on neural networks: leap complexity and saddle-to-saddle dynamics

Feb 21, 2023

Spectrum of inner-product kernel matrices in the polynomial regime and multiple descent phenomenon in kernel ridge regression

Apr 21, 2022

The merged-staircase property: a necessary and nearly sufficient condition for SGD learning of sparse functions on two-layer neural networks

Feb 17, 2022

Learning with convolution and pooling operations in kernel methods

Nov 16, 2021

Minimum complexity interpolation in random features models

Mar 30, 2021