Peter Súkeník

Wide Neural Networks Trained with Weight Decay Provably Exhibit Neural Collapse

Oct 07, 2024

Neural Collapse versus Low-rank Bias: Is Deep Neural Collapse Really Optimal?

May 23, 2024

Average gradient outer product as a mechanism for deep neural collapse

Feb 21, 2024

Deep Neural Collapse Is Provably Optimal for the Deep Unconstrained Features Model

May 22, 2023

The Unreasonable Effectiveness of Fully-Connected Layers for Low-Data Regimes

Oct 13, 2022

Generalization In Multi-Objective Machine Learning

Aug 29, 2022

Intriguing Properties of Input-dependent Randomized Smoothing

Oct 11, 2021