Dennis Elbrächter

Redistributor: Transforming Empirical Data Distributions

Oct 25, 2022

How degenerate is the parametrization of neural networks with the ReLU activation function?

May 23, 2019

Towards a regularity theory for ReLU networks -- chain rule and global error estimates

May 13, 2019

The Oracle of DLphi

Jan 27, 2019

Deep Neural Network Approximation Theory

Jan 08, 2019

The universal approximation power of finite-width deep ReLU networks

Jun 05, 2018