
Jérôme Bolte

TSE-R (Toulouse School of Economics)

A second-order-like optimizer with adaptive gradient scaling for deep learning

Oct 08, 2024

Inexact subgradient methods for semialgebraic functions

Apr 30, 2024

One-step differentiation of iterative algorithms

May 23, 2023

Differentiating Nonsmooth Solutions to Parametric Monotone Inclusion Problems

Dec 15, 2022

Nonsmooth automatic differentiation: a cheap gradient principle and other complexity results

Jun 01, 2022

Automatic differentiation of nonsmooth iterative algorithms

May 31, 2022

Numerical influence of ReLU'(0) on backpropagation

Jun 29, 2021

Nonsmooth Implicit Differentiation for Machine Learning and Optimization

Jun 08, 2021

Second-order step-size tuning of SGD for non-convex optimization

Mar 05, 2021

A Hölderian backtracking method for min-max and min-min problems

Jul 17, 2020