Nikita Doikov

Improving Stochastic Cubic Newton with Momentum

Oct 25, 2024

Cubic regularized subspace Newton for non-convex optimization

Jun 24, 2024

First and zeroth-order implementations of the regularized Newton method with lazy approximated Hessians

Sep 05, 2023

Minimizing Quasi-Self-Concordant Functions by Gradient Regularization of Newton Method

Aug 28, 2023

Shuffle SGD is Always Better than SGD: Improved Analysis of SGD with Arbitrary Data Orders

Jun 15, 2023

Linearization Algorithms for Fully Composite Optimization

Feb 24, 2023

Unified Convergence Theory of Stochastic and Variance-Reduced Cubic Newton Methods

Feb 23, 2023

Polynomial Preconditioning for Gradient Methods

Jan 30, 2023

Second-order optimization with lazy Hessians

Dec 13, 2022

Super-Universal Regularized Newton Method

Aug 11, 2022