Jorge Nocedal

A Trust-Region Algorithm for Noisy Equality Constrained Optimization

Nov 04, 2024

Constrained and Composite Optimization via Adaptive Sampling Methods

Dec 31, 2020

An Investigation of Newton-Sketch and Subsampled Newton Methods

Jul 24, 2018

A Progressive Batching L-BFGS Method for Machine Learning

May 30, 2018
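The L-BFGS entries above all build on the standard two-loop recursion, which computes a quasi-Newton search direction from the most recent curvature pairs. The sketch below is the textbook recursion only, not the progressive-batching or multi-batch variants from these papers; the function name is illustrative.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Classic L-BFGS two-loop recursion: returns -H_k @ grad, where H_k is
    the inverse-Hessian approximation built from stored pairs s_i = x_{i+1}-x_i
    and y_i = grad_{i+1}-grad_i (oldest first in the lists)."""
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: traverse pairs from newest to oldest.
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):
        alpha = rho * np.dot(s, q)
        alphas.append(alpha)
        q -= alpha * y
    # Scale by gamma = s'y / y'y using the most recent pair (standard choice).
    if s_list:
        s, y = s_list[-1], y_list[-1]
        q *= np.dot(s, y) / np.dot(y, y)
    # Second loop: traverse pairs from oldest to newest.
    for (s, y, rho), alpha in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        beta = rho * np.dot(y, q)
        q += (alpha - beta) * s
    return -q
```

With an empty history the recursion reduces to steepest descent, and with exact curvature pairs from a quadratic it reproduces the Newton direction on that subspace.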

Optimization Methods for Large-Scale Machine Learning

Feb 08, 2018

Adaptive Sampling Strategies for Stochastic Optimization

Oct 30, 2017
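Adaptive sampling methods of the kind referenced above typically grow the batch when gradient noise dominates the signal. A well-known rule of this type is the variance ("norm") test: keep the batch if the sample variance of per-example gradients, scaled by batch size, is small relative to the squared norm of the averaged gradient. This is a generic sketch of that test, not the exact condition from the paper; the function name and default parameters are illustrative.

```python
import numpy as np

def next_batch_size(per_example_grads, theta=0.9, growth=2):
    """Variance-based batch-size test. per_example_grads has shape (n, d):
    one gradient per sampled example. Returns the batch size to use next."""
    n = per_example_grads.shape[0]
    g = per_example_grads.mean(axis=0)          # averaged (stochastic) gradient
    # Trace of the sample covariance of the per-example gradients.
    var = per_example_grads.var(axis=0, ddof=1).sum()
    # Test: Var / n <= theta^2 * ||g||^2  => noise is controlled, keep n.
    if var / n <= theta**2 * np.dot(g, g):
        return n
    return growth * n                           # too noisy: enlarge the sample
```

Increasing the batch geometrically when the test fails keeps the expected direction within a cone around the true gradient, which is what allows linear-convergence arguments to go through for strongly convex objectives.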

On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima

Feb 09, 2017

A Multi-Batch L-BFGS Method for Machine Learning

Oct 23, 2016

Exact and Inexact Subsampled Newton Methods for Optimization

Sep 27, 2016
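The subsampled Newton papers above study steps of the form H_S d = -g, where H_S averages Hessians over a subsample and the system is solved inexactly, e.g. by conjugate gradient. The sketch below shows that generic structure only, under the assumption that per-example Hessians are available as dense matrices; it is not the specific algorithm or parameter choices from either paper.

```python
import numpy as np

def subsampled_newton_step(hess_samples, grad, cg_iters=20, tol=1e-8):
    """Inexact subsampled-Newton direction: average a subsample of per-example
    Hessians and solve H d = -grad approximately by conjugate gradient.
    hess_samples is a sequence of (d, d) arrays; assumes H is positive definite."""
    H = np.mean(hess_samples, axis=0)
    d = np.zeros_like(grad)
    r = -grad - H @ d        # residual of H d = -grad at the current iterate
    p = r.copy()
    rs = r @ r
    for _ in range(cg_iters):
        if np.sqrt(rs) < tol:
            break            # residual small enough: accept the inexact step
        Hp = H @ p
        alpha = rs / (p @ Hp)
        d += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return d
```

Capping `cg_iters` is what makes the method "inexact": each Hessian-vector product costs one pass over the subsample, so a loose residual tolerance trades per-iteration cost against step quality.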