
Fangyu Zou

Towards Practical Adam: Non-Convexity, Convergence Theory, and Mini-Batch Acceleration

Jan 14, 2021

A Sufficient Condition for Convergences of Adam and RMSProp

Nov 23, 2018

On the Convergence of Weighted AdaGrad with Momentum for Training Deep Neural Networks

Sep 28, 2018