Variational inference transforms posterior inference into parametric optimization, enabling the use of latent variable models where exact inference is otherwise impractical. However, variational inference can be finicky when different variational parameters control variables that are strongly correlated under the model. Traditional natural gradients, which are based on the variational approximation, fail to correct for these correlations when the approximation is not the true posterior. To address this, we construct a new natural gradient, the variational predictive natural gradient, defined as an average of the Fisher information of the reparameterized predictive model distribution. Unlike traditional natural gradients for variational inference, this natural gradient accounts for the relationship between model parameters and variational parameters. We also show that the variational predictive natural gradient relates to the negative Hessian of the expected log-likelihood, and we illustrate the key insight with a simple example. We demonstrate the empirical value of our method on a classification task, a deep generative model of images, and probabilistic matrix factorization for recommendation.
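A minimal sketch of the construction, in illustrative notation rather than the paper's own: assuming the variational approximation $q(z; \lambda)$ is reparameterizable as $z = f(\varepsilon; \lambda)$ with $\varepsilon \sim s(\varepsilon)$, the preconditioning matrix averages the Fisher information of the model's predictive distribution over reparameterized samples,
\[
G(\lambda) \;=\; \mathbb{E}_{\varepsilon \sim s}\, \mathbb{E}_{\tilde{x} \sim p(\cdot \mid f(\varepsilon; \lambda))}\!\left[ \nabla_\lambda \log p\big(\tilde{x} \mid f(\varepsilon; \lambda)\big)\, \nabla_\lambda \log p\big(\tilde{x} \mid f(\varepsilon; \lambda)\big)^{\top} \right],
\]
and the variational predictive natural gradient preconditions the usual ELBO gradient as $G(\lambda)^{-1} \nabla_\lambda \mathcal{L}(\lambda)$. Because a Fisher information matrix equals the negative expected Hessian of the corresponding log-density, this outer-product form is consistent with the stated connection to the negative Hessian of the expected log-likelihood.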