Identifying, formalizing, and combining biological mechanisms that implement known brain functions, such as prediction, is a central aspect of current research in theoretical neuroscience. In this letter, the mechanisms of spike-timing-dependent plasticity (STDP) and homeostatic plasticity, combined in an original mathematical formalism, are shown to shape recurrent neural networks into predictors. Through a rigorous mathematical treatment, we prove that they implement an online gradient descent on a distance between the network activity and its stimuli. The convergence to an equilibrium, at which the network can spontaneously reproduce or predict its stimuli, does not suffer from the bifurcation issues usually encountered in recurrent network learning.
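To convey the flavor of the result without the letter's formalism, the following minimal sketch shows how an online gradient step on the squared distance between the upcoming stimulus and the network's prediction decomposes into a Hebbian, STDP-like potentiation term and a homeostatic depression term. The linear network, the stimulus model $s(t+1) = R\,s(t)$, and all parameter values are illustrative assumptions, not the model analyzed in the letter.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, eta = 20, 5000, 0.1  # network size, learning steps, learning rate (illustrative)

# Hypothetical stimulus with linear temporal structure, s(t+1) = R s(t)
# for an orthogonal matrix R, so that a perfect linear predictor exists.
R, _ = np.linalg.qr(rng.standard_normal((N, N)))
s = rng.standard_normal(N)
s /= np.linalg.norm(s)

W = np.zeros((N, N))  # recurrent weights, learned online

for _ in range(T):
    s_next = R @ s  # upcoming stimulus
    pred = W @ s    # network's prediction of it

    # Online gradient step on ||s_next - W s||^2 / 2. It splits into:
    #  - a Hebbian, STDP-like term pairing presynaptic activity at time t
    #    with postsynaptic activity at time t+1, and
    #  - a homeostatic term depressing weights in proportion to the
    #    network's own prediction.
    W += eta * np.outer(s_next, s)  # STDP-like potentiation
    W -= eta * np.outer(pred, s)    # homeostatic depression

    s = s_next

# At the equilibrium of this toy setup, W approximates R, so the network
# can spontaneously reproduce or predict the stimulus sequence.
print("residual ||W - R||:", np.linalg.norm(W - R))
```

In this toy setting the weight error obeys $E(t+1) = E(t)\,(I - \eta\, s s^\top)$ with $E = W - R$, so it shrinks along the current stimulus direction at every step and converges geometrically as the orbit of $s$ explores all directions; no bifurcation of the network dynamics is involved, echoing (in a much simpler form) the convergence property claimed above.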