In this technical report we describe some properties of f-divergences and f-GAN training. We present an elementary derivation of the variational f-divergence lower bounds that form the basis of f-GAN training. We derive informative but perhaps underappreciated properties of f-divergences and f-GAN training, including a gradient-matching property and the fact that all f-divergences agree, up to an overall scale factor, on the divergence between nearby distributions. We provide detailed expressions for computing various common f-divergences and their variational lower bounds. Finally, based on our reformulation, we slightly generalize f-GAN training in a way that may improve its stability.
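As a concrete illustration of the variational lower bound mentioned above, the sketch below (a hypothetical example, not code from the report) numerically checks the bound D_f(P||Q) >= E_P[T] - E_Q[f*(T)] for the KL divergence on a small discrete support, where f(u) = u log u and its convex conjugate is f*(t) = exp(t - 1):

```python
import numpy as np

# Two discrete distributions on a three-point support (illustrative values).
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.3, 0.5])

# KL divergence is the f-divergence generated by f(u) = u*log(u),
# whose convex conjugate is f*(t) = exp(t - 1).
kl = np.sum(p * np.log(p / q))

def lower_bound(T):
    """Variational lower bound E_p[T(x)] - E_q[f*(T(x))] for KL."""
    return np.sum(p * T) - np.sum(q * np.exp(T - 1.0))

# The optimal critic T(x) = 1 + log(p(x)/q(x)) makes the bound tight:
T_opt = 1.0 + np.log(p / q)
print(np.isclose(lower_bound(T_opt), kl))   # tight at the optimum

# Any other critic yields a smaller value, e.g. the zero critic:
T_zero = np.zeros_like(p)
print(lower_bound(T_zero) <= kl)            # strict lower bound
```

In f-GAN training the critic T is a neural network, and maximizing this bound over T provides the divergence estimate that the generator then minimizes.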