We give an algorithm for completing an order-$m$ symmetric low-rank tensor from its multilinear entries in time roughly proportional to the number of tensor entries. We apply our tensor completion algorithm to the problem of learning mixtures of product distributions over the hypercube, obtaining new algorithmic results. If the centers of the product distributions are linearly independent, then we recover distributions with as many as $\Omega(n)$ centers in polynomial time and sample complexity. In the general case, we recover distributions with as many as $\tilde\Omega(n)$ centers in quasi-polynomial time, answering an open problem of Feldman et al. (SIAM J. Comp.) for the special case of distributions with incoherent bias vectors. Our main algorithmic tool is the iterated application of a low-rank matrix completion algorithm for matrices with adversarially missing entries.