In this paper, we study meta-learning of Gaussian graphical models. In our setup, each task has a different true precision matrix, each with a possibly different support (i.e., a different set of edges in the graph). We assume that the union of the supports of all the true precision matrices (i.e., the true support union) is small, which corresponds to sparse graphs. We propose to pool all the samples from the different tasks and to estimate a single precision matrix by $\ell_1$-regularized maximum likelihood estimation. We show that, with high probability, the support of the estimated single precision matrix equals the true support union, provided that the number of samples per task is $n \in O((\log N)/K)$, for $N$ nodes and $K$ tasks. That is, fewer samples per task are required when more tasks are available. We prove a matching information-theoretic lower bound of $n \in \Omega((\log N)/K)$ on the number of samples necessary per task, and thus our method is minimax optimal. Synthetic experiments validate our theory.
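The estimation pipeline described above (pool all samples across tasks, then run a single $\ell_1$-regularized maximum likelihood estimation and read off its support) can be sketched end-to-end. Below is a minimal illustrative simulation in Python, assuming scikit-learn's `GraphicalLasso` as a stand-in for the $\ell_1$-regularized MLE solver; the problem sizes `K`, `n`, `N`, the edge weight, and the regularization level `alpha` are illustrative assumptions, not the paper's experimental settings.

```python
# A minimal sketch of the pooled estimator, assuming scikit-learn's
# GraphicalLasso as the l1-regularized MLE solver. Sizes and constants
# are illustrative, not taken from the paper.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
K, n, N = 50, 10, 20  # tasks, samples per task, nodes

# A small shared support union: 15 random off-diagonal edge slots.
union = np.zeros((N, N), dtype=bool)
iu, ju = np.triu_indices(N, k=1)
pick = rng.choice(iu.size, size=15, replace=False)
union[iu[pick], ju[pick]] = True
union |= union.T

def task_precision(union, rng, weight=0.2):
    """Random per-task precision supported on a subset of the union edges."""
    keep = np.triu(union, 1) & (rng.random(union.shape) < 0.7)
    A = (keep | keep.T).astype(float)
    Theta = weight * A
    # Strict diagonal dominance guarantees positive definiteness.
    np.fill_diagonal(Theta, np.abs(Theta).sum(axis=1) + 1.0)
    return Theta

# Pool the n samples from each of the K tasks into one data matrix.
X_pooled = np.vstack([
    rng.multivariate_normal(
        np.zeros(N), np.linalg.inv(task_precision(union, rng)), size=n)
    for _ in range(K)
])

# Single l1-regularized maximum likelihood estimate on the pooled samples.
model = GraphicalLasso(alpha=0.1).fit(X_pooled)
est = (np.abs(model.precision_) > 1e-8) & ~np.eye(N, dtype=bool)

print("true support-union edges:", union.sum() // 2)
print("estimated edges:         ", est.sum() // 2)
```

In this sketch, the off-diagonal nonzero pattern of the single pooled estimate serves as the estimate of the true support union; in practice the regularization level would be tuned rather than fixed.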