The field of spectral analysis features a traditional dichotomy between continuous spectra (spectral densities), which correspond to purely nondeterministic processes, and line spectra (Dirac impulses), which represent sinusoids. While the former case is important in the identification of discrete-time linear stochastic systems, the latter is essential for the analysis and modeling of time series, with notable applications in radar systems. In this paper, we develop a novel approach to line spectral estimation that combines ideas from Georgiou's filter banks (G-filters) and atomic norm minimization (ANM), a mainstream method for line spectral analysis that has emerged over the last decade following the theory of compressed sensing. Such a combination is possible only because a Carath\'{e}odory--Fej\'{e}r-type decomposition is available for the covariance matrix of the filter output. The ANM problem can be characterized via a semidefinite program, which can be solved efficiently. As a consequence, our optimization theory can be seen as a substantial generalization of the standard ANM for line spectral estimation. Moreover, our ANM approach with a G-filter has significant advantages over subspace methods because it can work with just one output vector and without \emph{a priori} knowledge of the number of sinusoids in the input. Simulation results show that our approach performs reasonably well under different signal-to-noise ratios when the G-filter is suitably designed.
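For context, a brief sketch of the standard ANM formulation that the present work generalizes (the notation below is illustrative and not drawn from this paper's generalized setting): for a vector $x \in \mathbb{C}^{n}$ and the atomic set of complex sinusoids, the atomic norm admits the well-known semidefinite representation
\[
\|x\|_{\mathcal{A}} \;=\; \inf_{u \in \mathbb{C}^{n},\, t \in \mathbb{R}}
\left\{ \frac{1}{2n}\operatorname{tr}\, T(u) + \frac{t}{2}
\;:\;
\begin{bmatrix} T(u) & x \\ x^{H} & t \end{bmatrix} \succeq 0 \right\},
\]
where $T(u)$ denotes the Hermitian Toeplitz matrix whose first column is $u$.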