We present a dual formulation of the Gaussian process (GP) regression model using random functions in a reproducing kernel Hilbert space (RKHS). Compared to the GP, the proposed dual GP can realize an expanded space of functions for trace-class covariance kernels. In particular, the covariance of the dual GP is parameterized by a sufficient dimension reduction subspace of the RKHS; it is low-rank yet captures the statistical dependency of the response on the covariates. This affords significant gains in computational efficiency as well as a potential reduction in the variance of predictions. We develop a fast Expectation-Maximization algorithm with improved computational complexity for estimating the parameters of the subspace-induced Gaussian process (SIGP). Extensive experiments on real-world data show that SIGP achieves competitive performance with a low-rank inducing subspace.
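To illustrate the computational benefit of a low-rank, subspace-parameterized covariance, the sketch below assumes a covariance of the form $KAA^\top K$, where $K$ is the Gram matrix of a trace-class kernel and $A \in \mathbb{R}^{n \times r}$ spans an $r$-dimensional inducing subspace. This is a minimal, hypothetical rendering, not the paper's algorithm: the RBF kernel, the function names, and the random stand-in for $A$ (which the paper would instead estimate via the EM algorithm and sufficient dimension reduction) are all assumptions for illustration. With $r \ll n$, the Woodbury identity reduces the dominant solve from the $O(n^3)$ of a full GP to an $r \times r$ system.

```python
import numpy as np

def rbf_gram(X, Z, lengthscale=1.0):
    """Gram matrix of an RBF kernel (a standard trace-class kernel on compact domains)."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def sigp_predict(X, y, Xs, A, noise=1e-2, lengthscale=1.0):
    """Posterior mean under a hypothetical low-rank covariance K A A^T K.

    A (n x r) spans the inducing subspace; the linear solve is over an
    r x r system rather than the n x n system of a full-rank GP.
    """
    K = rbf_gram(X, X, lengthscale)    # n x n Gram matrix on training inputs
    Ks = rbf_gram(Xs, X, lengthscale)  # m x n test/train cross-kernel
    B = K @ A                          # n x r basis of the inducing subspace
    r = A.shape[1]
    # Woodbury-style solve of (B B^T + noise*I)^{-1} y via an r x r system
    M = noise * np.eye(r) + B.T @ B
    alpha = (y - B @ np.linalg.solve(M, B.T @ y)) / noise
    # Predictive mean: Ks A B^T (B B^T + noise*I)^{-1} y
    return Ks @ (A @ (B.T @ alpha))

# Toy usage with a random subspace as a placeholder for the learned one
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
A = rng.standard_normal((200, 5)) / np.sqrt(200)  # rank r = 5 stand-in for the EM estimate
Xs = np.linspace(-3, 3, 50)[:, None]
mean = sigp_predict(X, y, Xs, A)
```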