Classical variational inference for Bayesian neural networks (BNNs) in parameter space typically suffers from unresolved prior issues, such as the intractability of encoding prior knowledge and pathological behaviors in deep networks, which can lead to improper posterior inference. Functional variational inference has therefore been proposed recently to resolve these issues via stochastic process priors. Beyond variational inference, stochastic gradient Markov chain Monte Carlo (SGMCMC) is another scalable and effective inference method for BNNs, which asymptotically generates samples from the true posterior by simulating continuous dynamics. However, existing SGMCMC methods operate only in parameter space and thus inherit the same issues as parameter-space variational inference, and extending parameter-space dynamics to function-space dynamics is non-trivial. In this paper, we introduce a new functional SGMCMC scheme built on newly designed diffusion dynamics that can incorporate more informative functional priors. Moreover, we prove that the stationary distribution of these functional dynamics is the target posterior distribution over functions. On several tasks, our functional SGMCMC achieves better accuracy and uncertainty quantification than naive SGMCMC and functional variational inference methods.
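As background for the parameter-space dynamics mentioned above (standard SGMCMC, not the functional scheme this paper introduces), a representative instance is stochastic gradient Langevin dynamics, which simulates the Langevin diffusion over parameters $\theta$,
\[
\mathrm{d}\theta_t \;=\; \tfrac{1}{2}\,\nabla_\theta \log \pi(\theta_t)\,\mathrm{d}t \;+\; \mathrm{d}W_t,
\]
whose stationary distribution is the parameter posterior $\pi(\theta) \propto p(\theta)\,\prod_{i=1}^{N} p(y_i \mid x_i, \theta)$, with $W_t$ a standard Wiener process. Replacing the full-data gradient with a minibatch estimate over $S_t \subset \{1,\dots,N\}$, $|S_t| = n$, gives the discrete update
\[
\theta_{t+1} \;=\; \theta_t \;+\; \tfrac{\epsilon_t}{2}\Big(\nabla_\theta \log p(\theta_t) \;+\; \tfrac{N}{n}\sum_{i \in S_t} \nabla_\theta \log p(y_i \mid x_i, \theta_t)\Big) \;+\; \eta_t, \qquad \eta_t \sim \mathcal{N}(0, \epsilon_t I),
\]
where $\epsilon_t$ is the step size. The notation here ($\pi$, $S_t$, $\epsilon_t$) is illustrative and not drawn from this paper; the functional dynamics proposed here instead define an analogous diffusion whose stationary distribution is the posterior over functions.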