In statistical inference, the true uncertainty is unknown and all models are wrong: a person who constructs a statistical model and a prior distribution is simultaneously aware that both are fictional, candidate descriptions. To study such situations, several statistical measures have been devised, such as cross validation, information criteria, and the marginal likelihood; however, their mathematical properties have not yet been completely clarified when statistical models are under- or over-parametrized. In this paper, we introduce a framework for the mathematical theory of Bayesian statistics under unknown uncertainty, within which we establish general properties of cross validation, information criteria, and the marginal likelihood. The derived theory holds even if the unknown uncertainty is not realizable by the statistical model, and even if the posterior distribution cannot be approximated by any normal distribution; hence it offers a helpful standpoint for a person who cannot believe in any specific model and prior. The results are as follows. (1) Based on their mathematical properties, there exists a statistical measure of the generalization loss that is more precise than leave-one-out cross validation and the information criterion. (2) There exists a more efficient approximation method for the free energy, that is, the minus log marginal likelihood, even if the posterior distribution cannot be approximated by any normal distribution. (3) The prior distributions optimized by cross validation and by the widely applicable information criterion (WAIC) are asymptotically equivalent to each other, but differ from the prior optimized by the marginal likelihood.
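For concreteness, the quantities referred to above can be written as follows (a minimal sketch in our own notation, assuming i.i.d. observations $X_1,\dots,X_n$, a statistical model $p(x\mid w)$, a prior $\varphi(w)$, and posterior expectation $\mathbb{E}_w[\,\cdot\,]$; the precise definitions used in the body of the paper may differ in detail):
\begin{align*}
F_n &= -\log \int \prod_{i=1}^{n} p(X_i \mid w)\,\varphi(w)\,dw , \\
G_n &= -\,\mathbb{E}_X\!\left[\log \mathbb{E}_w\!\left[p(X \mid w)\right]\right], \\
C_n &= -\frac{1}{n}\sum_{i=1}^{n} \log \mathbb{E}_w^{(-i)}\!\left[p(X_i \mid w)\right], \\
W_n &= T_n + \frac{1}{n}\sum_{i=1}^{n} \mathbb{V}_w\!\left[\log p(X_i \mid w)\right],
\qquad T_n = -\frac{1}{n}\sum_{i=1}^{n}\log \mathbb{E}_w\!\left[p(X_i \mid w)\right],
\end{align*}
where $F_n$ is the free energy (minus log marginal likelihood), $G_n$ the generalization loss of the Bayesian predictive distribution, $C_n$ the leave-one-out cross validation loss with $\mathbb{E}_w^{(-i)}$ the posterior expectation computed without $X_i$, and $W_n$ the widely applicable information criterion with $\mathbb{V}_w$ the posterior variance.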