Acquiring information about noisy, expensive black-box functions (computer simulations or physical experiments) is a tremendously challenging problem. Finite computational and financial resources restrict the application of traditional methods for design of experiments. The problem is further compounded by numerical errors and stochastic approximation errors whenever the quantity of interest (QoI) depends on such an expensive black-box function. Bayesian optimal design of experiments has been reasonably successful in guiding the designer towards the QoI for problems of this kind. This is usually achieved by sequentially querying the function at designs selected by an infill-sampling criterion consistent with utility theory. However, most current methods are designed specifically to optimize or infer the black-box function itself. We aim to construct a heuristic that can deal with the above problems irrespective of the QoI. In this paper, we apply this heuristic to infer a specific QoI, namely the expectation (expected value) of the function. The Kullback-Leibler (KL) divergence is prominent among techniques used to quantify information gain. We derive an expression for the expected KL divergence with which to sequentially infer our QoI. The analytical tractability afforded by the Karhunen-Loève expansion of the Gaussian process (GP) representation of the black-box function allows us to circumvent the numerical issues associated with sample averaging. The proposed methodology can be extended to any QoI under reasonable assumptions. The proposed method is verified and validated on three synthetic functions with varying levels of complexity and dimensionality. We demonstrate our methodology on a steel wire manufacturing problem.
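
To fix ideas, here is a minimal sketch under standard assumptions; the symbols $f$, $k$, $\lambda_i$, $\phi_i$, $\xi_i$, $M$, $p$, and $Q$ are illustrative notation introduced here and not necessarily the paper's. A zero-mean GP surrogate $f \sim \mathcal{GP}(0, k)$ admits a truncated Karhunen-Loève expansion
\[
f(\mathbf{x}) \;\approx\; \sum_{i=1}^{M} \sqrt{\lambda_i}\,\phi_i(\mathbf{x})\,\xi_i, \qquad \xi_i \sim \mathcal{N}(0,1),
\]
where $\lambda_i$ and $\phi_i(\mathbf{x})$ are eigenvalues and eigenfunctions of the covariance function $k$. An expectation-type QoI with respect to an input density $p(\mathbf{x})$ then reduces to a linear combination of the standard normal variables,
\[
Q[f] \;=\; \int f(\mathbf{x})\, p(\mathbf{x})\, d\mathbf{x} \;\approx\; \sum_{i=1}^{M} \sqrt{\lambda_i}\,\xi_i \int \phi_i(\mathbf{x})\, p(\mathbf{x})\, d\mathbf{x},
\]
so $Q$ remains Gaussian before and after a hypothetical observation, and the KL divergence between two univariate Gaussians is available in closed form. This is the kind of analytical tractability, avoiding sample averaging, that the abstract refers to.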