In this paper, we present a new approach to Transfer Learning (TL) based on Gibbs sampling. Gibbs sampling is a Markov chain Monte Carlo algorithm in which a sample moves between states with transition probabilities governed by a target distribution, so that states of higher probability are visited more often. We find that this mechanism can be employed to transfer instances between domains. The Restricted Boltzmann Machine (RBM) is an energy-based model that is well suited both to learning a data distribution and to performing Gibbs sampling. We use an RBM to capture the data distribution of the source domain, and then apply it to cast target instances into new data whose distribution resembles that of the source data. On datasets commonly used to evaluate TL methods, we show that our method considerably improves classification accuracy on the target domain. Moreover, unlike common domain adaptation (DA) methods, the proposed method requires no target data during model training.
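The pipeline described above, training an RBM on source-domain data and then running Gibbs chains initialized at target instances so they drift toward the learned source distribution, can be illustrated with a minimal NumPy sketch. All class and function names here are our own illustration, not from the paper, and training uses standard CD-1 contrastive divergence as a stand-in for whatever procedure the paper actually uses:

```python
import numpy as np

rng = np.random.default_rng(0)

class RBM:
    """Bernoulli-Bernoulli RBM: a minimal sketch, not the paper's implementation."""

    def __init__(self, n_vis, n_hid, lr=0.1):
        self.W = rng.normal(0.0, 0.01, (n_vis, n_hid))
        self.b = np.zeros(n_vis)  # visible bias
        self.c = np.zeros(n_hid)  # hidden bias
        self.lr = lr

    @staticmethod
    def _sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def sample_h(self, v):
        # P(h=1 | v) and a binary sample of the hidden units
        p = self._sigmoid(v @ self.W + self.c)
        return p, (rng.random(p.shape) < p).astype(float)

    def sample_v(self, h):
        # P(v=1 | h) and a binary sample of the visible units
        p = self._sigmoid(h @ self.W.T + self.b)
        return p, (rng.random(p.shape) < p).astype(float)

    def train_cd1(self, data, epochs=50):
        # Contrastive Divergence with one Gibbs step (CD-1):
        # compare statistics of the data with a one-step reconstruction.
        n = data.shape[0]
        for _ in range(epochs):
            ph0, h0 = self.sample_h(data)
            _, v1 = self.sample_v(h0)
            ph1, _ = self.sample_h(v1)
            self.W += self.lr * (data.T @ ph0 - v1.T @ ph1) / n
            self.b += self.lr * (data - v1).mean(axis=0)
            self.c += self.lr * (ph0 - ph1).mean(axis=0)

    def gibbs_transform(self, v, steps=20):
        # Alternating Gibbs sampling started at target instances;
        # the chain drifts toward the source distribution the RBM learned.
        pv = v
        for _ in range(steps):
            _, h = self.sample_h(v)
            pv, v = self.sample_v(h)
        return pv  # visible probabilities serve as the transformed instances

# Toy usage: "source" vectors activate mostly the first half of the bits,
# "target" vectors are uniform noise; the transform pulls them toward source.
source = (rng.random((200, 8)) < np.tile([0.9] * 4 + [0.1] * 4, (200, 1))).astype(float)
target = (rng.random((20, 8)) < 0.5).astype(float)
rbm = RBM(n_vis=8, n_hid=4)
rbm.train_cd1(source, epochs=50)
transformed = rbm.gibbs_transform(target, steps=20)
```

A downstream classifier trained on the source domain can then be applied to `transformed` instead of the raw target instances, which is the sense in which no target data is needed at training time.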