Retrieving objects from a large corpus both effectively and accurately is a challenging task. In this paper, we propose a method that propagates visual-feature-level similarities over a Markov random field (MRF) to obtain a higher-level correspondence in image space for image pairs. The proposed correspondence between an image pair reflects not only the similarity of low-level visual features but also the relations established through other images in the database, and it can be easily integrated into existing bag-of-visual-words (BoW) based systems to reduce the miss rate. We evaluate our method on the standard Oxford-5K, Oxford-105K, and Paris-6K datasets. Experimental results show that the proposed method significantly improves retrieval accuracy on all three datasets and exceeds the current state-of-the-art retrieval performance.
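The core idea summarized above is that pairwise image similarities are not taken directly from low-level features but are propagated through a graphical model built over the database, so that two images can be linked via intermediate images even when their direct feature similarity is weak. As a rough illustration only, and not the paper's actual MRF formulation or inference procedure, the sketch below uses a simple diffusion-style update on a row-normalized similarity matrix; the function name `propagate_similarities`, the blending weight `alpha`, and the update rule are all illustrative assumptions.

```python
import numpy as np

def propagate_similarities(S, alpha=0.8, n_iters=20):
    """Illustrative similarity propagation on an image-similarity graph.

    S:       (n, n) symmetric matrix of low-level visual similarities
             between database images (e.g. BoW cosine similarities).
    alpha:   weight balancing propagated vs. original similarities.
    n_iters: number of propagation iterations.
    Returns an (n, n) matrix of propagated similarities that also
    reflect relations built through intermediate images.
    """
    # Row-normalize S to obtain transition weights between images.
    W = S / np.maximum(S.sum(axis=1, keepdims=True), 1e-12)
    F = S.copy()
    for _ in range(n_iters):
        # Smooth each pair's similarity through neighboring images,
        # then blend it with the original low-level similarity.
        F = alpha * (W @ F @ W.T) + (1 - alpha) * S
    return F

# Toy usage: images 0-1 and 1-2 are directly similar; propagation
# raises the indirect 0-2 similarity through the shared neighbor 1.
S = np.array([[1.0, 0.8, 0.1],
              [0.8, 1.0, 0.7],
              [0.1, 0.7, 1.0]])
print(propagate_similarities(S).round(3))
```

In a retrieval pipeline, such propagated scores could be used to re-rank an initial BoW candidate list, which is how a correspondence of this kind can reduce the miss rate without replacing the underlying BoW index.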