Bayesian inference without access to the likelihood, known as likelihood-free inference, has been highlighted as a way to obtain more realistic simulation results. Recent research updates an approximate posterior sequentially from the cumulative simulation input-output pairs gathered over inference rounds. This paper observes that previous algorithms relying on Markov chain Monte Carlo (MCMC) show low accuracy on simulations with multi-modal posteriors because the MCMC sampler collapses onto a subset of the modes. Motivated by this observation, we propose an implicit sampling method, Implicit Surrogate Proposal (ISP), that draws balanced simulation inputs at each round. ISP resolves mode collapse through two mechanisms: 1) a flexible surrogate proposal density estimator and 2) samples explored in parallel that are used to train the surrogate density model. We demonstrate that ISP outperforms the baseline algorithms on multi-modal simulations.
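To make the sequential-proposal idea concrete, below is a minimal, self-contained sketch, not the paper's implementation: several independently started chains explore the current approximate posterior in parallel, and a Gaussian mixture, standing in here for the flexible surrogate density estimator, is fit to the pooled samples and reused as the next round's proposal. The toy bimodal simulator, the random-walk Metropolis chains, and names such as `simulator` and `run_chain` are illustrative assumptions.

```python
# Sketch of sequential likelihood-free inference with a surrogate proposal
# trained on samples explored by parallel chains (illustrative only).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

def simulator(theta):
    """Toy simulator: noisy observation of |theta|, giving a bimodal posterior."""
    return np.abs(theta) + 0.1 * rng.standard_normal()

def log_unnorm_posterior(theta, x_obs):
    """Score theta by matching a single simulation draw to the observation."""
    return -0.5 * ((simulator(theta) - x_obs) / 0.1) ** 2 - 0.5 * theta**2 / 4.0

def run_chain(theta0, x_obs, n_steps=200, step=0.3):
    """One random-walk Metropolis chain; a single chain tends to stay near one mode."""
    theta, samples = theta0, []
    logp = log_unnorm_posterior(theta, x_obs)
    for _ in range(n_steps):
        prop = theta + step * rng.standard_normal()
        logp_prop = log_unnorm_posterior(prop, x_obs)
        if np.log(rng.uniform()) < logp_prop - logp:
            theta, logp = prop, logp_prop
        samples.append(theta)
    return np.array(samples)

x_obs = 1.0
proposal = None  # round 1 draws simulation inputs from the prior
for round_idx in range(3):
    if proposal is None:
        starts = rng.normal(0.0, 2.0, size=8)          # prior draws
    else:
        starts = proposal.sample(8)[0].ravel()         # draws from the surrogate
    # Parallel exploration: independent chains cover different posterior modes.
    pooled = np.concatenate([run_chain(t0, x_obs) for t0 in starts])
    # Fit a flexible surrogate proposal density to the pooled samples.
    proposal = GaussianMixture(n_components=2).fit(pooled.reshape(-1, 1))
    print(f"round {round_idx + 1}: component means = {proposal.means_.ravel()}")
```

In this toy setting the two mixture components settle near the two posterior modes around +1 and -1, whereas a single chain started at one point typically reports only one of them; this is the balance across modes that the surrogate proposal is meant to preserve.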