The optimal transport problem on finite spaces is a linear program. Recently, relaxations of the optimal transport problem via strictly convex functions, especially the Kullback--Leibler divergence, have shed new light on data science. This paper provides the mathematical foundations for the optimal transport problem relaxed via Bregman divergences, together with an iterative process based on gradient descent.
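As an illustration of the type of relaxation considered here (the symbols $C$, $\mu$, $\nu$, $\pi_0$, $\varepsilon$, and $\phi$ are notation chosen for this sketch and are not fixed by the abstract), the relaxed problem can be written as
\[
\min_{\pi \in \Pi(\mu,\nu)} \langle C, \pi \rangle + \varepsilon\, D_\phi(\pi \,\|\, \pi_0),
\]
where $\Pi(\mu,\nu)$ is the set of couplings of the marginals $\mu$ and $\nu$, $C$ is a cost matrix, $\pi_0$ is a reference coupling, $\varepsilon>0$ is a regularization parameter, and $D_\phi$ is the Bregman divergence generated by a strictly convex function $\phi$; choosing $\phi(x)=\sum_i \bigl(x_i\log x_i - x_i\bigr)$ recovers the Kullback--Leibler (entropic) relaxation as a special case.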