We introduce an online prototype-based learning algorithm for clustering and classification, based on the principles of deterministic annealing. We show that the proposed algorithm constitutes a competitive-learning neural network whose learning rule is formulated as an online stochastic approximation algorithm. The annealing nature of the algorithm prevents convergence to poor local minima, offers robustness with respect to the initial conditions, and provides a means to progressively increase the complexity of the learning model as needed, through an intuitive bifurcation phenomenon. As a result, the proposed approach is interpretable, requires minimal hyper-parameter tuning, and offers online control over the complexity-accuracy trade-off. Finally, Bregman divergences are used as a family of dissimilarity measures and are shown to play an important role in both the performance and the computational complexity of the algorithm. We illustrate the properties and evaluate the performance of the proposed learning algorithm on artificial and real datasets.
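
To make the abstract concrete, the following is a minimal illustrative sketch of an online deterministic-annealing-style prototype update, not the paper's exact algorithm. It assumes the squared Euclidean distance as the Bregman divergence, a Gibbs (soft) association of samples to prototypes at a temperature that is gradually lowered, and Robbins-Monro-type diminishing step sizes; the class name, step-size schedule, and cooling factor are illustrative choices, not taken from the source.

```python
# Minimal sketch: online competitive learning with soft (Gibbs) assignments
# at a temperature T, updated by stochastic approximation and annealed over time.
# Squared Euclidean distance is used as the Bregman divergence (assumption).
import numpy as np

class OnlineDAPrototypes:
    def __init__(self, dim, n_prototypes=4, temperature=1.0, cooling=0.99, seed=0):
        rng = np.random.default_rng(seed)
        self.mu = rng.normal(scale=0.1, size=(n_prototypes, dim))  # prototype locations
        self.T = temperature       # annealing temperature
        self.cooling = cooling     # multiplicative cooling factor (illustrative schedule)
        self.t = 0                 # observation counter for the step-size sequence

    def _gibbs_weights(self, x):
        # Soft association probabilities p(mu_i | x) proportional to exp(-d(x, mu_i) / T),
        # where d is the squared Euclidean (Bregman) divergence.
        d = np.sum((self.mu - x) ** 2, axis=1)
        w = np.exp(-(d - d.min()) / max(self.T, 1e-12))  # subtract d.min() for stability
        return w / w.sum()

    def update(self, x):
        # One stochastic-approximation step: move each prototype toward x,
        # weighted by its Gibbs association probability, then cool the temperature.
        self.t += 1
        alpha = 1.0 / (self.t + 1)          # diminishing (Robbins-Monro-type) step size
        p = self._gibbs_weights(x)
        self.mu += alpha * p[:, None] * (x - self.mu)
        self.T *= self.cooling

# Usage: stream samples from two Gaussian clusters and fit prototypes online.
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(-2.0, 0.3, size=(500, 2)),
                  rng.normal(2.0, 0.3, size=(500, 2))])
rng.shuffle(data)
model = OnlineDAPrototypes(dim=2)
for x in data:
    model.update(x)
print(np.round(model.mu, 2))  # prototypes concentrate near the two cluster centers
```

At high temperature all prototypes receive nearly equal weight and track the global mean; as the temperature is lowered they separate, which mirrors the bifurcation behavior described above, although this sketch fixes the number of prototypes rather than growing the model as in the proposed approach.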