The problem of Shannon entropy estimation in countably infinite alphabets is addressed through the study and use of convergence results for the entropy functional, which is known to be discontinuous with respect to the total variation distance on $\infty$-alphabets. Sufficient conditions for the convergence of the entropy are derived, covering scenarios with both finitely and infinitely supported distributions. From this new perspective, four plug-in histogram-based estimators are studied, showing that these convergence results are instrumental in deriving new strong consistency and rate-of-convergence results. Different scenarios and conditions are considered for both the estimators and the underlying distribution, including, for example, finite but unknown support and summable tail-bounded conditions.
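The plug-in histogram-based approach referred to above evaluates the entropy functional on the empirical distribution of the sample. A minimal sketch of this idea is given below; the function name and the unrestricted empirical histogram are illustrative assumptions, not the specific restricted estimators analyzed in the paper:

```python
import math
from collections import Counter

def plugin_entropy(samples):
    """Plug-in Shannon entropy estimate (in nats): apply the entropy
    functional to the empirical histogram of the observed sample."""
    n = len(samples)
    counts = Counter(samples)  # empirical frequencies over observed symbols
    # H(p_hat) = -sum p_hat(x) * log p_hat(x), summing only over seen symbols
    return -sum((c / n) * math.log(c / n) for c in counts.values())
```

For instance, a sample with two equally frequent symbols yields an estimate of $\log 2$ nats; for infinitely supported distributions the consistency of such estimators is precisely what the convergence results above address.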