Class imbalance is prevalent in real-world node classification tasks and often biases graph learning models toward majority classes. Most existing studies adopt a node-centric perspective and aim to address the class imbalance in training data through node/class-wise reweighting or resampling. In this paper, we approach the source of the class-imbalance bias from an under-explored topology-centric perspective. Our investigation reveals that, beyond the inherently skewed training class distribution, graph topology also plays an important role in the formation of predictive bias: we identify two fundamental challenges, namely ambivalent and distant message-passing, that can exacerbate the bias by aggravating majority-class over-generalization and minority-class misclassification. In light of these findings, we devise a lightweight topological augmentation method, ToBA, that dynamically rectifies the nodes influenced by ambivalent/distant message-passing during graph learning, thereby mitigating the class-imbalance bias. We highlight that ToBA is a model-agnostic, efficient, and versatile solution that can be seamlessly combined with, and further boost, other imbalance-handling techniques. Systematic experiments validate the superior performance of ToBA in both improving imbalanced node classification and mitigating the prediction bias across classes.