Gradient descent computed by backpropagation (BP) is a widely used method for training artificial neural networks, but it has several limitations: it is computationally demanding, requires frequent manual tuning of the network architecture, and is prone to catastrophic forgetting when learning incrementally. To address these issues, we introduce a brain-mimetic developmental spiking neural network (BDNN) that mimics the postnatal development of neural circuits. We validate its performance with a neuromorphic tactile system that learns to recognize objects through grasping. Unlike traditional BP-based methods, BDNN exhibits strong knowledge transfer, supporting efficient incremental learning of new tactile information. It requires no hyperparameter tuning and adapts dynamically to incoming data. Moreover, it achieves classification accuracy on par with its BP-based counterpart while learning more than ten times faster under ideal conditions and two to three orders of magnitude faster in practical settings. These features make BDNN well suited for fast data processing on edge devices.