Abstract: High-resolution tactile sensing has great potential in autonomous mobile robotics, particularly for legged robots. One area where it holds significant promise is the traversal of challenging, varied terrain. Depending on whether an environment is slippery, soft, hard, or dry, a robot must adapt its method of locomotion accordingly. Many current multi-legged robots, such as Boston Dynamics' Spot, have preset gaits for different surface types but struggle over terrain where the surface type changes frequently. The ability to automatically detect changes in the environment would allow a robot to autonomously adjust its locomotion to better suit conditions, without requiring a human operator to manually register the change in surface type. In this paper we report the first detailed investigation of the properties of a particular bio-inspired tactile sensor, the TacTip, to test its suitability for this kind of automatic detection of surface conditions. Using a custom-made rig for data collection, we explored different processing techniques and a regression model to determine how a robot could sense directional and overall force on the sensor under a variety of conditions. This allowed us to successfully demonstrate how the sensor can be used to distinguish between soft, hard, dry, and (wet) slippery surfaces. We further explored a neural model to classify specific surface textures. Pin movement (the movement of optical markers within the sensor) was key to sensing this information, and all models relied on some form of temporal information. Our final trained models could successfully determine the direction in which the sensor is heading and the amount of force acting on it, and could distinguish differences in surface texture, such as Lego vs. a smooth hard surface, or concrete vs. a smooth hard surface.