Perceptual processes are frequently multi-modal, and haptic perception is a case in point. Data sets of visual and haptic sensory signals have been compiled in the past, particularly for the exploration of textured surfaces. They were intended for studies of natural and artificial perception and as training data for machine learning research, and were typically acquired with rigid probes or artificial robotic fingers. Here, we collected the visual, auditory, and haptic signals produced when a human finger explored textured surfaces, and we assessed the data set with machine learning classification techniques. Interestingly, multi-modal classification accuracy could reach 97%, whereas haptic-only classification was around 80%.
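The gap between multi-modal and single-modality accuracy can be illustrated with a minimal sketch. The code below is not the classifier used in the study; it is a hypothetical late-fusion example on synthetic data, in which per-modality feature vectors for each texture sample are concatenated before classification with a simple nearest-centroid rule. All class means, noise levels, and dimensions are made-up assumptions chosen only to show that fusing noisy modalities tends to outperform any one of them.

```python
# Hypothetical late-fusion sketch on synthetic data (NOT the study's method):
# two noisy "modalities" observe the same texture class; concatenating their
# features before nearest-centroid classification improves accuracy.
import numpy as np

rng = np.random.default_rng(0)
n_classes, n_per_class, dim, sigma = 3, 200, 2, 1.0
# Assumed class means, identical geometry for both modalities.
means = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]])

def sample(n):
    """Draw n samples per class; each sample yields one feature
    vector per modality, corrupted by independent noise."""
    y = np.repeat(np.arange(n_classes), n)
    x_haptic = means[y] + sigma * rng.standard_normal((len(y), dim))
    x_visual = means[y] + sigma * rng.standard_normal((len(y), dim))
    return x_haptic, x_visual, y

def nearest_centroid_acc(x_tr, y_tr, x_te, y_te):
    """Fit per-class centroids on training data, report test accuracy."""
    cents = np.array([x_tr[y_tr == k].mean(axis=0) for k in range(n_classes)])
    dists = ((x_te[:, None, :] - cents[None, :, :]) ** 2).sum(axis=-1)
    return float((dists.argmin(axis=1) == y_te).mean())

xh_tr, xv_tr, y_tr = sample(n_per_class)
xh_te, xv_te, y_te = sample(n_per_class)

acc_haptic = nearest_centroid_acc(xh_tr, y_tr, xh_te, y_te)
acc_visual = nearest_centroid_acc(xv_tr, y_tr, xv_te, y_te)
# Late fusion: concatenate the two modalities' features per sample.
acc_fused = nearest_centroid_acc(np.hstack([xh_tr, xv_tr]), y_tr,
                                 np.hstack([xh_te, xv_te]), y_te)
print(acc_haptic, acc_visual, acc_fused)
```

Because the two modalities carry independent noise, concatenation effectively averages it out, which is one common explanation for multi-modal classifiers outperforming single-modality ones.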