Despite significant advances in touch and force transduction, tactile sensing is still far from ubiquitous in robotic manipulation. Existing methods for building touch sensors have proven difficult to integrate into robot fingers due to multiple challenges, including difficulty covering multicurved surfaces, high wire counts, and packaging constraints that prevent their use in dexterous hands. In this paper, we present a multicurved robotic finger with accurate touch localization and normal force detection over complex, three-dimensional surfaces. The key to our approach is the novel use of overlapping signals from light emitters and receivers embedded in a transparent waveguide layer that covers the functional areas of the finger. By measuring light transport between every emitter and receiver, we show that we can obtain a very rich signal set that changes in response to deformation of the finger due to touch. We then show that purely data-driven deep learning methods are able to extract useful information from such data, such as contact location and applied normal force, without the need for analytical models. The final result is a fully integrated, sensorized robot finger with a low wire count, built using easily accessible manufacturing methods and designed for easy integration into dexterous manipulators.
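To make the data-driven mapping concrete, the sketch below shows one way such a model could look: a small fully connected network that takes the flattened matrix of emitter-receiver light-transport readings and regresses a 3D contact location and a scalar normal force. This is an illustrative assumption only, not the architecture used in the paper; the emitter/receiver counts, layer sizes, and names (`TouchNet`, `NUM_EMITTERS`, etc.) are hypothetical.

```python
# Minimal sketch (assumed architecture, not the paper's implementation) of a
# network mapping overlapping light signals to contact location and normal force.
import torch
import torch.nn as nn

NUM_EMITTERS = 30    # hypothetical number of light emitters in the waveguide
NUM_RECEIVERS = 30   # hypothetical number of light receivers
NUM_SIGNALS = NUM_EMITTERS * NUM_RECEIVERS  # one reading per emitter-receiver pair


class TouchNet(nn.Module):
    """Regress contact location (x, y, z) and normal force from raw light signals."""

    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(NUM_SIGNALS, 256), nn.ReLU(),
            nn.Linear(256, 128), nn.ReLU(),
        )
        self.location_head = nn.Linear(128, 3)  # contact point on the finger surface
        self.force_head = nn.Linear(128, 1)     # applied normal force

    def forward(self, signals):
        features = self.backbone(signals)
        return self.location_head(features), self.force_head(features)


# Example forward pass on a batch of synthetic signal vectors.
model = TouchNet()
batch = torch.rand(8, NUM_SIGNALS)
location, force = model(batch)
print(location.shape, force.shape)  # torch.Size([8, 3]) torch.Size([8, 1])
```

In practice, such a model would be trained on calibration data pairing recorded signal sets with ground-truth contact locations and forces; the point of the sketch is simply that the mapping can be learned directly from the signals without an analytical model of light transport.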