In addition to high cost and complex setup, a major factor limiting three-dimensional (3D) displays is the difficulty of accurately estimating the user's current point-of-gaze (PoG) in 3D space. In this paper, we present a novel noncontact technique for PoG estimation in a stereoscopic environment, which integrates a 3D stereoscopic display system with an eye-tracking system. The 3D stereoscopic display system provides users with a comfortable, immersive, high-definition viewing experience without requiring them to wear any equipment. To accurately locate the user's 3D PoG in the field of view, we build a regression-based 3D eye-tracking model that takes eye movement data and stereo stimulus videos as input. In addition, to train an optimal regression model, we design and annotate a dataset containing the eye-tracking data of 30 users recorded on two designed stereo test scenes. Notably, this dataset introduces feature vectors between eye-region landmarks for gaze vector estimation and a combined feature set for gaze depth estimation. Five traditional regression models are trained and evaluated on this dataset. Experimental results show that the average errors of the estimated 3D PoG are about 0.90~cm along the X-axis, 0.83~cm along the Y-axis, and 1.48~cm and 0.12~m along the Z-axis for scene-depth ranges of 75~cm and 8~m, respectively.
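To make the regression-based pipeline concrete, the following is a minimal sketch, not the authors' code, of the idea the abstract describes: feature vectors derived from pairwise distances between eye-region landmarks are regressed onto the 3D PoG coordinates, with separate estimators mirroring the gaze vector (X, Y) and gaze depth (Z) split. The landmark count, the synthetic data, and the choice of scikit-learn random-forest regressors are illustrative assumptions; the paper evaluates five traditional regression models on its annotated dataset.
\begin{verbatim}
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

def landmark_features(landmarks):
    """Pairwise-distance feature vector between eye-region landmarks.

    landmarks: (N, 2) array of 2D landmark positions for one frame.
    Returns a flat vector of all N*(N-1)/2 pairwise distances.
    """
    diffs = landmarks[:, None, :] - landmarks[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    iu = np.triu_indices(len(landmarks), k=1)
    return dists[iu]

# Hypothetical stand-in data: per-frame landmarks and ground-truth
# 3D PoG labels (the real dataset pairs eye-tracking recordings with
# annotated stereo test scenes).
rng = np.random.default_rng(0)
n_frames, n_landmarks = 2000, 12
landmarks = rng.normal(size=(n_frames, n_landmarks, 2))
pog_xyz = rng.normal(size=(n_frames, 3))  # (x, y, z), e.g. in cm

X = np.stack([landmark_features(f) for f in landmarks])
X_tr, X_te, y_tr, y_te = train_test_split(X, pog_xyz, random_state=0)

# One regressor per axis: X and Y from the landmark features alone,
# Z conceptually from a combined feature set (here the same features,
# for brevity).
models = [RandomForestRegressor(random_state=0).fit(X_tr, y_tr[:, i])
          for i in range(3)]
errors = [mean_absolute_error(y_te[:, i], m.predict(X_te))
          for i, m in enumerate(models)]
print("MAE per axis (x, y, z):", errors)
\end{verbatim}
Reporting a per-axis mean absolute error, as in the last two lines, corresponds directly to the X-, Y-, and Z-axis errors quoted in the experimental results.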