Success in racing requires a unique combination of vehicle setup, understanding of the racetrack, and human expertise. Since building and testing many different vehicle configurations in the real world is prohibitively expensive, high-fidelity simulation is a critical part of racecar development. However, evaluating how different vehicle configurations perform on different racetracks still requires expert human input. In this work, we present the first steps towards an autonomous test driver, trained using deep reinforcement learning (RL), capable of evaluating the effect of changes in vehicle setup on racing performance while driving at the level of the best human drivers. In addition, the autonomous driver model can be tuned to exhibit more human-like behavioral patterns by incorporating imitation learning into the RL training process. This extension opens the possibility of driver-specific vehicle setup optimization.
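As a minimal sketch of how imitation learning can be folded into RL training (the precise formulation is an assumption, not a claim about the method used here), one option is to augment the RL objective with a weighted behavioral-cloning term over human demonstrations:
$$
\mathcal{L}(\theta) \;=\; \mathcal{L}_{\mathrm{RL}}(\theta) \;+\; \lambda \, \mathbb{E}_{(s,a) \sim \mathcal{D}_{\mathrm{human}}}\!\left[ -\log \pi_\theta(a \mid s) \right],
$$
where $\pi_\theta$ is the driving policy, $\mathcal{D}_{\mathrm{human}}$ is a dataset of human driving trajectories, and $\lambda \geq 0$ trades off raw racing performance against human-like behavior.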