Neural Architecture Search (NAS) has attracted increasing attention because of its exceptional merit in automating the design of Deep Neural Network (DNN) architectures. However, the performance evaluation process, as a key part of NAS, often requires training a large number of DNNs, which inevitably makes NAS computationally expensive. In recent years, many Efficient Evaluation Methods (EEMs) have been proposed to address this critical issue. In this paper, we comprehensively survey the EEMs published to date and provide a detailed analysis to motivate the further development of this research direction. Specifically, we divide the existing EEMs into four categories based on the number of DNNs trained to construct them. This categorization reflects, in principle, the degree of efficiency of each method, which in turn helps readers quickly grasp its methodological features. In surveying each category, we further discuss the design principles and analyze the strengths and weaknesses of the methods to clarify the landscape of existing EEMs and make their research trends easy to understand. Furthermore, we discuss the current challenges and open issues to identify promising directions for future research on this emerging topic. To the best of our knowledge, this is the first work that extensively and systematically surveys the EEMs of NAS.