On online review sites, the helpfulness of user feedback for decision-making is usually assessed by locally studying the properties of individual reviews. However, global properties should also be considered to precisely evaluate the quality of user feedback. In this paper we investigate the role of deviations in review properties as helpfulness determinants, with the intuition that "out of the core" feedback helps item evaluation. We propose a novel helpfulness estimation model that extends previous ones with the analysis of deviations in rating, length, and polarity with respect to the reviews written by the same person or concerning the same item. A regression analysis carried out on two large datasets of reviews extracted from the Yelp social network shows that user-based deviations in review length and rating clearly influence perceived helpfulness. Moreover, an experiment on the same datasets shows that integrating our helpfulness estimation model improves the performance of a collaborative recommender system by enhancing the selection of high-quality data for rating estimation. Our model is thus an effective tool for selecting relevant user feedback to support decision-making.
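The user-based deviation features can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the record layout, the word-count measure of length, and the function name are assumptions made for the example.

```python
from statistics import mean

# Hypothetical review records: (user, item, rating, text).
reviews = [
    ("alice", "cafe", 5, "Great coffee and friendly staff."),
    ("alice", "diner", 4, "Solid breakfast."),
    ("alice", "bar", 1, "Terrible service, long wait, and the food arrived cold."),
    ("bob", "cafe", 3, "Average."),
]

def user_deviations(reviews, user, item):
    """Deviation of one review's rating and length (in words)
    from the means of the same author's reviews."""
    own = [r for r in reviews if r[0] == user]
    ratings = [r[2] for r in own]
    lengths = [len(r[3].split()) for r in own]
    target = next(r for r in own if r[1] == item)
    return (target[2] - mean(ratings),
            len(target[3].split()) - mean(lengths))

# Alice's review of "bar" is much harsher and longer than her usual
# feedback, so both deviations are large in magnitude.
rating_dev, length_dev = user_deviations(reviews, "alice", "bar")
```

Item-based deviations would be computed analogously, filtering on the item field instead of the user field and comparing against the means of all reviews of that item.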