We introduce a variable importance measure to quantify the contribution of individual variables to a decision made by a black box function. Our measure is based on the Shapley value from cooperative game theory. Measures of variable importance usually work by changing the value of one or more variables while holding the others fixed and then recomputing the function of interest. That approach is problematic because it can create unrealistic combinations of predictor values that never occur in practice and were absent from the data used to fit the prediction function. Our cohort refinement Shapley approach measures variable importance using only data points that were actually observed.
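To make the contrast concrete, the following is a minimal sketch of the cohort idea in Python; it is an illustration under stated assumptions, not the paper's implementation. The names `cohort_shapley` and `similar`, and the 0.1 similarity threshold in the usage example, are all hypothetical. The value of a subset of variables is taken to be the mean prediction over the cohort of observed points that are similar to the target on every variable in that subset, and these values are combined with the standard Shapley weights. No synthetic input is ever evaluated; only predictions at observed points are averaged.

```python
import numpy as np
from itertools import combinations
from math import comb

def cohort_shapley(X, f_values, target_idx, similar):
    """Cohort-style Shapley values for one target point (illustrative sketch).

    X          : (n, d) array of observed predictor values.
    f_values   : (n,) array of black-box predictions, precomputed at X only.
    target_idx : index of the observed point being explained.
    similar    : similar(column, value) -> boolean mask marking which observed
                 points are deemed similar to the target on that variable.
    """
    n, d = X.shape
    x = X[target_idx]

    def cohort_mean(S):
        # Cohort: observed points similar to the target on every variable in S.
        # S = () gives the full sample, so the empty-set value is the grand mean.
        mask = np.ones(n, dtype=bool)
        for j in S:
            mask &= similar(X[:, j], x[j])
        return f_values[mask].mean()

    # Standard Shapley formula over subsets of the other variables;
    # the weight |S|!(d-|S|-1)!/d! equals 1 / (d * C(d-1, |S|)).
    phi = np.zeros(d)
    for j in range(d):
        others = [k for k in range(d) if k != j]
        for size in range(d):
            w = 1.0 / (d * comb(d - 1, size))
            for S in combinations(others, size):
                phi[j] += w * (cohort_mean(S + (j,)) - cohort_mean(S))
    return phi

# Usage example with synthetic data: explain point 0, treating values
# within 0.1 of the target's as "similar" (threshold is an assumption).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
f_values = X[:, 0] + 2 * X[:, 1]  # stand-in for black-box outputs at observed points
sim = lambda col, v: np.abs(col - v) <= 0.1
print(cohort_shapley(X, f_values, 0, sim))
```

The sketch enumerates all variable subsets, so it is exponential in the number of variables and practical only for small d; its purpose is to show that every quantity involved is an average of predictions at observed points, never a prediction at a perturbed, unobserved input.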