We present an efficient feature selection method that can find all multiplicative combinations of continuous features that are statistically significantly associated with the class variable, while rigorously correcting for multiple testing. The key to overcoming the combinatorial explosion in the number of candidate combinations is a lower bound on the $p$-value of each feature combination, which enables us to prune the vast majority of combinations that can never reach significance and thereby gain statistical power. While this problem has been solved for binary features in the past, we present here the first solution for continuous features. In our experiments, our novel approach detects true feature combinations with higher precision and recall than competing methods that require a prior binarization of the data.
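To illustrate the kind of multiple-testing correction the abstract alludes to (the details are not spelled out here), the following is a minimal Python sketch of a Tarone-style testability threshold: given a hypothetical list of minimum attainable $p$-values, one per candidate combination, it finds the smallest correction factor $k$ such that at most $k$ hypotheses are testable at level $\alpha/k$. Combinations whose lower bound exceeds this threshold can never be significant and need not be tested.

```python
def tarone_threshold(min_pvalues, alpha=0.05):
    """Tarone-style corrected significance threshold.

    min_pvalues: hypothetical lower bounds on the p-value of each
    candidate feature combination (one per combination).
    Returns alpha/k for the smallest k such that the number of
    testable hypotheses (min p-value <= alpha/k) is at most k.
    """
    k = 1
    while sum(p <= alpha / k for p in min_pvalues) > k:
        k += 1
    return alpha / k

# Toy example: only the first two combinations remain testable,
# so the corrected threshold is alpha / 2.
bounds = [0.001, 0.002, 0.3, 0.5]
print(tarone_threshold(bounds, alpha=0.05))
```

This is only a sketch under the assumption that such lower bounds are available; deriving them for continuous features is precisely the paper's contribution.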