Abstract: Domain adaptation aims to use training data from one or multiple source domains to learn a hypothesis that generalizes to a different, but related, target domain. As such, a reliable measure for evaluating the discrepancy of both marginal and conditional distributions is crucial. We introduce the Cauchy-Schwarz (CS) divergence to the problem of unsupervised domain adaptation (UDA). The CS divergence offers a theoretically tighter generalization error bound than the popular Kullback-Leibler divergence. This holds for the general case of supervised learning, including multi-class classification and regression. Furthermore, we illustrate that the CS divergence enables a simple estimator of the discrepancy between both marginal and conditional distributions of the source and target domains in the representation space, without requiring any distributional assumptions. We provide multiple examples to illustrate how the CS divergence can be conveniently used in both distance metric-based and adversarial training-based UDA frameworks, resulting in compelling performance.
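As context for the estimator mentioned above, the sketch below shows one common way to estimate the CS divergence between two sample sets with a Gaussian kernel; it is a minimal illustration, not the paper's implementation, and the bandwidth choice, function names, and variables (X for source features, Y for target features) are assumptions for the example.

```python
import numpy as np

def gaussian_gram(A, B, sigma=1.0):
    """Pairwise Gaussian kernel values between the rows of A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def cs_divergence(X, Y, sigma=1.0):
    """Kernel-based empirical Cauchy-Schwarz divergence between X ~ p and Y ~ q.

    D_CS(p, q) = -log( (integral of p*q)^2 / (integral of p^2 * integral of q^2) ),
    estimated here with Parzen windows; the kernel normalization constant
    cancels in the ratio, so it is omitted.
    """
    pp = gaussian_gram(X, X, sigma).mean()  # estimates integral of p(x)^2
    qq = gaussian_gram(Y, Y, sigma).mean()  # estimates integral of q(x)^2
    pq = gaussian_gram(X, Y, sigma).mean()  # estimates integral of p(x)q(x)
    return np.log(pp) + np.log(qq) - 2 * np.log(pq)

# Example: discrepancy between source and target representations (toy data).
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 8))
Y = rng.normal(0.5, 1.0, size=(200, 8))
print(cs_divergence(X, Y))  # non-negative; zero only when the estimates coincide
```

The quantity is non-negative by the Cauchy-Schwarz inequality, which is what makes it usable as a discrepancy term in either a distance metric-based or an adversarial training-based UDA objective.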
Abstract: This paper investigates an approach to both speed up business decision-making and lower the cost of learning through experimentation by factorizing business policies and employing fractional factorial experimental designs for their evaluation. We illustrate how this method integrates with advances in the estimation of heterogeneous treatment effects, elaborating on its advantages and foundational assumptions. We empirically demonstrate the implementation and benefits of our approach and assess its validity in evaluating consumer promotion policies at DoorDash, one of the largest delivery platforms in the US. Our approach discovers a policy with 5% incremental profit at 67% lower implementation cost.
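To make the design idea concrete, the sketch below generates a standard two-level half-fraction (2^(k-1)) design, which runs half the cells of a full factorial by aliasing one factor with the product of the others; the factor count and defining relation are illustrative assumptions, not the design used in the paper.

```python
from itertools import product

def half_fraction_design(k=4):
    """Two-level 2^(k-1) half-fraction design.

    The first k-1 factors take all +/-1 combinations; the last factor is set
    to their product (defining relation), halving the number of runs.
    """
    runs = []
    for levels in product([-1, 1], repeat=k - 1):
        last = 1
        for v in levels:
            last *= v  # last factor aliased with the interaction of the others
        runs.append(levels + (last,))
    return runs

# Example: 4 policy factors evaluated in 8 runs instead of the 16 a full factorial needs.
for run in half_fraction_design(4):
    print(run)
```

Halving (or further fractionating) the number of experimental cells is the mechanism behind the lower implementation cost referenced in the abstract, at the price of confounding some higher-order interactions.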