
Generalized Balancing Weights via Deep Neural Networks

Abstract

We present generalized balancing weights, Neural Balancing Weights (NBW), for estimating the causal effects of an arbitrary mixture of discrete and continuous interventions. The weights are obtained by directly estimating the density ratio between the source and balanced distributions through optimizing the variational representation of the $f$-divergence. For this we select the $\alpha$-divergence, since it has good properties for optimization: it admits a $\sqrt{N}$-consistent estimator, yields unbiased mini-batch gradients, and mitigates the vanishing-gradient problem. In addition, we provide a method for checking how well the weights balance the resulting distribution; if the balancing is imperfect, the weights can be improved by adding new balancing weights. Our method can be conveniently implemented with any existing deep-learning library, and the weights can be used with most state-of-the-art supervised learning algorithms.
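To make the density-ratio step concrete, the following is a minimal PyTorch sketch, not the authors' implementation: the network architecture, the $\alpha$-divergence generator convention $f(u) = (u^\alpha - \alpha u + \alpha - 1)/(\alpha(\alpha-1))$, the choice of the balanced distribution as the product of marginals $p(x)p(a)$, and the in-batch shuffling used to sample from it are all assumptions made for illustration.

```python
# Sketch: density-ratio estimation via the variational lower bound of the
# alpha-divergence, in the style described by the abstract. Illustrative only.
import torch
import torch.nn as nn

class LogRatioNet(nn.Module):
    """Parameterizes log r_theta(x, a); exponentiating keeps the ratio positive."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, xa):
        # Clamp is a crude numerical guard for this sketch, not part of the method.
        return self.net(xa).squeeze(-1).clamp(-8.0, 8.0)

def alpha_div_lower_bound(log_r_p, log_r_q, alpha=0.5):
    """Variational lower bound on D_alpha(P || Q) for the generator
    f(u) = (u^alpha - alpha*u + alpha - 1) / (alpha*(alpha - 1)).
    The bound is tight when r_theta = dP/dQ, so maximizing it fits the ratio.
    """
    term_p = (torch.exp((alpha - 1.0) * log_r_p) - 1.0) / (alpha - 1.0)
    term_q = (torch.exp(alpha * log_r_q) - 1.0) / alpha
    return term_p.mean() - term_q.mean()

# Toy data: x = covariates, a = a continuous treatment correlated with x.
torch.manual_seed(0)
n = 4096
x = torch.randn(n, 2)
a = 0.8 * x[:, :1] + 0.3 * torch.randn(n, 1)

model = LogRatioNet(dim=3)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(500):
    # P: the observed source joint p(x, a).
    xa_p = torch.cat([x, a], dim=1)
    # Q: the "balanced" product p(x)p(a), sampled here by shuffling a in-batch.
    xa_q = torch.cat([x, a[torch.randperm(n)]], dim=1)
    loss = -alpha_div_lower_bound(model(xa_p), model(xa_q), alpha=0.5)
    opt.zero_grad()
    loss.backward()
    opt.step()

# r_theta approximates p(x,a) / (p(x)p(a)); balancing weights are its inverse.
with torch.no_grad():
    w = torch.exp(-model(torch.cat([x, a], dim=1)))
```

Under these assumptions, maximizing the bound drives $r_\theta$ toward $p(x,a)/(p(x)p(a))$, so the balancing weights on the source samples are $w = 1/r_\theta$; the refinement step mentioned in the abstract would then correspond to estimating a fresh ratio on the $w$-weighted data and composing it with the existing weights.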
