Empirische Inferenz | Conference Paper | 2009

Regression by dependence minimization and its application to causal inference in additive noise models

Motivated by causal inference problems, we propose a novel method for regression that minimizes the statistical dependence between regressors and residuals. The key advantage of this approach to regression is that it does not assume a particular distribution of the noise, i.e., it is non-parametric with respect to the noise distribution. We argue that the proposed regression method is well suited to the task of causal inference in additive noise models. A practical disadvantage is that the resulting optimization problem is generally non-convex and can be difficult to solve. Nevertheless, we report good results on one of the tasks of the NIPS 2008 Causality Challenge, where the goal is to distinguish causes from effects in pairs of statistically dependent variables. In addition, we propose an algorithm for efficiently inferring causal models from observational data for more than two variables. The required number of regressions and independence tests is quadratic in the number of variables, which is a significant improvement over the simple method that tests all possible DAGs.
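The approach described above can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes a Gaussian-kernel HSIC estimator (with median-heuristic bandwidths) as the dependence measure, a cubic polynomial as the regression function class, and SciPy's Nelder-Mead to handle the non-convex objective. The function names (`hsic`, `fit_dependence_min`, `causal_direction`) are illustrative, not from the paper.

```python
# Sketch of regression by dependence minimization (assumptions noted above).
import numpy as np
from scipy.optimize import minimize


def _gram(z, sigma):
    """Gaussian-kernel Gram matrix for a 1-D sample."""
    d = z[:, None] - z[None, :]
    return np.exp(-d**2 / (2.0 * sigma**2))


def hsic(x, y):
    """Biased HSIC estimate with median-heuristic bandwidths."""
    n = len(x)
    iu = np.triu_indices(n, 1)
    sx = np.median(np.abs(x[:, None] - x[None, :])[iu])
    sy = np.median(np.abs(y[:, None] - y[None, :])[iu])
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    return np.trace(_gram(x, sx) @ H @ _gram(y, sy) @ H) / n**2


def fit_dependence_min(x, y, degree=3):
    """Fit y ~ f(x) by minimizing HSIC(x, residuals); returns residuals."""
    def objective(coefs):
        return hsic(x, y - np.polyval(coefs, x))
    init = np.polyfit(x, y, degree)              # least-squares warm start
    res = minimize(objective, init, method="Nelder-Mead")
    return y - np.polyval(res.x, x)


def causal_direction(x, y):
    """Prefer the direction whose residuals are less dependent on the input."""
    fwd = hsic(x, fit_dependence_min(x, y))      # model x -> y
    bwd = hsic(y, fit_dependence_min(y, x))      # model y -> x
    return "x->y" if fwd < bwd else "y->x"
```

The least-squares warm start reflects the caveat in the abstract: the HSIC objective is non-convex, so a local optimizer needs a reasonable initial point. On synthetic data such as `y = x + x**3 + uniform noise`, the forward fit leaves residuals nearly independent of `x`, while the backward fit generally cannot, which is what the direction test exploits.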

Author(s): Mooij, JM. and Janzing, D. and Peters, J. and Schölkopf, B.
Book Title: Proceedings of the 26th International Conference on Machine Learning
Pages: 745-752
Year: 2009
Month: June
Editors: A Danyluk and L Bottou and M Littman
Publisher: ACM Press
Bibtex Type: Conference Paper (inproceedings)
Address: New York, NY, USA
DOI: 10.1145/1553374.1553470
Event Name: ICML 2009
Event Place: Montreal, Canada
Electronic Archiving: grant_archive
Language: en
Organization: Max-Planck-Gesellschaft
School: Biologische Kybernetik
BibTeX

@inproceedings{5869,
  title = {Regression by dependence minimization and its application to causal inference in additive noise models},
  booktitle = {Proceedings of the 26th International Conference on Machine Learning},
  abstract = {Motivated by causal inference problems, we
  propose a novel method for regression that
  minimizes the statistical dependence between
  regressors and residuals. The key advantage
  of this approach to regression is that it does
  not assume a particular distribution of the
  noise, i.e., it is non-parametric with respect
  to the noise distribution. We argue that the
  proposed regression method is well suited to
  the task of causal inference in additive noise
  models. A practical disadvantage is that the
  resulting optimization problem is generally
  non-convex and can be difficult to solve. Nevertheless,
  we report good results on one of the
  tasks of the NIPS 2008 Causality Challenge,
  where the goal is to distinguish causes from
  effects in pairs of statistically dependent variables.
  In addition, we propose an algorithm
  for efficiently inferring causal models from
  observational data for more than two variables.
  The required number of regressions
  and independence tests is quadratic in the
  number of variables, which is a significant improvement
  over the simple method that tests
  all possible DAGs.},
  pages = {745-752},
  editors = {A Danyluk and L Bottou and M Littman},
  publisher = {ACM Press},
  organization = {Max-Planck-Gesellschaft},
  school = {Biologische Kybernetik},
  address = {New York, NY, USA},
  month = jun,
  year = {2009},
  slug = {5869},
  author = {Mooij, JM. and Janzing, D. and Peters, J. and Sch{\"o}lkopf, B.},
  month_numeric = {6}
}