Causal Inference Despite Limited Global Confounding via Mixture Models

22 December 2021
Spencer Gordon
Bijan Mazaheri
Y. Rabani
Leonard J. Schulman
    CML
Abstract

A Bayesian Network is a directed acyclic graph (DAG) on a set of $n$ random variables (the vertices); a Bayesian Network Distribution (BND) is a probability distribution on the random variables that is Markovian on the graph. A finite $k$-mixture of such models is graphically represented by a larger graph which has an additional ``hidden'' (or ``latent'') random variable $U$, ranging in $\{1,\ldots,k\}$, and a directed edge from $U$ to every other vertex. Models of this type are fundamental to causal inference, where $U$ models an unobserved confounding effect of multiple populations, obscuring the causal relationships in the observable DAG. By solving the mixture problem and recovering the joint probability distribution with $U$, traditionally unidentifiable causal relationships become identifiable. Using a reduction to the more well-studied ``product'' case on empty graphs, we give the first algorithm to learn mixtures of non-empty DAGs.
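To make the model class concrete, the following is a minimal sketch (not the paper's algorithm) of a $k$-mixture of Bayesian network distributions over a toy DAG $X \to Y$, with a hidden component variable $U$ that has an edge to every observed vertex. All numerical parameters, variable names, and the DAG itself are illustrative assumptions; the sketch only shows how marginalizing over the latent $U$ confounds the observable $X \to Y$ relation, and why recovering the mixture makes the interventional quantity identifiable.

# Sketch of a k-mixture of BNDs on the toy DAG X -> Y with latent confounder U.
# All CPT values below are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)

k = 2                       # number of mixture components (values of U)
pi = np.array([0.5, 0.5])   # mixing weights P(U = u)

# Component-specific CPTs for the observed DAG X -> Y (both binary):
# p_x[u]    = P(X = 1 | U = u)
# p_y[u, x] = P(Y = 1 | X = x, U = u)
p_x = np.array([0.2, 0.8])
p_y = np.array([[0.1, 0.3],
                [0.6, 0.9]])

def sample(n):
    """Draw n samples; U is latent, so only (X, Y) would be observed."""
    u = rng.choice(k, size=n, p=pi)
    x = (rng.random(n) < p_x[u]).astype(int)
    y = (rng.random(n) < p_y[u, x]).astype(int)
    return u, x, y

u, x, y = sample(100_000)

# Marginalizing over the hidden U confounds the observable relation:
# P(Y=1 | X=1) (observational) differs from
# P(Y=1 | do(X=1)) = sum_u P(U=u) * P(Y=1 | X=1, U=u),
# which becomes computable once the mixture (pi and the per-component CPTs) is recovered.
obs = y[x == 1].mean()
do = (pi * p_y[:, 1]).sum()
print(f"P(Y=1 | X=1)     ~ {obs:.3f}  (observational, U hidden)")
print(f"P(Y=1 | do(X=1)) = {do:.3f}  (identifiable given the mixture)")

With these assumed parameters the two quantities differ (roughly 0.78 vs 0.60), which is exactly the gap that solving the mixture problem closes.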
