ResearchTrend.AI


Stochastic geometry to generalize the Mondrian Process

3 February 2020
Eliza O'Reilly
N. Tran
Abstract

The Mondrian process is a stochastic process that produces a recursive partition of space with random axis-aligned cuts. Random forests and Laplace kernel approximations built from the Mondrian process have led to efficient online learning methods and Bayesian optimization. By viewing the Mondrian process as a special case of the stable under iterated tessellation (STIT) process, we utilize tools from stochastic geometry to resolve some fundamental questions concerning the Mondrian process in machine learning. First, we show that the Mondrian process with general cut directions can be efficiently simulated by lifting to a higher dimensional axis-aligned Mondrian process. Second, we characterize all possible kernels that generalizations of the Mondrian process can approximate with fixed parameters as well as additional kernels obtained from mixtures of STIT processes. This includes, for instance, various forms of the weighted Laplace kernel and the exponential kernel. Lastly, we give an explicit formula for the density estimator arising from a Mondrian forest. This allows for precise comparisons between the Mondrian forest, the Mondrian kernel and the Laplace kernel in density estimation. Our paper calls for further developments at the novel intersection of stochastic geometry and machine learning.
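As background for the abstract above, the axis-aligned Mondrian process it builds on can be sampled by a simple recursion: wait an exponential time with rate equal to the box's total side length, and if that time fits within the remaining lifetime, cut a side chosen proportionally to its length at a uniform location, then recurse on both halves. The following is a minimal sketch of that standard sampler (function and variable names are illustrative, not from the paper):

```python
import random

def sample_mondrian(box, budget, rng=random.Random(0)):
    """Recursively sample an axis-aligned Mondrian partition.

    box: list of (low, high) intervals, one per dimension.
    budget: remaining lifetime of the process.
    Returns the list of leaf boxes of the partition.
    """
    lengths = [hi - lo for lo, hi in box]
    # Time to the next cut is exponential with rate = sum of side lengths.
    t = rng.expovariate(sum(lengths))
    if t > budget:
        return [box]  # no cut occurs within the remaining lifetime
    # Choose the cut dimension proportionally to side length,
    # and the cut location uniformly along that side.
    d = rng.choices(range(len(box)), weights=lengths)[0]
    lo, hi = box[d]
    cut = rng.uniform(lo, hi)
    left, right = list(box), list(box)
    left[d] = (lo, cut)
    right[d] = (cut, hi)
    rest = budget - t
    return sample_mondrian(left, rest, rng) + sample_mondrian(right, rest, rng)

# Partition the unit square with lifetime 3.0.
leaves = sample_mondrian([(0.0, 1.0), (0.0, 1.0)], budget=3.0)
```

The leaf boxes always tile the original box exactly, so their areas sum to the area of the input box; the paper's contribution is, in part, showing that versions of this process with general (non-axis-aligned) cut directions can be simulated by lifting to a higher-dimensional axis-aligned instance of the same recursion.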
