Learning Subspaces of Different Dimension

Abstract
We introduce a Bayesian model for inferring mixtures of subspaces of different dimensions. The key challenge in such a model is specifying prior distributions over subspaces of different dimensions. We address this challenge by embedding the subspaces, or Grassmann manifolds, into a sphere of relatively low dimension and specifying priors on the sphere. We provide an efficient sampling algorithm for the posterior distribution of the model parameters, and we prove posterior consistency of the procedure. The utility of this approach is demonstrated with applications to real and simulated data.
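The abstract does not spell out the embedding, but a standard way to place subspaces of different dimensions on a common sphere is the Conway-style embedding via projection matrices: map a k-dimensional subspace of R^n to the traceless part of its orthogonal projection matrix and normalize to unit Frobenius norm. The sketch below illustrates that construction only; the paper's exact embedding may differ, and the function name and normalization are assumptions for illustration.

```python
import numpy as np

def embed_subspace(basis, ambient_dim):
    """Sketch of a Conway-style embedding of a subspace into a sphere,
    via its orthogonal projection matrix (illustrative; not necessarily
    the paper's exact construction)."""
    # Orthonormalize a basis of the k-dimensional subspace of R^n.
    Q, _ = np.linalg.qr(np.asarray(basis, dtype=float))
    k = Q.shape[1]
    n = ambient_dim
    # Orthogonal projection onto the subspace: symmetric, idempotent, trace k.
    P = Q @ Q.T
    # Remove the trace component so subspaces of every dimension k land in
    # the same linear space of traceless symmetric matrices.
    A = P - (k / n) * np.eye(n)
    # Normalize to unit Frobenius norm so all subspaces map to one sphere.
    return A / np.linalg.norm(A, 'fro')

# Usage: a 1-D line and a 2-D plane in R^3 both map to unit-norm points,
# i.e. points on the same sphere, despite having different dimensions.
line = embed_subspace(np.array([[1.0], [1.0], [0.0]]), ambient_dim=3)
plane = embed_subspace(np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]]), ambient_dim=3)
print(np.linalg.norm(line, 'fro'), np.linalg.norm(plane, 'fro'))  # both ~1.0
```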