Deep active subspaces: A scalable method for high-dimensional uncertainty propagation

27 February 2019
Rohit Tripathy
Ilias Bilionis
arXiv:1902.10527
Abstract

A problem of considerable importance within the field of uncertainty quantification (UQ) is the development of efficient methods for constructing accurate surrogate models. Such efforts are particularly important for applications constrained by high-dimensional uncertain parameter spaces. The difficulty of accurate surrogate modeling in such systems is further compounded by data scarcity brought about by the large cost of forward model evaluations. Traditional response surface techniques, such as Gaussian process regression (or Kriging) and polynomial chaos, are difficult to scale to high dimensions. To make surrogate modeling tractable for expensive high-dimensional systems, one must resort to dimensionality reduction of the stochastic parameter space. A recent dimensionality reduction technique that has shown great promise is the method of 'active subspaces'. The classical formulation of active subspaces, unfortunately, requires gradient information from the forward model, which is often impossible to obtain. In this work, we present a simple, scalable method for recovering active subspaces in high-dimensional stochastic systems without gradient information. The method relies on a reparameterization of the orthogonal active subspace projection matrix, which we couple with deep neural networks. We demonstrate our approach on synthetic and real-world datasets and show favorable predictive performance compared to classical active subspaces.
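For context, the classical active subspace construction diagonalizes the expected outer product of the model's gradients, which is why it needs gradient information. With f the forward model and rho the input density, a standard statement is:

```latex
% Classical active subspace: eigendecomposition of the gradient covariance.
% The leading eigenvectors W_1 span the active subspace, and the model is
% approximated by a low-dimensional "link function" g on that subspace.
C = \int \nabla_x f(x)\, \nabla_x f(x)^{\top} \rho(x)\, dx
  = W \Lambda W^{\top},
\qquad W = [\, W_1 \;\; W_2 \,],
\qquad f(x) \approx g\!\left(W_1^{\top} x\right).
```

The gradient-free alternative described in the abstract can be pictured as a neural network whose first layer is a trainable orthogonal projection, fit directly to input-output data. The sketch below is a minimal illustration, not the authors' implementation: it assumes a Householder-product parameterization to keep the projection matrix orthogonal (the paper's exact reparameterization may differ), and the class name, layer sizes, and toy data are all hypothetical.

```python
# Minimal sketch of a "deep active subspace" surrogate.
# Assumption: orthogonality of the projection is enforced via a product of
# Householder reflections; the paper may use a different parameterization.
import torch
import torch.nn as nn

class DeepActiveSubspace(nn.Module):
    def __init__(self, in_dim, active_dim, hidden=64):
        super().__init__()
        # Unconstrained vectors defining Householder reflections; their
        # product is an orthogonal matrix whose first `active_dim` columns
        # form the projection W (so W^T W = I by construction).
        self.vs = nn.Parameter(torch.randn(active_dim, in_dim))
        self.active_dim = active_dim
        # Standard MLP playing the role of the link function g.
        self.net = nn.Sequential(
            nn.Linear(active_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def projection(self):
        # Product of Householder reflections H = I - 2 v v^T / ||v||^2.
        d = self.vs.shape[1]
        Q = torch.eye(d, device=self.vs.device)
        for v in self.vs:
            v = v / v.norm()
            Q = Q - 2.0 * torch.outer(v, v @ Q)
        # Orthonormal columns spanning the learned active subspace.
        return Q[:, : self.active_dim]

    def forward(self, x):
        # Project the high-dimensional input, then regress on the projection.
        z = x @ self.projection()
        return self.net(z)

# Hypothetical usage: fit to (x, y) pairs from an expensive forward model.
model = DeepActiveSubspace(in_dim=100, active_dim=2)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(256, 100)                  # stand-in inputs
y = (x[:, :2] ** 2).sum(1, keepdim=True)   # stand-in responses
for _ in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()
```

Because each Householder reflection is orthogonal, their product is orthogonal by construction, so the projection stays on the Stiefel manifold throughout training without any explicit constraint or projection step.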
