A Near-Optimal Sampling Strategy for Sparse Recovery of Polynomial Chaos Expansions

25 February 2017
Negin Alemazkoor
Hadi Meidani

Papers citing "A Near-Optimal Sampling Strategy for Sparse Recovery of Polynomial Chaos Expansions"

5 / 5 papers shown
1. Multi-fidelity Machine Learning for Uncertainty Quantification and Optimization
   Ruda Zhang, Negin Alemazkoor
   30 Oct 2024

2. On efficient algorithms for computing near-best polynomial approximations to high-dimensional, Hilbert-valued functions from limited samples
   Ben Adcock, Simone Brugiapaglia, N. Dexter, S. Moraga
   25 Mar 2022

3. Sparse Polynomial Chaos Expansions: Literature Survey and Benchmark
   Nora Lüthen, S. Marelli, Bruno Sudret
   04 Feb 2020

4. Sparse Polynomial Chaos Expansions via Compressed Sensing and D-optimal Design
   Paul Diaz, Alireza Doostan, Jerrad Hampton
   29 Dec 2017

5. A preconditioning approach for improved estimation of sparse polynomial chaos expansions
   Negin Alemazkoor, Hadi Meidani
   22 Sep 2017