Subsampling of Parametric Models with Bifidelity Boosting

Nuojin Cheng, Osman Asif Malik, Yiming Xu, Stephen Becker, Alireza Doostan, Akil Narayan

Research output: Contribution to journal › Article › peer-review

Abstract

Least squares regression is a ubiquitous tool for building emulators (a.k.a. surrogate models) of problems across science and engineering for purposes such as design space exploration and uncertainty quantification. When the regression data are generated using an experimental design process (e.g., a quadrature grid) involving computationally expensive models, or when the data size is large, sketching techniques have shown promise in reducing the cost of constructing the regression model while ensuring accuracy comparable to that of the full data. However, random sketching strategies, such as those based on leverage scores, lead to regression errors that are random and may exhibit large variability. To mitigate this issue, we present a novel boosting approach that leverages cheaper, lower-fidelity data of the problem at hand to identify the best sketch among a set of candidate sketches. This in turn specifies the sketch of the intended high-fidelity model and the associated data. We provide theoretical analyses of this bifidelity boosting (BFB) approach and discuss the conditions the low- and high-fidelity data must satisfy for successful boosting. In doing so, we derive a bound on the residual norm of the BFB sketched solution, relating it to its ideal, but computationally expensive, high-fidelity boosted counterpart. Empirical results on both manufactured and PDE data corroborate the theoretical analyses and illustrate the efficacy of the BFB solution in reducing the regression error, as compared to the nonboosted solution.
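The following is a minimal sketch, in Python, of the idea the abstract describes: draw several leverage-score-based row sketches, score each one using only the cheap low-fidelity model, and reuse the winning sketch to solve the high-fidelity sketched least squares problem. It is an illustrative reading of the abstract, not the authors' exact algorithm; the function names, the use of exact leverage scores, and the assumption that low- and high-fidelity data share the same sample rows are all simplifying assumptions.

```python
import numpy as np

def leverage_scores(A):
    """Row leverage scores of A via a thin QR factorization."""
    Q, _ = np.linalg.qr(A)
    return np.sum(Q**2, axis=1)

def bifidelity_boosting(A_hi, b_hi, A_lo, b_lo, m, n_boost=10, rng=None):
    """Illustrative bifidelity boosting for sketched least squares.

    A_hi, b_hi : high-fidelity design matrix and data (expensive rows)
    A_lo, b_lo : low-fidelity counterparts on the same sample points (assumed)
    m          : sketch size (number of sampled rows)
    n_boost    : number of candidate sketches to compare
    """
    rng = np.random.default_rng(rng)
    # Sampling probabilities proportional to (shared) leverage scores.
    p = leverage_scores(A_lo)
    p /= p.sum()

    best_idx, best_res = None, np.inf
    for _ in range(n_boost):
        idx = rng.choice(A_lo.shape[0], size=m, replace=True, p=p)
        w = 1.0 / np.sqrt(m * p[idx])  # importance-sampling reweighting
        x_lo = np.linalg.lstsq(w[:, None] * A_lo[idx], w * b_lo[idx], rcond=None)[0]
        # Score this sketch with the cheap full low-fidelity residual.
        res = np.linalg.norm(A_lo @ x_lo - b_lo)
        if res < best_res:
            best_res, best_idx = res, idx

    # Reuse the winning sketch on the high-fidelity data: only m expensive rows needed.
    w = 1.0 / np.sqrt(m * p[best_idx])
    x_bfb = np.linalg.lstsq(w[:, None] * A_hi[best_idx], w * b_hi[best_idx], rcond=None)[0]
    return x_bfb
```

In this toy version the full low-fidelity residual serves as the proxy that selects among random sketches, which mirrors the abstract's claim that the low-fidelity data "identify the best sketch" before any expensive high-fidelity evaluations are committed.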

Original language: English
Pages (from-to): 213-241
Number of pages: 29
Journal: SIAM-ASA Journal on Uncertainty Quantification
Volume: 12
Issue number: 2
DOIs
State: Published - Jun 2024

Bibliographical note

Publisher Copyright:
© 2024 Society for Industrial and Applied Mathematics and American Statistical Association.

Keywords

  • boosting
  • least squares
  • multifidelity
  • polynomial chaos
  • randomized sketching
  • uncertainty quantification

ASJC Scopus subject areas

  • Statistics and Probability
  • Modeling and Simulation
  • Statistics, Probability and Uncertainty
  • Discrete Mathematics and Combinatorics
  • Applied Mathematics
