Reduced basis approximations of parameterized dynamical partial differential equations via neural networks

Peter Sentz, Kristian Beckwith, Eric C. Cyr, Luke N. Olson, Ravi Patel

Research output: Contribution to journal › Article › peer-review

Abstract

Projection-based reduced order models are effective at approximating parameter-dependent differential equations that are parametrically separable. When parametric separability is not satisfied, which occurs in both linear and nonlinear problems, projection-based methods fail to adequately reduce the computational complexity. Devising alternative reduced order models is crucial for obtaining efficient and accurate approximations to expensive high-fidelity models. In this work, we develop a timestepping procedure for dynamical parameter-dependent problems, in which a neural network is trained to propagate the coefficients of a reduced basis expansion. This results in an online stage with a computational cost independent of the size of the underlying problem. We demonstrate our method on several parabolic partial differential equations, including a problem that is not parametrically separable.
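The offline/online split the abstract describes can be sketched in a few lines: build a POD reduced basis from high-fidelity snapshots, fit a map that propagates the reduced coefficients one timestep at a time, and then march only the coefficients online. This is a minimal illustrative sketch, not the paper's method: a linear least-squares operator stands in for the trained neural network, the toy problem is an unparameterized 1D heat equation, and all sizes and names are assumptions.

```python
import numpy as np

# Hypothetical toy setup: explicit-Euler snapshots of a 1D heat equation.
# A linear least-squares propagator stands in for the paper's neural network.
rng = np.random.default_rng(0)
N, steps, dt, nu = 200, 100, 1e-3, 0.01
x = np.linspace(0.0, 1.0, N)
h = x[1] - x[0]
A = (np.diag(-2.0 * np.ones(N)) + np.diag(np.ones(N - 1), 1)
     + np.diag(np.ones(N - 1), -1)) / h**2          # 1D Laplacian stencil
u = np.exp(-100.0 * (x - 0.5) ** 2)                  # initial condition

# --- Offline stage: high-fidelity solve collects the snapshot matrix ---
snaps = [u.copy()]
for _ in range(steps):
    u = u + dt * nu * (A @ u)                        # explicit Euler step
    snaps.append(u.copy())
S = np.stack(snaps, axis=1)                          # N x (steps+1)

# --- POD: reduced basis = leading left singular vectors of snapshots ---
U, s, _ = np.linalg.svd(S, full_matrices=False)
r = 5
V = U[:, :r]                                         # N x r basis

# --- "Train" a propagator on reduced coefficients: c_{k+1} ~ F^T c_k ---
C = V.T @ S                                          # r x (steps+1)
F, *_ = np.linalg.lstsq(C[:, :-1].T, C[:, 1:].T, rcond=None)

# --- Online stage: march r coefficients; cost independent of N ---
c = V.T @ snaps[0]
for _ in range(steps):
    c = F.T @ c                                      # reduced-space timestep
u_rb = V @ c                                         # lift back to full order
err = np.linalg.norm(u_rb - snaps[-1]) / np.linalg.norm(snaps[-1])
```

In the paper's setting the propagator would additionally take the problem parameters as input, so a nonlinear network replaces the linear map `F`; the online loop above still touches only the `r` reduced coefficients, which is the source of the size-independent online cost.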

Original language: English (US)
Pages (from-to): 338-362
Number of pages: 25
Journal: Foundations of Data Science
Volume: 7
Issue number: 1
DOIs
State: Published - Mar 2025

Keywords

  • Finite elements
  • Neural networks
  • Parameterized partial differential equations
  • Proper orthogonal decomposition
  • Reduced basis methods

ASJC Scopus subject areas

  • Analysis
  • Statistics and Probability
  • Computational Theory and Mathematics
  • Applied Mathematics
