Abstract
Projection-based reduced order models are effective at approximating parameter-dependent differential equations that are parametrically separable. When parametric separability does not hold, which occurs in both linear and nonlinear problems, projection-based methods fail to adequately reduce the computational complexity. Devising alternative reduced order models is therefore crucial for obtaining efficient and accurate approximations to expensive high-fidelity models. In this work, we develop a timestepping procedure for dynamical parameter-dependent problems in which a neural network is trained to propagate the coefficients of a reduced basis expansion. This results in an online stage with a computational cost independent of the size of the underlying problem. We demonstrate our method on several parabolic partial differential equations, including a problem that is not parametrically separable.
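The offline/online split the abstract describes can be illustrated with a minimal sketch. This is not the paper's implementation: the high-fidelity model here is a 1D heat equation, the reduced basis is a POD (SVD) basis, and a linear least-squares map stands in for the trained neural network that propagates the reduced coefficients. All names and parameters are illustrative assumptions.

```python
import numpy as np

# Hedged sketch of the two-stage idea, with a 1D heat equation standing in
# for the expensive high-fidelity model.  The linear least-squares one-step
# map below is a stand-in for the trained neural-network propagator.

# --- Offline stage: high-fidelity snapshots (explicit Euler, Dirichlet BCs) ---
N, T, r = 100, 50, 5                      # full size, time steps, reduced size
dx = 1.0 / (N + 1)
x = np.linspace(dx, 1.0 - dx, N)
L = (np.diag(-2.0 * np.ones(N))
     + np.diag(np.ones(N - 1), 1)
     + np.diag(np.ones(N - 1), -1)) / dx**2
dt = 0.4 * dx**2                          # stable step for explicit Euler
u = np.sin(np.pi * x) + 0.5 * np.sin(3 * np.pi * x)
snapshots = [u]
for _ in range(T):
    u = u + dt * (L @ u)
    snapshots.append(u)
U = np.column_stack(snapshots)            # N x (T+1) snapshot matrix

# Reduced basis via proper orthogonal decomposition (truncated SVD)
V = np.linalg.svd(U, full_matrices=False)[0][:, :r]

# Fit a one-step map on reduced coefficients, a_{k+1} ~= C a_k.
# (This is where a neural network would be trained in the paper's method.)
A = V.T @ U
C = np.linalg.lstsq(A[:, :-1].T, A[:, 1:].T, rcond=None)[0].T

# --- Online stage: per-step cost depends only on r, not on N ---
a = V.T @ U[:, 0]                         # project the initial condition
for _ in range(T):
    a = C @ a                             # one r-by-r matvec per time step
u_rom = V @ a                             # lift back to the full space
rel_err = np.linalg.norm(u_rom - U[:, -1]) / np.linalg.norm(U[:, -1])
```

Because each online step touches only the r reduced coefficients, the rollout cost is independent of the full discretization size N, which is the property the abstract highlights.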
Original language | English (US) |
---|---|
Pages (from-to) | 338-362 |
Number of pages | 25 |
Journal | Foundations of Data Science |
Volume | 7 |
Issue number | 1 |
DOIs | |
State | Published - Mar 2025 |
Keywords
- Finite elements
- Neural networks
- Parameterized partial differential equations
- Proper orthogonal decomposition
- Reduced basis methods
ASJC Scopus subject areas
- Analysis
- Statistics and Probability
- Computational Theory and Mathematics
- Applied Mathematics