Publication Repository - Helmholtz-Zentrum Dresden-Rossendorf
Polynomial differentiation decreases the training time complexity of physics-informed neural networks and strengthens their approximation power
Suarez Cardona, J. E.; Hecht, M.
Abstract
We present a novel class of approximations for variational losses, applicable to the training of physics-informed neural networks (PINNs). The formulations reflect classic Sobolev space theory for partial differential equations (PDEs) and their weak formulations.
The loss computation rests on an extension of Gauss-Legendre cubatures, which we term Sobolev cubatures, replacing automatic differentiation (AD). We prove that the runtime complexity of training the resulting Sobolev-cubature PINNs (SC-PINNs) is lower than that of PINNs relying on AD. On top of the one-to-two order of magnitude speed-up, the SC-PINNs are demonstrated to achieve closer solution approximations than established PINNs for prominent forward and inverse (non-linear) PDE problems.
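The core idea the abstract describes, replacing automatic differentiation with polynomial differentiation on cubature nodes, can be illustrated with a minimal sketch. This is an assumption-laden toy example, not the authors' implementation: it differentiates the polynomial interpolant through Gauss-Legendre nodes via a precomputed differentiation matrix, so the derivative in a variational loss becomes a single matrix-vector product instead of an AD graph traversal.

```python
import numpy as np

def diff_matrix(x):
    """Spectral differentiation matrix on distinct nodes x.

    Differentiating the polynomial interpolant through (x_i, f_i)
    reduces to the matrix-vector product D @ f (barycentric form)."""
    n = len(x)
    # barycentric weights: w_i = 1 / prod_{j != i} (x_i - x_j)
    w = np.array([1.0 / np.prod(np.delete(x[i] - x, i)) for i in range(n)])
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                D[i, j] = (w[j] / w[i]) / (x[i] - x[j])
        # diagonal makes row sums vanish (exact for constants)
        D[i, i] = -np.sum(D[i, :])
    return D

# Gauss-Legendre nodes and weights on [-1, 1]
nodes, weights = np.polynomial.legendre.leggauss(16)
D = diff_matrix(nodes)

u = np.sin(nodes)               # sample values of a trial function
du = D @ u                      # polynomial derivative, no AD needed
residual = du - np.cos(nodes)   # residual of the toy ODE u' = cos(x)
loss = weights @ residual**2    # cubature approximation of the L2 loss
```

Because `D` and the weights depend only on the node set, they are computed once and reused across all training iterations; the Sobolev cubatures of the paper extend this cubature-weighted loss construction to Sobolev norms of weak PDE formulations.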
Related publications
- arXiv:2211.15443 is a previous version of this publication (Id 36069)
Machine Learning: Science and Technology 4 (2023), 045005
DOI: 10.1088/2632-2153/acf97a
Permalink: https://www.hzdr.de/publications/Publ-36069