Emulation of CPU-demanding reactive transport models: a comparison of Gaussian processes, polynomial chaos expansion, and deep neural networks

Research output: Contribution to journal › Article


Emulation of CPU-demanding reactive transport models: a comparison of Gaussian processes, polynomial chaos expansion, and deep neural networks. / Laloy, Eric; Jacques, Diederik.

In: Computational Geosciences, Vol. 23, No. 5, 01.10.2019, p. 1193-1215.


BibTeX

@article{d72c229482234ecc907286943f7ce0c1,
title = "Emulation of CPU-demanding reactive transport models: a comparison of Gaussian processes, polynomial chaos expansion, and deep neural networks",
abstract = "Simulating the fate and transport of radionuclides and other reactive solutes in the vadose zone and aquifers requires reactive transport models (RTMs). These RTMs can be computationally demanding, and any task that requires many RTM runs may benefit from the construction of an emulator or “surrogate” model. Here we present a detailed benchmarking of three methods for the non-intrusive emulation of moderately low-dimensional (that is, 8- to 13-dimensional) CPU-intensive RTMs: Gaussian processes (GP), polynomial chaos expansion (PCE), and deep neural networks (DNNs). State-of-the-art open-source libraries are used for each emulation method, while the CPU time incurred by one forward run of the two considered RTMs varies from 1 h for one model to between 1 h 30 and 5 days for the other. Even with distributed computing, these large computational demands limit the offline creation of training examples to at most 500 samples. We consider four emulation-based tasks: (1) direct or plain emulation, (2) global sensitivity analysis (GSA), (3) uncertainty propagation (UP), and (4) probabilistic calibration or inversion. Overall, our selected DNN outperforms GP and PCE for plain emulation, GSA, and UP, even though the training sets contain only 75 to 500 samples. Most surprisingly, despite its superior emulation capabilities, the chosen DNN is the worst-performing method for the considered synthetic inverse problem, which involves 1224 measurements with low noise. This is at least partially caused by the very small but complex deterministic noise that plagues the DNN-based predictions. This complicated bias can drive the emulated solutions far from the true solution when the available measurement data are of high quality. Among the three considered methods, only GP finds emulated posterior solutions that simultaneously (1) fit the synthetic high-quality measurement data to the correct noise level and (2) most closely approximate the true model parameter values.",
keywords = "Emulation, Machine learning, Surrogate model, Reactive transport models, Gaussian processes, Polynomial chaos expansion, Deep neural networks, Sensitivity analysis, Uncertainty propagation, Inverse modeling",
author = "Eric Laloy and Diederik Jacques",
note = "Score=10",
year = "2019",
month = "10",
day = "1",
doi = "10.1007/s10596-019-09875-y",
language = "English",
volume = "23",
pages = "1193--1215",
journal = "Computational Geosciences",
issn = "1420-0597",
publisher = "Kluwer Academic Publishers",
number = "5",
}

RIS

TY - JOUR

T1 - Emulation of CPU-demanding reactive transport models: a comparison of Gaussian processes, polynomial chaos expansion, and deep neural networks

AU - Laloy, Eric

AU - Jacques, Diederik

N1 - Score=10

PY - 2019/10/1

Y1 - 2019/10/1

N2 - Simulating the fate and transport of radionuclides and other reactive solutes in the vadose zone and aquifers requires reactive transport models (RTMs). These RTMs can be computationally demanding, and any task that requires many RTM runs may benefit from the construction of an emulator or “surrogate” model. Here we present a detailed benchmarking of three methods for the non-intrusive emulation of moderately low-dimensional (that is, 8- to 13-dimensional) CPU-intensive RTMs: Gaussian processes (GP), polynomial chaos expansion (PCE), and deep neural networks (DNNs). State-of-the-art open-source libraries are used for each emulation method, while the CPU time incurred by one forward run of the two considered RTMs varies from 1 h for one model to between 1 h 30 and 5 days for the other. Even with distributed computing, these large computational demands limit the offline creation of training examples to at most 500 samples. We consider four emulation-based tasks: (1) direct or plain emulation, (2) global sensitivity analysis (GSA), (3) uncertainty propagation (UP), and (4) probabilistic calibration or inversion. Overall, our selected DNN outperforms GP and PCE for plain emulation, GSA, and UP, even though the training sets contain only 75 to 500 samples. Most surprisingly, despite its superior emulation capabilities, the chosen DNN is the worst-performing method for the considered synthetic inverse problem, which involves 1224 measurements with low noise. This is at least partially caused by the very small but complex deterministic noise that plagues the DNN-based predictions. This complicated bias can drive the emulated solutions far from the true solution when the available measurement data are of high quality. Among the three considered methods, only GP finds emulated posterior solutions that simultaneously (1) fit the synthetic high-quality measurement data to the correct noise level and (2) most closely approximate the true model parameter values.
KW - Emulation

KW - Machine learning

KW - Surrogate model

KW - Reactive transport models

KW - Gaussian processes

KW - Polynomial chaos expansion

KW - Deep neural networks

KW - Sensitivity analysis

KW - Uncertainty propagation

KW - Inverse modeling

UR - http://ecm.sckcen.be/OTCS/llisapi.dll/open/36098080

U2 - 10.1007/s10596-019-09875-y

DO - 10.1007/s10596-019-09875-y

M3 - Article

VL - 23

SP - 1193

EP - 1215

JO - Computational Geosciences

JF - Computational Geosciences

SN - 1420-0597

IS - 5

ER -
