## DeepONet

swMATH ID: 42093

Software Authors: Lu Lu, Pengzhan Jin, George Em Karniadakis

Description: DeepONet: Learning nonlinear operators for identifying differential equations based on the universal approximation theorem of operators. While it is widely known that neural networks are universal approximators of continuous functions, a less known and perhaps more powerful result is that a neural network with a single hidden layer can approximate accurately any nonlinear continuous operator. This universal approximation theorem is suggestive of the potential application of neural networks in learning nonlinear operators from data. However, the theorem guarantees only a small approximation error for a sufficiently large network, and does not consider the important optimization and generalization errors. To realize this theorem in practice, we propose deep operator networks (DeepONets) to learn operators accurately and efficiently from a relatively small dataset. A DeepONet consists of two sub-networks, one for encoding the input function at a fixed number of sensors x_i, i = 1, …, m (branch net), and another for encoding the locations of the output function (trunk net). We perform systematic simulations for identifying two types of operators, i.e., dynamic systems and partial differential equations, and demonstrate that DeepONet significantly reduces the generalization error compared to fully connected networks. We also derive theoretically the dependence of the approximation error on the number of sensors (where the input function is defined) as well as on the input function type, and we verify the theorem with computational results. More importantly, we observe high-order error convergence in our computational tests, namely polynomial rates (from half order to fourth order) and even exponential convergence with respect to the training dataset size.
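
The description above outlines the architecture: a branch net encodes the input function u sampled at m fixed sensors, a trunk net encodes a query location y, and the two feature vectors are combined by a dot product to approximate G(u)(y). Below is a minimal sketch of that forward pass in PyTorch; the layer widths, activations, and output size p are hypothetical choices, and the reference implementation at https://github.com/lululxvi/deeponet differs in details such as backend, initialization, and training setup.

```python
import torch
import torch.nn as nn

class DeepONet(nn.Module):
    """Minimal DeepONet sketch: branch net + trunk net + dot product."""

    def __init__(self, m: int = 100, p: int = 40):
        super().__init__()
        # Branch net: encodes the input function u sampled at m fixed sensors.
        self.branch = nn.Sequential(
            nn.Linear(m, 40), nn.Tanh(), nn.Linear(40, p)
        )
        # Trunk net: encodes the query location y of the output function.
        self.trunk = nn.Sequential(
            nn.Linear(1, 40), nn.Tanh(), nn.Linear(40, p), nn.Tanh()
        )
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, u_sensors: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        # u_sensors: (batch, m) sensor values; y: (batch, 1) query locations.
        b = self.branch(u_sensors)  # (batch, p)
        t = self.trunk(y)           # (batch, p)
        # G(u)(y) ~ sum_k b_k(u) * t_k(y) + bias  (dot product over p features)
        return (b * t).sum(dim=1, keepdim=True) + self.bias

# Usage: predict G(u)(y) for a batch of (input function, location) pairs.
model = DeepONet()
u = torch.randn(8, 100)   # 8 input functions, each sampled at 100 sensors
y = torch.rand(8, 1)      # one query location per input function
print(model(u, y).shape)  # torch.Size([8, 1])
```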

Homepage: https://arxiv.org/abs/1910.03193

Source Code: https://github.com/lululxvi/deeponet

Related Software: Adam; PDE-Net; DeepXDE; PyTorch; DGM; FPINNs; PINNsNTK; TensorFlow; ImageNet; XPINNs; NSFnets; GitHub; DiffSharp; AlexNet; Chebfun; torchdiffeq; MgNet; L-BFGS; Wasserstein GAN; JAX

Cited in: 59 Publications

### Cited by 170 Authors

### Cited in 20 Serials
