Projects

Topics in Mathematics Münster

Deep learning and surrogate methods

Current Cluster Publications of Dr. Benno Kuckuck

$\bullet $ Lukas Gonon, Arnulf Jentzen, Benno Kuckuck, Siyu Liang, Adrian Riekert, and Philippe von Wurstemberger. An overview on machine learning methods for partial differential equations: from physics informed neural networks to deep operator learning. arXiv e-prints, August 2024. arXiv:2408.13222.

$\bullet $ Julia Ackermann, Arnulf Jentzen, Benno Kuckuck, and Joshua Lee Padgett. Deep neural networks with ReLU, leaky ReLU, and softplus activation provably overcome the curse of dimensionality for space-time solutions of semilinear partial differential equations. arXiv e-prints, June 2024. arXiv:2406.10876.

$\bullet $ Arnulf Jentzen, Benno Kuckuck, and Philippe von Wurstemberger. Mathematical introduction to deep learning: methods, implementations, and theory. arXiv e-prints, October 2023. arXiv:2310.20360.

$\bullet $ Julia Ackermann, Arnulf Jentzen, Thomas Kruse, Benno Kuckuck, and Joshua Lee Padgett. Deep neural networks with ReLU, leaky ReLU, and softplus activation provably overcome the curse of dimensionality for Kolmogorov partial differential equations with Lipschitz nonlinearities in the $L^p$-sense. arXiv e-prints, September 2023. arXiv:2309.13722.

$\bullet $ Christian Beck, Martin Hutzenthaler, Arnulf Jentzen, and Benno Kuckuck. An overview on deep learning-based approximation methods for partial differential equations. Discrete Contin. Dyn. Syst. Ser. B, 28(6):3697–3746, June 2023. doi:10.3934/dcdsb.2022238.

$\bullet $ Arnulf Jentzen, Benno Kuckuck, Thomas Müller-Gronbach, and Larisa Yaroslavtseva. Counterexamples to local Lipschitz and local Hölder continuity with respect to the initial values for additive noise driven stochastic differential equations with smooth drift coefficient functions with at most polynomially growing derivatives. Discrete Contin. Dyn. Syst. Ser. B, 27(7):3707–3724, July 2022. doi:10.3934/dcdsb.2021203.

$\bullet $ Christian Beck, Arnulf Jentzen, and Benno Kuckuck. Full error analysis for the training of deep neural networks. Infin. Dimens. Anal. Quantum Probab. Relat. Top., 25(2):2150020, June 2022. doi:10.1142/S021902572150020X.

$\bullet $ Victor Boussange, Sebastian Becker, Arnulf Jentzen, Benno Kuckuck, and Loïc Pellissier. Deep learning approximations for non-local nonlinear PDEs with Neumann boundary conditions. arXiv e-prints, May 2022. arXiv:2205.03672.

$\bullet $ Pierfrancesco Beneventano, Patrick Cheridito, Robin Graeber, Arnulf Jentzen, and Benno Kuckuck. Deep neural network approximation theory for high-dimensional functions. arXiv e-prints, December 2021. arXiv:2112.14523.

$\bullet $ Martin Hutzenthaler, Arnulf Jentzen, Benno Kuckuck, and Joshua Lee Padgett. Strong $L^p$-error analysis of nonlinear Monte Carlo approximations for high-dimensional semilinear partial differential equations. arXiv e-prints, October 2021. arXiv:2110.08297.

$\bullet $ Arnulf Jentzen, Benno Kuckuck, Thomas Müller-Gronbach, and Larisa Yaroslavtseva. On the strong regularity of degenerate additive noise driven stochastic differential equations with respect to their initial values. J. Math. Anal. Appl., 502(2):125240, October 2021. doi:10.1016/j.jmaa.2021.125240.

$\bullet $ Arnulf Jentzen, Benno Kuckuck, Ariel Neufeld, and Philippe von Wurstemberger. Strong error analysis for stochastic gradient descent optimization algorithms. IMA J. Numer. Anal., 41(1):455–492, January 2021. doi:10.1093/imanum/drz055.