Topics in Mathematics Münster
T7: Field theory and randomness
T9: Multi-scale processes and effective behaviour
T10: Deep learning and surrogate methods
Further Projects
• Overcoming the curse of dimensionality through nonlinear stochastic algorithms: Nonlinear Monte Carlo type methods for high-dimensional approximation problems
• Dynamical systems and irregular gradient flows
Research Interests
• Mathematics for machine learning
• Numerical approximations for high-dimensional partial differential equations
• Numerical approximations for stochastic differential equations
• Deep learning
Selected Publications
• Hairer M, Hutzenthaler M, Jentzen A. Loss of regularity for Kolmogorov equations. Annals of Probability Vol. 43 (2), 2015, pp. 468-527
• Hutzenthaler M, Jentzen A. On a perturbation theory and on strong convergence rates for stochastic ordinary and partial differential equations with nonglobally monotone coefficients. Annals of Probability Vol. 48 (1), 2020, pp. 53-93
• Hutzenthaler M, Jentzen A, Kloeden PE. Strong and weak divergence in finite time of Euler's method for stochastic differential equations with non-globally Lipschitz continuous coefficients. Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences Vol. 467 (2130), 2011, pp. 1563-1576
• Fehrman B, Gess B, Jentzen A. Convergence rates for the stochastic gradient descent method for non-convex objective functions. Journal of Machine Learning Research Vol. 21, 2020, Paper No. 136, 48 pp.
• Han J, Jentzen A, E W. Solving high-dimensional partial differential equations using deep learning. Proceedings of the National Academy of Sciences of the United States of America Vol. 115 (34), 2018, pp. 8505-8510
• E W, Han J, Jentzen A. Deep learning-based numerical methods for high-dimensional parabolic partial differential equations and backward stochastic differential equations. Communications in Mathematics and Statistics Vol. 5 (4), 2017, pp. 349-380
• Hutzenthaler M, Jentzen A, Kloeden PE. Strong convergence of an explicit numerical method for SDEs with nonglobally Lipschitz continuous coefficients. Annals of Applied Probability Vol. 22 (4), 2012, pp. 1611-1641
• Hutzenthaler M, Jentzen A. Numerical approximations of stochastic differential equations with non-globally Lipschitz continuous coefficients. Memoirs of the American Mathematical Society Vol. 236 (1112), 2015, v+99 pp.
• Hutzenthaler M, Jentzen A, Kruse T, Nguyen TA, von Wurstemberger P. Overcoming the curse of dimensionality in the numerical approximation of semilinear parabolic partial differential equations. Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences Vol. 476 (2244), 2020, Paper No. 20190630, 25 pp.
Current Cluster Publications of Prof. Dr. Arnulf Jentzen
$\bullet $ Sonja Cox, Arnulf Jentzen, and Felix Lindner. Weak convergence rates for temporal numerical approximations of the semilinear stochastic wave equation with multiplicative noise. Numer. Math., 156(6):2131–2177, December 2024. doi:10.1007/s00211-024-01425-8.
$\bullet $ Sonja Cox, Arnulf Jentzen, and Felix Lindner. Weak convergence rates for temporal numerical approximations of stochastic wave equations with multiplicative noise. Numer. Math., September 2024. arXiv:1901.05535.
$\bullet $ Patrick Cheridito, Arnulf Jentzen, and Florian Rossmannek. Gradient descent provably escapes saddle points in the training of shallow ReLU networks. Journal of Optimization Theory and Applications, September 2024. doi:10.1007/s10957-024-02513-3.
$\bullet $ Lukas Gonon, Arnulf Jentzen, Benno Kuckuck, Siyu Liang, Adrian Riekert, and Philippe von Wurstemberger. An overview on machine learning methods for partial differential equations: from physics informed neural networks to deep operator learning. arXiv e-prints, August 2024. arXiv:2408.13222.
$\bullet $ Steffen Dereich, Robin Graeber, and Arnulf Jentzen. Non-convergence of Adam and other adaptive stochastic gradient descent optimization methods for non-vanishing learning rates. arXiv e-prints, July 2024. arXiv:2407.08100.
$\bullet $ Steffen Dereich and Arnulf Jentzen. Convergence rates for the Adam optimizer. arXiv e-prints, July 2024. arXiv:2407.21078.
$\bullet $ Julia Ackermann, Arnulf Jentzen, Benno Kuckuck, and Joshua Lee Padgett. Deep neural networks with ReLU, leaky ReLU, and softplus activation provably overcome the curse of dimensionality for space-time solutions of semilinear partial differential equations. arXiv e-prints, June 2024. arXiv:2406.10876.
$\bullet $ Steffen Dereich, Arnulf Jentzen, and Adrian Riekert. Learning rate adaptive stochastic gradient descent optimization methods: numerical simulations for deep learning methods for partial differential equations and convergence analyses. arXiv e-prints, June 2024. arXiv:2406.14340.
$\bullet $ Fabian Hornung, Arnulf Jentzen, and Diyora Salimova. Space-time deep neural network approximations for high-dimensional partial differential equations. Journal of Computational Mathematics, 0(0):0–0, June 2024. doi:10.4208/jcm.2308-m2021-0266.
$\bullet $ Anselm Hudde, Martin Hutzenthaler, Arnulf Jentzen, and Sara Mazzonetto. On the Itô–Alekseev–Gröbner formula for stochastic differential equations. Annales de l’Institut Henri Poincaré, Probabilités et Statistiques, May 2024. doi:10.1214/21-aihp1199.
$\bullet $ Sonja Cox, Martin Hutzenthaler, and Arnulf Jentzen. Local Lipschitz continuity in the initial value and strong completeness for nonlinear stochastic differential equations. Memoirs of the American Mathematical Society, April 2024. doi:10.1090/memo/1481.
$\bullet $ Arnulf Jentzen and Adrian Riekert. Non-convergence to global minimizers for Adam and stochastic gradient descent optimization and constructions of local minimizers in the training of artificial neural networks. arXiv e-prints, February 2024. arXiv:2402.05155.
$\bullet $ Sebastian Becker, Arnulf Jentzen, Marvin S. Müller, and Philippe von Wurstemberger. Learning the random variables in Monte Carlo simulations with stochastic gradient descent: Machine learning for parametric PDEs and financial derivative pricing. Mathematical Finance, 34(1):90–150, January 2024. doi:10.1111/mafi.12405.
$\bullet $ Christian Beck, Sebastian Becker, Patrick Cheridito, Arnulf Jentzen, and Ariel Neufeld. An efficient Monte Carlo scheme for Zakai equations. Communications in Nonlinear Science and Numerical Simulation, 126:107438, November 2023. doi:10.1016/j.cnsns.2023.107438.
$\bullet $ Arnulf Jentzen and Timo Welti. Overall error analysis for the training of deep neural networks via stochastic gradient descent with random initialisation. Applied Mathematics and Computation, 455:127907, October 2023. doi:10.1016/j.amc.2023.127907.
$\bullet $ Arnulf Jentzen, Benno Kuckuck, and Philippe von Wurstemberger. Mathematical introduction to deep learning: methods, implementations, and theory. arXiv e-prints, October 2023. arXiv:2310.20360.
$\bullet $ Julia Ackermann, Arnulf Jentzen, Thomas Kruse, Benno Kuckuck, and Joshua Lee Padgett. Deep neural networks with ReLU, leaky ReLU, and softplus activation provably overcome the curse of dimensionality for Kolmogorov partial differential equations with Lipschitz nonlinearities in the $L^p$-sense. arXiv e-prints, September 2023. arXiv:2309.13722.
$\bullet $ Philipp Grohs, Shokhrukh Ibragimov, Arnulf Jentzen, and Sarah Koppensteiner. Lower bounds for artificial neural network approximations: A proof that shallow neural networks fail to overcome the curse of dimensionality. J. Complexity, 77:101746, 53 pp., August 2023. doi:10.1016/j.jco.2023.101746.
$\bullet $ Martin Hutzenthaler, Arnulf Jentzen, Thomas Kruse, and Tuan Anh Nguyen. Overcoming the curse of dimensionality in the numerical approximation of backward stochastic differential equations. J. Numer. Math., 31(2):1–28, July 2023. doi:10.1515/jnma-2021-0111.
$\bullet $ Christian Beck, Martin Hutzenthaler, Arnulf Jentzen, and Benno Kuckuck. An overview on deep learning-based approximation methods for partial differential equations. Discrete and Continuous Dynamical Systems - B, 28(6):3697–3746, June 2023. doi:10.3934/dcdsb.2022238.
$\bullet $ Philipp Grohs, Fabian Hornung, Arnulf Jentzen, and Philippe von Wurstemberger. A proof that artificial neural networks overcome the curse of dimensionality in the numerical approximation of Black–Scholes partial differential equations. Memoirs of the American Mathematical Society, April 2023. doi:10.1090/memo/1410.
$\bullet $ Sebastian Becker, Benjamin Gess, Arnulf Jentzen, and Peter E. Kloeden. Strong convergence rates for explicit space-time discrete numerical approximations of stochastic Allen-Cahn equations. Stochastics and Partial Differential Equations: Analysis and Computations, 11(1):211–268, April 2023. doi:10.1007/s40072-021-00226-6.
$\bullet $ Simon Eberle, Arnulf Jentzen, Adrian Riekert, and Georg S. Weiss. Existence, uniqueness, and convergence rates for gradient flows in the training of artificial neural networks with ReLU activation. Electron. Res. Arch., 31(5):2519–2554, March 2023. doi:10.3934/era.2023128.
$\bullet $ Christian Beck, Arnulf Jentzen, Konrad Kleinberg, and Thomas Kruse. Nonlinear Monte Carlo methods with polynomial runtime for Bellman equations of discrete time high-dimensional stochastic optimal control problems. arXiv e-prints, March 2023. arXiv:2303.03390.
$\bullet $ Arnulf Jentzen and Adrian Riekert. Strong overall error analysis for the training of artificial neural networks via random initializations. Commun. Math. Stat., 50 pp., March 2023. doi:10.1007/s40304-022-00292-9.
$\bullet $ Arnulf Jentzen, Adrian Riekert, and Philippe von Wurstemberger. Algorithmically designed artificial neural networks (ADANNs): Higher order deep operator learning for parametric partial differential equations. arXiv e-prints, February 2023. arXiv:2302.03286.
$\bullet $ Steffen Dereich, Arnulf Jentzen, and Sebastian Kassing. On the existence of minimizers in shallow residual ReLU neural network optimization landscapes. arXiv e-prints, February 2023. arXiv:2302.14690.
$\bullet $ Arnulf Jentzen and Adrian Riekert. Convergence analysis for gradient flows in the training of artificial neural networks with ReLU activation. J. Math. Anal. Appl., 517(2):126601, January 2023. doi:10.1016/j.jmaa.2022.126601.
$\bullet $ Lukas Gonon, Robin Graeber, and Arnulf Jentzen. The necessity of depth for artificial neural networks to approximate certain classes of smooth and bounded functions without the curse of dimensionality. arXiv e-prints, January 2023. arXiv:2301.08284.
$\bullet $ Philipp Grohs, Fabian Hornung, Arnulf Jentzen, and Philipp Zimmermann. Space-time error estimates for deep neural network approximations for differential equations. Advances in Computational Mathematics, 49(1):4, January 2023. doi:10.1007/s10444-022-09970-2.
$\bullet $ Shokhrukh Ibragimov, Arnulf Jentzen, and Adrian Riekert. Convergence to good non-optimal critical points in the training of neural networks: Gradient descent optimization with one random initialization overcomes all bad non-global local minima with high probability. arXiv e-prints, December 2022. arXiv:2212.13111.
$\bullet $ Davide Gallon, Arnulf Jentzen, and Felix Lindner. Blow up phenomena for gradient descent optimization methods in the training of artificial neural networks. arXiv e-prints, November 2022. arXiv:2211.15641.
$\bullet $ Arnulf Jentzen and Adrian Riekert. A proof of convergence for stochastic gradient descent in the training of artificial neural networks with ReLU activation for constant target functions. Z. Angew. Math. Phys., 73(5):188, August 2022. doi:10.1007/s00033-022-01716-w.
$\bullet $ Arnulf Jentzen and Adrian Riekert. A proof of convergence for the gradient descent optimization method with random initializations in the training of neural networks with ReLU activation for piecewise linear target functions. J. Mach. Learn. Res., 23(260):1–50, August 2022.
$\bullet $ Martin Hutzenthaler, Arnulf Jentzen, and Thomas Kruse. Overcoming the curse of dimensionality in the numerical approximation of parabolic partial differential equations with gradient-dependent nonlinearities. Found. Comput. Math., 22(4):905–966, August 2022. doi:10.1007/s10208-021-09514-y.
$\bullet $ Lukas Gonon, Philipp Grohs, Arnulf Jentzen, David Kofler, and David Šiška. Uniform error estimates for artificial neural network approximations for heat equations. IMA J. Numer. Anal., 42(3):1991–2054, July 2022. doi:10.1093/imanum/drab027.
$\bullet $ Simon Eberle, Arnulf Jentzen, Adrian Riekert, and Georg Weiss. Normalized gradient flow optimization in the training of ReLU artificial neural networks. arXiv e-prints, July 2022. arXiv:2207.06246.
$\bullet $ Arnulf Jentzen, Benno Kuckuck, Thomas Müller-Gronbach, and Larisa Yaroslavtseva. Counterexamples to local Lipschitz and local Hölder continuity with respect to the initial values for additive noise driven stochastic differential equations with smooth drift coefficient functions with at most polynomially growing derivatives. Discrete Contin. Dyn. Syst. Ser. B, 27(7):3707–3724, July 2022. doi:10.3934/dcdsb.2021203.
$\bullet $ Patrick Cheridito, Arnulf Jentzen, and Florian Rossmannek. Landscape analysis for shallow neural networks: Complete classification of critical points for affine target functions. J. Nonlinear Sci., 32(5):64, July 2022. doi:10.1007/s00332-022-09823-8.
$\bullet $ Patrick Cheridito, Arnulf Jentzen, and Florian Rossmannek. Efficient approximation of high-dimensional functions with neural networks. IEEE Trans. Neural Netw. Learn. Syst., 33(7):3079–3093, July 2022. doi:10.1109/TNNLS.2021.3049719.
$\bullet $ Arnulf Jentzen and Timo Kröger. On bounds for norms of reparameterized ReLU artificial neural network parameters: sums of fractional powers of the Lipschitz norm control the network parameter vector. arXiv e-prints, June 2022. arXiv:2206.13646.
$\bullet $ Christian Beck, Arnulf Jentzen, and Benno Kuckuck. Full error analysis for the training of deep neural networks. Infin. Dimens. Anal. Quantum Probab. Relat. Top., 25(2):2150020, June 2022. doi:10.1142/S021902572150020X.
$\bullet $ Philipp Grohs, Arnulf Jentzen, and Diyora Salimova. Deep neural network approximations for solutions of PDEs based on Monte Carlo algorithms. Partial Differ. Equ. Appl., 3(4):Paper No. 45, 41, June 2022. doi:10.1007/s42985-021-00100-z.
$\bullet $ Patrick Cheridito, Arnulf Jentzen, Adrian Riekert, and Florian Rossmannek. A proof of convergence for gradient descent in the training of artificial neural networks for constant target functions. J. Complexity, 72:101646, 26 pp., June 2022. doi:10.1016/j.jco.2022.101646.
$\bullet $ Arnulf Jentzen and Adrian Riekert. On the existence of global minima and convergence analyses for gradient descent methods in the training of deep neural networks. J. Mach. Learn., 1(2):141–246, June 2022. doi:10.4208/jml.220114a.
$\bullet $ Victor Boussange, Sebastian Becker, Arnulf Jentzen, Benno Kuckuck, and Loïc Pellissier. Deep learning approximations for non-local nonlinear PDEs with Neumann boundary conditions. arXiv e-prints, May 2022. arXiv:2205.03672.
$\bullet $ Shokhrukh Ibragimov, Arnulf Jentzen, Timo Kröger, and Adrian Riekert. On the existence of infinitely many realization functions of non-global local minima in the training of artificial neural networks with ReLU activation. arXiv e-prints, February 2022. arXiv:2202.11481.
$\bullet $ Sebastian Becker, Arnulf Jentzen, Marvin S. Müller, and Philippe von Wurstemberger. Learning the random variables in Monte Carlo simulations with stochastic gradient descent: Machine learning for parametric PDEs and financial derivative pricing. arXiv e-prints, February 2022. arXiv:2202.02717.
$\bullet $ Pierfrancesco Beneventano, Patrick Cheridito, Robin Graeber, Arnulf Jentzen, and Benno Kuckuck. Deep neural network approximation theory for high-dimensional functions. arXiv e-prints, December 2021. arXiv:2112.14523.
$\bullet $ Martin Hutzenthaler, Arnulf Jentzen, Katharina Pohl, Adrian Riekert, and Luca Scarpa. Convergence proof for stochastic gradient descent in the training of deep neural networks with ReLU activation for constant target functions. arXiv e-prints, December 2021. arXiv:2112.07369.
$\bullet $ Weinan E, Jiequn Han, and Arnulf Jentzen. Algorithms for solving high dimensional PDEs: From nonlinear Monte Carlo to machine learning. Nonlinearity, 35(1):278–310, December 2021. doi:10.1088/1361-6544/ac337f.
$\bullet $ Ladislas Naurois, Arnulf Jentzen, and Timo Welti. Weak convergence rates for spatial spectral Galerkin approximations of semilinear stochastic wave equations with multiplicative noise. Appl. Math. Optim., 84:1187–1217, November 2021. doi:10.1007/s00245-020-09744-6.
$\bullet $ Weinan E, Martin Hutzenthaler, Arnulf Jentzen, and Thomas Kruse. Multilevel Picard iterations for solving smooth semilinear parabolic heat equations. Partial Differ. Equ. Appl., 2:Paper No. 80, November 2021. doi:10.1007/s42985-021-00089-5.
$\bullet $ Martin Hutzenthaler, Arnulf Jentzen, Benno Kuckuck, and Joshua Lee Padgett. Strong $L^p$-error analysis of nonlinear Monte Carlo approximations for high-dimensional semilinear partial differential equations. arXiv e-prints, October 2021. arXiv:2110.08297.
$\bullet $ Arnulf Jentzen, Benno Kuckuck, Thomas Müller-Gronbach, and Larisa Yaroslavtseva. On the strong regularity of degenerate additive noise driven stochastic differential equations with respect to their initial values. J. Math. Anal. Appl., 502(2):125240, October 2021. doi:10.1016/j.jmaa.2021.125240.
$\bullet $ Christian Beck, Sebastian Becker, Patrick Cheridito, Arnulf Jentzen, and Ariel Neufeld. Deep splitting method for parabolic PDEs. SIAM J. Sci. Comput., 43(5):A3135–A3154, September 2021. doi:10.1137/19M1297919.
$\bullet $ Arnulf Jentzen, Felix Lindner, and Primož Pušnik. Spatial Sobolev regularity for stochastic Burgers equations with additive trace class noise. Nonlinear Anal., 210:112310, September 2021. doi:10.1016/j.na.2021.112310.
$\bullet $ Christian Beck, Lukas Gonon, Martin Hutzenthaler, and Arnulf Jentzen. On existence and uniqueness properties for solutions of stochastic fixed point equations. Discrete Contin. Dyn. Syst. Ser. B, 26(9):4927–4962, September 2021. doi:10.3934/dcdsb.2020320.
$\bullet $ Arnulf Jentzen, Diyora Salimova, and Timo Welti. A proof that deep artificial neural networks overcome the curse of dimensionality in the numerical approximation of Kolmogorov partial differential equations with constant diffusion and nonlinear drift coefficients. Commun. Math. Sci., 19(5):1167–1205, July 2021. doi:10.4310/CMS.2021.v19.n5.a1.
$\bullet $ Christian Beck, Sebastian Becker, Philipp Grohs, Nor Jaafari, and Arnulf Jentzen. Solving the Kolmogorov PDE by means of deep learning. J. Sci. Comput., 88(3):Paper No. 73, 28 pp., July 2021. doi:10.1007/s10915-021-01590-0.
$\bullet $ Christian Beck, Martin Hutzenthaler, and Arnulf Jentzen. On nonlinear Feynman-Kac formulas for viscosity solutions of semilinear parabolic partial differential equations. Stoch. Dyn., 21(8):2150048, July 2021. doi:10.1142/S0219493721500489.
$\bullet $ Sebastian Becker, Patrick Cheridito, Arnulf Jentzen, and Timo Welti. Solving high-dimensional optimal stopping problems using deep learning. European J. Appl. Math., 32:470–514, June 2021. doi:10.1017/S0956792521000073.
$\bullet $ Patrick Cheridito, Arnulf Jentzen, and Florian Rossmannek. Non-convergence of stochastic gradient descent in the training of deep neural networks. J. Complexity, 64:101540, June 2021. doi:10.1016/j.jco.2020.101540.
$\bullet $ Arnulf Jentzen and Ryan Kurniawan. Weak convergence rates for Euler-type approximations of semilinear stochastic evolution equations with nonlinear diffusion coefficients. Found. Comput. Math., 21(2):445–536, May 2021. doi:10.1007/s10208-020-09448-x.
$\bullet $ Dennis Elbrächter, Philipp Grohs, Arnulf Jentzen, and Christoph Schwab. DNN expression rate analysis of high-dimensional PDEs: Application to option pricing. Constr. Approx., 55:3–71, May 2021. doi:10.1007/s00365-021-09541-6.
$\bullet $ Christian Beck, Martin Hutzenthaler, Arnulf Jentzen, and Emilia Magnani. Full history recursive multilevel Picard approximations for ordinary differential equations with expectations. arXiv e-prints, March 2021. arXiv:2103.02350.
$\bullet $ Arnulf Jentzen and Timo Kröger. Convergence rates for gradient descent in the training of overparameterized artificial neural networks with biases. arXiv e-prints, February 2021. arXiv:2102.11840.
$\bullet $ Sonja Cox, Martin Hutzenthaler, and Arnulf Jentzen. Local Lipschitz continuity in the initial value and strong completeness for nonlinear stochastic differential equations. arXiv e-prints, February 2021. arXiv:1309.5595.
$\bullet $ Adam Andersson, Arnulf Jentzen, and Ryan Kurniawan. Existence, uniqueness, and regularity for stochastic evolution equations with irregular initial values. J. Math. Anal. Appl., 495(1):124558, 33, January 2021. doi:10.1016/j.jmaa.2020.124558.
$\bullet $ Arnulf Jentzen, Benno Kuckuck, Ariel Neufeld, and Philippe von Wurstemberger. Strong error analysis for stochastic gradient descent optimization algorithms. IMA J. Numer. Anal., 41(1):455–492, January 2021. doi:10.1093/imanum/drz055.
$\bullet $ Sonja Cox, Martin Hutzenthaler, Arnulf Jentzen, Jan van Neerven, and Timo Welti. Convergence in Hölder norms with applications to Monte Carlo methods in infinite dimensions. IMA J. Numer. Anal., 41(1):493–548, January 2021. doi:10.1093/imanum/drz063.
$\bullet $ Arnulf Jentzen, Felix Lindner, and Primož Pušnik. Exponential moment bounds and strong convergence rates for tamed-truncated numerical approximations of stochastic convolutions. Numer. Algorithms, 85(4):1447–1473, December 2020. doi:10.1007/s11075-019-00871-y.
$\bullet $ Christian Beck, Fabian Hornung, Martin Hutzenthaler, Arnulf Jentzen, and Thomas Kruse. Overcoming the curse of dimensionality in the numerical approximation of Allen-Cahn partial differential equations via truncated full-history recursive multilevel Picard approximations. J. Numer. Math., 28(4):197–222, December 2020. doi:10.1515/jnma-2019-0074.
$\bullet $ Pierfrancesco Beneventano, Patrick Cheridito, Arnulf Jentzen, and Philippe von Wurstemberger. High-dimensional approximation spaces of artificial neural networks and applications to partial differential equations. arXiv e-prints, December 2020. arXiv:2012.04326.
$\bullet $ Martin Hutzenthaler, Arnulf Jentzen, Thomas Kruse, Tuan Anh Nguyen, and Philippe von Wurstemberger. Overcoming the curse of dimensionality in the numerical approximation of semilinear parabolic partial differential equations. Proc. A., 476(2244):20190630, 25, December 2020. doi:10.1098/rspa.2019.0630.
$\bullet $ Christian Beck, Sebastian Becker, Patrick Cheridito, Arnulf Jentzen, and Ariel Neufeld. Deep learning based numerical approximation algorithms for stochastic partial differential equations and high-dimensional nonlinear filtering problems. arXiv e-prints, December 2020. arXiv:2012.01194.
$\bullet $ Sebastian Becker, Ramon Braunwarth, Martin Hutzenthaler, Arnulf Jentzen, and Philippe von Wurstemberger. Numerical simulations for full history recursive multilevel Picard approximations for systems of high-dimensional partial differential equations. Commun. Comput. Phys., 28(5):2109–2138, November 2020. doi:10.4208/cicp.OA-2020-0130.
$\bullet $ Christian Beck, Arnulf Jentzen, and Thomas Kruse. Nonlinear Monte Carlo methods with polynomial runtime for high-dimensional iterated nested expectations. arXiv e-prints, September 2020. arXiv:2009.13989.
$\bullet $ Martin Hutzenthaler, Arnulf Jentzen, Thomas Kruse, and Tuan Anh Nguyen. Multilevel Picard approximations for high-dimensional semilinear second-order PDEs with Lipschitz nonlinearities. arXiv e-prints, September 2020. arXiv:2009.02484.
$\bullet $ Martin Hutzenthaler, Arnulf Jentzen, and Philippe von Wurstemberger. Overcoming the curse of dimensionality in the approximative pricing of financial derivatives with default risks. Electron. J. Probab., 25:Paper No. 101, 73, August 2020. doi:10.1214/20-ejp423.
$\bullet $ Aritz Bercher, Lukas Gonon, Arnulf Jentzen, and Diyora Salimova. Weak error analysis for stochastic gradient descent optimization algorithms. arXiv e-prints, July 2020. arXiv:2007.02723.
$\bullet $ Julius Berner, Philipp Grohs, and Arnulf Jentzen. Analysis of the generalization error: empirical risk minimization over deep artificial neural networks overcomes the curse of dimensionality in the numerical approximation of Black-Scholes partial differential equations. SIAM J. Math. Data Sci., 2(3):631–657, July 2020. doi:10.1137/19M125649X.
$\bullet $ Sebastian Becker, Patrick Cheridito, and Arnulf Jentzen. Pricing and hedging American-style options with deep learning. Journal of Risk and Financial Management, July 2020. doi:10.3390/jrfm13070158.
$\bullet $ Benjamin Fehrman, Benjamin Gess, and Arnulf Jentzen. Convergence rates for the stochastic gradient descent method for non-convex objective functions. J. Mach. Learn. Res., 21(136):1–48, June 2020.
$\bullet $ Sebastian Becker, Benjamin Gess, Arnulf Jentzen, and Peter E. Kloeden. Lower and upper bounds for strong approximation errors for numerical approximations of stochastic heat equations. BIT, 60(4):1057–1073, June 2020. doi:10.1007/s10543-020-00807-2.
$\bullet $ Martin Hutzenthaler, Arnulf Jentzen, Thomas Kruse, and Tuan Anh Nguyen. A proof that rectified deep neural networks overcome the curse of dimensionality in the numerical approximation of semilinear heat equations. Partial Differ. Equ. Appl., 1(2):10, April 2020. doi:10.1007/s42985-019-0006-9.
$\bullet $ Arnulf Jentzen and Primož Pušnik. Strong convergence rates for an explicit numerical approximation method for stochastic evolution equations with non-globally Lipschitz continuous nonlinearities. IMA J. Numer. Anal., 40(2):1005–1050, April 2020. doi:10.1093/imanum/drz009.
$\bullet $ Martin Hutzenthaler and Arnulf Jentzen. On a perturbation theory and on strong convergence rates for stochastic ordinary and partial differential equations with nonglobally monotone coefficients. Ann. Probab., 48(1):53–93, March 2020. doi:10.1214/19-AOP1345.
$\bullet $ Christian Beck, Lukas Gonon, and Arnulf Jentzen. Overcoming the curse of dimensionality in the numerical approximation of high-dimensional semilinear elliptic partial differential equations. arXiv e-prints, March 2020. arXiv:2003.00596.
$\bullet $ Martin Hutzenthaler, Arnulf Jentzen, Felix Lindner, and Primož Pušnik. Strong convergence rates on the whole probability space for space-time discrete numerical approximation schemes for stochastic Burgers equations. arXiv e-prints, November 2019. arXiv:1911.01870.
$\bullet $ Michael B. Giles, Arnulf Jentzen, and Timo Welti. Generalised multilevel Picard approximations. arXiv e-prints, November 2019. arXiv:1911.03188.
$\bullet $ Arnulf Jentzen, Uli Stadtmüller, and Robert Stelzer. Foreword: Special issue on “Stochastic differential equations, stochastic algorithms, and applications”. J. Math. Anal. Appl., 476(1):1, August 2019. doi:10.1016/j.jmaa.2019.03.069.
$\bullet $ Christian Beck, Weinan E, and Arnulf Jentzen. Machine learning approximation algorithms for high-dimensional fully nonlinear partial differential equations and second-order backward stochastic differential equations. J. Nonlinear Sci., 29:1563–1619, August 2019. doi:10.1007/s00332-018-9525-3.
$\bullet $ Daniel Conus, Arnulf Jentzen, and Ryan Kurniawan. Weak convergence rates of spectral Galerkin approximations for SPDEs with nonlinear diffusion coefficients. Ann. Appl. Probab., 29(2):653–716, August 2019. doi:10.1214/17-AAP1352.
$\bullet $ Arnulf Jentzen, Felix Lindner, and Primož Pušnik. On the Alekseev-Gröbner formula in Banach spaces. Discrete Contin. Dyn. Syst. Ser. B, 24(8):4475–4511, August 2019. doi:10.3934/dcdsb.2019128.
$\bullet $ Weinan E, Martin Hutzenthaler, Arnulf Jentzen, and Thomas Kruse. On multilevel Picard numerical approximations for high-dimensional nonlinear parabolic partial differential equations and high-dimensional nonlinear backward stochastic differential equations. J. Sci. Comput., 79(3):1534–1571, June 2019. doi:10.1007/s10915-018-00903-0.
$\bullet $ Giuseppe Da Prato, Arnulf Jentzen, and Michael Röckner. A mild Itô formula for SPDEs. Trans. Amer. Math. Soc., 372:3755–3807, June 2019. doi:10.1090/tran/7165.
$\bullet $ Adam Andersson, Mario Hefter, Arnulf Jentzen, and Ryan Kurniawan. Regularity properties for solutions of infinite dimensional Kolmogorov equations in Hilbert spaces. Potential Anal., 50:347–379, May 2019. doi:10.1007/s11118-018-9685-7.
$\bullet $ Sebastian Becker, Patrick Cheridito, and Arnulf Jentzen. Deep optimal stopping. J. Mach. Learn. Res., 20(74):1–25, April 2019. doi:10.5555/3322706.3362015.
$\bullet $ Arnulf Jentzen, Diyora Salimova, and Timo Welti. Strong convergence for explicit space-time discrete numerical approximation methods for stochastic Burgers equations. J. Math. Anal. Appl., 469:661–704, January 2019. doi:10.1016/j.jmaa.2018.09.032.
$\bullet $ Sebastian Becker and Arnulf Jentzen. Strong convergence rates for nonlinearity-truncated Euler-type approximations of stochastic Ginzburg-Landau equations. Stochastic Process. Appl., 129:28–69, January 2019. doi:10.1016/j.spa.2018.02.008.
$\bullet $ Mario Hefter and Arnulf Jentzen. On arbitrarily slow convergence rates for strong numerical approximations of Cox-Ingersoll-Ross processes and squared Bessel processes. Finance Stoch., 23:139–172, January 2019. doi:10.1007/s00780-018-0375-5.