Mathematics and Computer Science

Prof. Dr. Arnulf Jentzen, Angewandte Mathematik Münster: Institut für Analysis und Numerik

Member of Mathematics Münster
Investigator in Mathematics Münster
Field of expertise: Numerical analysis, machine learning, scientific computing

Private Homepage: https://www.uni-muenster.de/AMM/Jentzen/Mitarbeiter/included.shtml
Research Interests
Mathematics for machine learning
Numerical approximations for high-dimensional partial differential equations
Numerical approximations for stochastic differential equations
Deep Learning
Selected Publications
Hairer M, Hutzenthaler M, Jentzen A: Loss of regularity for Kolmogorov equations. Annals of Probability Vol. 43 (2), 2015, pp. 468-527
Hutzenthaler M, Jentzen A: On a perturbation theory and on strong convergence rates for stochastic ordinary and partial differential equations with nonglobally monotone coefficients. Annals of Probability Vol. 48 (1), 2020, pp. 53-93
Hutzenthaler M, Jentzen A, Kloeden PE: Strong and weak divergence in finite time of Euler's method for stochastic differential equations with non-globally Lipschitz continuous coefficients. Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences Vol. 467 (2130), 2011, pp. 1563-1576
Fehrman B, Gess B, Jentzen A: Convergence rates for the stochastic gradient descent method for non-convex objective functions. Journal of Machine Learning Research Vol. 21, 2020, Paper No. 136, 48 pp.
Han J, Jentzen A, E W: Solving high-dimensional partial differential equations using deep learning. Proceedings of the National Academy of Sciences of the United States of America Vol. 115 (34), 2018, pp. 8505-8510
E W, Han J, Jentzen A: Deep learning-based numerical methods for high-dimensional parabolic partial differential equations and backward stochastic differential equations. Communications in Mathematics and Statistics Vol. 5 (4), 2017, pp. 349-380
Hutzenthaler M, Jentzen A, Kloeden PE: Strong convergence of an explicit numerical method for SDEs with nonglobally Lipschitz continuous coefficients. Annals of Applied Probability Vol. 22 (4), 2012, pp. 1611-1641
Hutzenthaler M, Jentzen A: Numerical approximations of stochastic differential equations with non-globally Lipschitz continuous coefficients. Memoirs of the American Mathematical Society Vol. 236 (1112), 2015, v+99 pp.
Hutzenthaler M, Jentzen A, Kruse T, Nguyen TA, von Wurstemberger P: Overcoming the curse of dimensionality in the numerical approximation of semilinear parabolic partial differential equations. Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences Vol. 476 (2244), 2020, pp. 630-654
Selected Projects
Mathematical Theory for Deep Learning
It is the key goal of this project to provide a rigorous mathematical analysis for deep learning algorithms and thereby to establish mathematical theorems which explain the success and the limitations of deep learning algorithms. In particular, this project aims (i) to provide a mathematical theory for high-dimensional approximation capacities for deep neural networks, (ii) to reveal suitable regular sequences of functions which can be approximated by deep neural networks but not by shallow neural networks without the curse of dimensionality, and (iii) to establish dimension-independent convergence rates for stochastic gradient descent optimization algorithms when employed to train deep neural networks with error constants which grow at most polynomially in the dimension.
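To illustrate the kind of training procedure analysed in this project, the following is a minimal numpy sketch (not code from the project) of plain stochastic gradient descent applied to a small deep ReLU network; the target function, network sizes, learning rate, and batch size are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions (not from the project): approximate
# f(x) = sin(x_1 + ... + x_d) on [0, 1]^d with a two-hidden-layer ReLU
# network trained by plain stochastic gradient descent.
d, width, lr, steps, batch = 10, 64, 1e-2, 5000, 128

W1 = rng.normal(0, np.sqrt(2 / d), (d, width)); b1 = np.zeros(width)
W2 = rng.normal(0, np.sqrt(2 / width), (width, width)); b2 = np.zeros(width)
W3 = rng.normal(0, np.sqrt(2 / width), (width, 1)); b3 = np.zeros(1)
relu = lambda z: np.maximum(z, 0.0)

for step in range(steps):
    x = rng.uniform(0, 1, (batch, d))
    y = np.sin(x.sum(axis=1, keepdims=True))
    # forward pass
    h1 = relu(x @ W1 + b1)
    h2 = relu(h1 @ W2 + b2)
    out = h2 @ W3 + b3
    # backward pass for the minibatch mean squared error
    g_out = 2.0 * (out - y) / batch
    g_W3 = h2.T @ g_out; g_b3 = g_out.sum(axis=0)
    g_h2 = (g_out @ W3.T) * (h2 > 0)
    g_W2 = h1.T @ g_h2; g_b2 = g_h2.sum(axis=0)
    g_h1 = (g_h2 @ W2.T) * (h1 > 0)
    g_W1 = x.T @ g_h1; g_b1 = g_h1.sum(axis=0)
    # stochastic gradient descent update
    for p, g in ((W1, g_W1), (b1, g_b1), (W2, g_W2),
                 (b2, g_b2), (W3, g_W3), (b3, g_b3)):
        p -= lr * g

print("final minibatch mean squared error:", float(np.mean((out - y) ** 2)))
```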
Overcoming the curse of dimensionality: stochastic algorithms for high-dimensional partial differential equations
Partial differential equations (PDEs) are among the most universal tools used in modeling problems in nature and man-made complex systems. The PDEs appearing in applications are often high dimensional. Such PDEs can typically not be solved explicitly, and developing efficient numerical algorithms for high-dimensional PDEs is one of the most challenging tasks in applied mathematics. As is well known, the difficulty lies in the so-called "curse of dimensionality" in the sense that the computational effort of standard approximation algorithms grows exponentially in the dimension of the considered PDE. It is the key objective of this research project to overcome this curse of dimensionality and to construct and analyze new approximation algorithms which solve high-dimensional PDEs with a computational effort that grows at most polynomially in both the dimension of the PDE and the reciprocal of the prescribed approximation precision.
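As a toy illustration of why stochastic algorithms can avoid this exponential blow-up, the following minimal numpy sketch (an illustrative example, not one of the project's algorithms) uses the Feynman-Kac representation u(t,x) = E[phi(x + W_t)] of the d-dimensional heat equation du/dt = (1/2) Laplacian(u), u(0,·) = phi; the number of Monte Carlo samples required for a given accuracy does not grow exponentially in d.

```python
import numpy as np

rng = np.random.default_rng(0)

def heat_equation_mc(phi, x, t, n_samples=10**5):
    """Monte Carlo approximation of u(t, x) = E[phi(x + W_t)], which solves the
    d-dimensional heat equation du/dt = (1/2) * Laplacian(u) with u(0, .) = phi.
    The sampling cost for a fixed accuracy does not grow exponentially in d."""
    d = x.shape[0]
    w = rng.normal(0.0, np.sqrt(t), size=(n_samples, d))
    return float(np.mean(phi(x + w)))

# illustrative example in dimension d = 100 with phi(y) = ||y||^2,
# for which the exact solution is u(t, x) = ||x||^2 + d * t
d = 100
x = np.zeros(d)
print("MC estimate:", heat_equation_mc(lambda y: np.sum(y**2, axis=1), x, t=1.0))
print("exact value:", float(np.sum(x**2) + d * 1.0))
```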
Existence, uniqueness, and regularity properties of solutions of partial differential equations
The goal of this project is to reveal existence, uniqueness, and regularity properties of solutions of partial differential equations (PDEs). In particular, we intend to study existence, uniqueness, and regularity properties of viscosity solutions of degenerate semilinear Kolmogorov PDEs of the parabolic type. We plan to investigate such PDEs by means of probabilistic representations of the Feynman-Kac type. We also intend to study the connections of such PDEs to optimal control problems.
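For orientation, a standard linear special case of the Feynman-Kac type representations referred to here (a textbook identity, not a result of this project) reads as follows: if $X^x$ solves $dX^x_s = \mu(X^x_s)\,ds + \sigma(X^x_s)\,dW_s$ with $X^x_0 = x$, then $u(t,x) = \mathbb{E}[\varphi(X^x_t)]$ solves the linear Kolmogorov PDE
\[
  \tfrac{\partial u}{\partial t}(t,x)
  = \tfrac{1}{2}\,\mathrm{Trace}\bigl(\sigma(x)\sigma(x)^{*}(\mathrm{Hess}_x u)(t,x)\bigr)
  + \bigl\langle \mu(x), (\nabla_x u)(t,x) \bigr\rangle,
  \qquad u(0,x) = \varphi(x);
\]
the degenerate semilinear PDEs studied in this project generalize this linear case.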
Regularity properties and approximations for stochastic ordinary and partial differential equations with non-globally Lipschitz continuous nonlinearities
A number of stochastic ordinary and partial differential equations from the literature (such as, for example, the Heston and the 3/2-model from financial engineering, (overdamped) Langevin-type equations from molecular dynamics, stochastic spatially extended FitzHugh-Nagumo systems from neurobiology, stochastic Navier-Stokes equations, and Cahn-Hilliard-Cook equations) contain non-globally Lipschitz continuous nonlinearities in their drift or diffusion coefficients. A central aim of this project is to investigate regularity properties with respect to the initial values of such stochastic differential equations in a systematic way. A further goal of this project is to analyze the regularity of solutions of the deterministic Kolmogorov partial differential equations associated to such stochastic differential equations. Another aim of this project is to analyze weak and strong convergence and convergence rates of numerical approximations for such stochastic differential equations.
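To make the taming idea behind the explicit schemes studied in the publications listed above concrete, here is a minimal numpy sketch of a tamed Euler-type scheme for the illustrative toy SDE dX_t = -X_t^3 dt + dW_t (cubic, hence non-globally Lipschitz, drift); the time horizon, step number, and initial value are assumptions chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def tamed_euler(x0=2.0, T=1.0, n_steps=1000, n_paths=10**4):
    """Explicit tamed Euler scheme for dX_t = -X_t**3 dt + dW_t.
    The drift increment is 'tamed' by the factor 1 / (1 + h * |drift|),
    which prevents the divergence that the classical Euler-Maruyama scheme
    can exhibit for superlinearly growing drift coefficients."""
    h = T / n_steps
    x = np.full(n_paths, x0)
    for _ in range(n_steps):
        drift = -x**3
        dw = rng.normal(0.0, np.sqrt(h), n_paths)
        x = x + h * drift / (1.0 + h * np.abs(drift)) + dw
    return x

paths = tamed_euler()
print("mean of X_T over", paths.size, "tamed Euler paths:", float(paths.mean()))
```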
Topics in Mathematics Münster


T7: Field theory and randomness
T9: Multi-scale processes and effective behaviour
T10: Deep learning and surrogate methods
Current Publications
Jentzen A, Riekert A: Strong overall error analysis for the training of artificial neural networks via random initializations. Communications in Mathematics and Statistics, 2023
Beck C, Hutzenthaler M, Jentzen A, Kuckuck B: An overview on deep learning-based approximation methods for partial differential equations. Discrete and Continuous Dynamical Systems - Series B Vol. 28 (6), 2023
Jentzen A, Riekert A: Convergence analysis for gradient flows in the training of artificial neural networks with ReLU activation. Journal of Mathematical Analysis and Applications Vol. 517 (2), 2023
Hutzenthaler M, Jentzen A, Kruse T, Nguyen TA: Overcoming the curse of dimensionality in the numerical approximation of backward stochastic differential equations. Journal of Numerical Mathematics Vol. 31 (2), 2023
Grohs P, Ibragimov S, Jentzen A, Koppensteiner S: Lower bounds for artificial neural network approximations: A proof that shallow neural networks fail to overcome the curse of dimensionality. Journal of Complexity, 2023
Grohs P, Hornung F, Jentzen A, Zimmermann P: Space-time error estimates for deep neural network approximations for differential equations. Advances in Computational Mathematics Vol. 49 (1), 2023
Becker S, Gess B, Jentzen A, Kloeden PE: Strong convergence rates for explicit space-time discrete numerical approximations of stochastic Allen-Cahn equations. Stochastics and Partial Differential Equations: Analysis and Computations Vol. 11 (1), 2023
Gonon L, Graeber R, Jentzen A: The necessity of depth for artificial neural networks to approximate certain classes of smooth bounded functions without the curse of dimensionality. 2023
Jentzen A, Riekert A, von Wurstemberger P: Algorithmically Designed Artificial Neural Networks (ADANNs): Higher order deep operator learning for parametric partial differential equations. 2023
Current Projects
Overcoming the curse of dimensionality through nonlinear stochastic algorithms: Nonlinear Monte Carlo type methods for high-dimensional approximation problems

In many relevant real-world problems it is of fundamental importance to approximately compute evaluations of high-dimensional functions. Standard deterministic approximation methods often suffer in this context from the so-called curse of dimensionality in the sense that the number of computational operations of the approximation method grows at least exponentially in the problem dimension. It is the key objective of the ERC-funded MONTECARLO project to employ multilevel Monte Carlo and stochastic gradient descent type methods to design and analyse algorithms which provably overcome the curse of dimensionality in the numerical approximation of several high-dimensional functions; these include solutions of certain stochastic optimal control problems, of some nonlinear partial differential equations, and of certain supervised learning problems.

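The following is a minimal numpy sketch of the multilevel Monte Carlo principle mentioned above, applied to the illustrative example of estimating E[X_T] for geometric Brownian motion via coupled Euler-Maruyama discretizations; the model, parameters, and sample sizes are assumptions chosen for illustration and do not represent the project's actual algorithms.

```python
import numpy as np

rng = np.random.default_rng(0)

def coupled_euler_payoffs(level, n_samples, T=1.0, x0=1.0, mu=0.05, sigma=0.2):
    """Euler-Maruyama approximations of geometric Brownian motion
    dX_t = mu X_t dt + sigma X_t dW_t with 2**level time steps (fine)
    and 2**(level-1) time steps (coarse), driven by the same Brownian
    increments so that the fine and coarse payoffs X_T are coupled."""
    n_fine = 2 ** level
    h_fine = T / n_fine
    dw = rng.normal(0.0, np.sqrt(h_fine), size=(n_samples, n_fine))
    x_fine = np.full(n_samples, x0)
    for k in range(n_fine):
        x_fine = x_fine + mu * x_fine * h_fine + sigma * x_fine * dw[:, k]
    if level == 0:
        return x_fine, np.zeros(n_samples)
    n_coarse = n_fine // 2
    h_coarse = T / n_coarse
    # aggregate pairs of fine Brownian increments into coarse increments
    dw_coarse = dw.reshape(n_samples, n_coarse, 2).sum(axis=2)
    x_coarse = np.full(n_samples, x0)
    for k in range(n_coarse):
        x_coarse = x_coarse + mu * x_coarse * h_coarse + sigma * x_coarse * dw_coarse[:, k]
    return x_fine, x_coarse

def mlmc_estimate(max_level=5, samples=(200000, 100000, 50000, 25000, 12000, 6000)):
    """Telescoping multilevel Monte Carlo estimator of E[X_T]:
    E[P_L] = E[P_0] + sum_{l=1}^{L} E[P_l - P_{l-1}]."""
    estimate = 0.0
    for level in range(max_level + 1):
        fine, coarse = coupled_euler_payoffs(level, samples[level])
        estimate += float(np.mean(fine - coarse))
    return estimate

print("MLMC estimate:", mlmc_estimate())  # exact value: exp(0.05), approx. 1.0513
```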
Dynamical systems and irregular gradient flows
The central goal of this project is to study asymptotic properties of gradient flows (GFs) and related dynamical systems. In particular, we intend to establish a priori bounds and related regularity properties for solutions of GFs, to study the behaviour of GFs near unstable critical regions, to derive lower and upper bounds for attracting regions, and to establish convergence speeds towards global attractors. Special attention will be given to GFs with irregularities (discontinuities) in the gradient, and for such GFs we also intend to reveal sufficient conditions for existence, uniqueness, and flow properties in dependence on the given potential. We intend to accomplish the above goals by extending techniques and concepts from differential geometry to describe and study attracting and critical regions, by using tools from convex analysis such as subdifferentials and other generalized derivatives, as well as by employing concepts from real algebraic geometry to describe domains of attraction. In particular, we intend to generalize the center-stable manifold theorem from the theory of dynamical systems to the considered non-smooth setting. Besides finite-dimensional GFs, we also study GFs in their associated infinite-dimensional limits. The considered irregular GFs and related dynamical systems naturally arise, for example, in the context of molecular dynamics (to model the configuration of atoms along temporal evolution) and machine learning (to model the training process of artificial neural networks).
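For orientation, the gradient flows (GFs) in question are of the form
\[
  \dot{x}(t) = -(\nabla f)\bigl(x(t)\bigr), \qquad t \ge 0,
\]
for a given potential $f$, and their explicit Euler discretization with step size $\gamma > 0$ recovers the gradient descent update $x_{n+1} = x_n - \gamma\,(\nabla f)(x_n)$, which in the machine learning context describes idealized training dynamics of artificial neural networks (a standard relation, stated here only for orientation); the project is concerned in particular with the irregular case in which $\nabla f$ need not be continuous.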
EXC 2044 - C1: Evolution and asymptotics
In this unit, we will use generalisations of optimal transport metrics to develop gradient flow descriptions of (cross-)diffusion-reaction systems, rigorously analyse their pattern forming properties, and develop corresponding efficient numerical schemes. Related transport-type and hyperbolic systems will be compared with respect to their pattern-forming behaviour, especially when mass is conserved. Bifurcations and the effects of noise perturbations will be explored.

Moreover, we aim to understand defect structures, their stability and their interactions. Examples are the evolution of fractures in brittle materials and of vortices in fluids. Our analysis will explore the underlying geometry of defect dynamics such as gradient descents or Hamiltonian structures. Also, we will further develop continuum mechanics and asymptotic descriptions for multiple bodies which deform, divide, move, and dynamically attach to each other in order to better describe the bio-mechanics of growing and dividing soft tissues.

Finally, we are interested in the asymptotic analysis of various random structures as the size or the dimension of the structure goes to infinity. More specifically, we shall consider random polytopes and random trees. For random polytopes we would like to compute the expected number of faces in all dimensions, the expected (intrinsic) volume, and absorption probabilities, as well as higher moments and limit distributions for these quantities.
EXC 2044 - C3: Interacting particle systems and phase transitions
The question of whether a system undergoes phase transitions and what the critical parameters are is intrinsically related to the structure and geometry of the underlying space. We will study such phase transitions for variational models, for processes in random environments, for interacting particle systems, and for complex networks. Of special interest are the combined effects of fine-scale randomly distributed heterogeneities and small gradient perturbations.

We aim to connect different existing variational formulations for transportation networks, image segmentation, and fracture mechanics and explore the resulting implications on modelling, analysis, and numerical simulation of such processes. We will study various aspects of complex networks, i.e. sequences of random graphs (G_n)_{n∈ℕ}, asking for limit theorems as n tends to infinity. A main task will be to broaden the class of networks that can be investigated, in particular models which include geometry and evolve in time. We will study Ising models on random networks or with random interactions, i.e. spin glasses. Fluctuations of order parameters and free energies will be analysed, especially at the critical values where the system undergoes a phase transition. We will also investigate whether a new class of interacting quantum fields connected with random matrices and non-commutative geometry satisfies the Osterwalder-Schrader axioms. Further, we will study condensation phenomena, where complex network models combine the preferential attachment paradigm with the concept of fitness. In the condensation regime, a certain fraction of the total mass dynamically accumulates at one point, the condensate. The aim is a qualitative and quantitative analysis of the condensation. We will also explore connections to structured population models. Further, we will study interacting particle systems on graphs that describe social interaction or information exchange. Examples are the averaging process or the Deffuant model.

We will also analyse asymmetric exclusion processes (ASEP) on arbitrary network structures. An interesting aspect will be how these processes are influenced by different distribution mechanisms of the particles at network nodes. If the graph is given by a lattice, we aim to derive hydrodynamic limits for the ASEP with jumps of different ranges for multiple species, and for stochastic interacting many-particle models of reinforced random walks. Formally, local cross-diffusion systems are obtained as limits of the classical multi-species ASEP and of the many-particle random walk. We will compare the newly resulting limiting equations and are interested in fluctuations, pattern formation, and the long-time behaviour of these models on the microscopic and the macroscopic scale. Further, we will analyse properties of the continuous directed polymer in a random environment.
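As a minimal concrete instance of the exclusion processes mentioned above, the following numpy sketch simulates a totally asymmetric simple exclusion process on a one-dimensional ring with random sequential updates (an illustrative toy case, not one of the network models of the project); the measured particle current can be compared with the stationary value rho * (1 - rho).

```python
import numpy as np

rng = np.random.default_rng(0)

def tasep_ring_current(n_sites=200, density=0.3, n_sweeps=5000, measure_sweeps=2000):
    """Totally asymmetric simple exclusion process on a ring with random
    sequential updates: a uniformly chosen site is updated, and if it carries
    a particle whose right neighbour is empty, the particle jumps one site to
    the right (exclusion rule). Returns the particle current measured over the
    last `measure_sweeps` sweeps (one sweep = n_sites update attempts)."""
    occupied = np.zeros(n_sites, dtype=bool)
    occupied[: int(density * n_sites)] = True
    rng.shuffle(occupied)
    jumps = 0
    total_updates = n_sweeps * n_sites
    measure_start = (n_sweeps - measure_sweeps) * n_sites
    for step in range(total_updates):
        site = rng.integers(n_sites)
        target = (site + 1) % n_sites
        if occupied[site] and not occupied[target]:
            occupied[site], occupied[target] = False, True
            if step >= measure_start:
                jumps += 1
    return jumps / (measure_sweeps * n_sites)

rho = 0.3
print("measured current:", tasep_ring_current(density=rho))
print("stationary prediction rho * (1 - rho):", rho * (1 - rho))
```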
E-Mail: ajentzen@uni-muenster.de
Phone: +49 251 83-33793
Fax: +49 251 83-32729
Room: 120.005
Secretary: Claudia Giesbert
Phone: +49 251 83-33792
Fax: +49 251 83-32729
Room: 120.002
Address:
Prof. Dr. Arnulf Jentzen
Angewandte Mathematik Münster: Institut für Analysis und Numerik
Fachbereich Mathematik und Informatik der Universität Münster
Orléans-Ring 10
48149 Münster