Research Interests

Complex networks and non-standard growth models.
Stochastic algorithms, numerical analysis and complexity.
Stochastic analysis.

Recent Publications of Prof. Dr. Steffen Dereich

Steffen Dereich, Arnulf Jentzen, and Adrian Riekert. Averaged Adam accelerates stochastic optimization in the training of deep neural network approximations for partial differential equation and optimal control problems. arXiv e-prints, January 2025.

Steffen Dereich and Arnulf Jentzen. Convergence rates for the Adam optimizer. arXiv e-prints, July 2024. arXiv:2407.21078.

Steffen Dereich, Robin Graeber, and Arnulf Jentzen. Non-convergence of Adam and other adaptive stochastic gradient descent optimization methods for non-vanishing learning rates. arXiv e-prints, July 2024. arXiv:2407.08100.

Steffen Dereich and Sebastian Kassing. Convergence of stochastic gradient descent schemes for Łojasiewicz-landscapes. Journal of Machine Learning, 3(3):245–281, June 2024. doi:10.4208/jml.240109.

Steffen Dereich, Arnulf Jentzen, and Adrian Riekert. Learning rate adaptive stochastic gradient descent optimization methods: numerical simulations for deep learning methods for partial differential equations and convergence analyses. arXiv e-prints, June 2024. arXiv:2406.14340.

Steffen Dereich and Sebastian Kassing. On the existence of optimal shallow feedforward networks with ReLU activation. Journal of Machine Learning, 3(1):1–22, January 2024. doi:10.4208/jml.230903.

Steffen Dereich and Sebastian Kassing. On the existence of optimal shallow feedforward networks with ReLU activation. arXiv e-prints, March 2023. arXiv:2303.03950.

Steffen Dereich, Arnulf Jentzen, and Sebastian Kassing. On the existence of minimizers in shallow residual ReLU neural network optimization landscapes. arXiv e-prints, February 2023. arXiv:2302.14690.

Steffen Dereich and Sebastian Kassing. Central limit theorems for stochastic gradient descent with averaging for stable manifolds. Electronic Journal of Probability, 28:1–48, January 2023. doi:10.1214/23-EJP947.

Steffen Dereich and Sebastian Kassing. Cooling down stochastic differential equations: Almost sure convergence. Stochastic Process. Appl., 152:289–311, October 2022. doi:10.1016/j.spa.2022.06.020.

Steffen Dereich and Sebastian Kassing. On minimal representations of shallow ReLU networks. Neural Networks, 148:121–128, April 2022. doi:10.1016/j.neunet.2022.01.006.

Steffen Dereich and Martin Maiwald. Quasi-processes for branching Markov chains. arXiv e-prints, July 2021. arXiv:2107.06654.

Steffen Dereich and Sebastian Kassing. Convergence of stochastic gradient descent schemes for Łojasiewicz-landscapes. arXiv e-prints, February 2021. arXiv:2102.09385.

Steffen Dereich. General multilevel adaptations for stochastic approximation algorithms II: CLTs. Stochastic Process. Appl., 132:226–260, February 2021. doi:10.1016/j.spa.2020.11.001.

Steffen Dereich and Sebastian Kassing. Central limit theorems for stochastic gradient descent with averaging for stable manifolds. arXiv e-prints, December 2019. arXiv:1912.09187.

Steffen Dereich. The rank-one and the preferential attachment paradigm. In Network Science, pages 43–58, November 2019. doi:10.1007/978-3-030-26814-5_4.

Steffen Dereich and Thomas Müller-Gronbach. General multilevel adaptations for stochastic approximation algorithms of Robbins–Monro and Polyak–Ruppert type. Numer. Math., 142(2):279–328, June 2019. doi:10.1007/s00211-019-01024-y.