Dr. Somayeh Hosseini (Uni Bonn): A Riemannian gradient sampling algorithm for nonsmooth optimization on manifolds.
Wednesday, 19.10.2016, 14:00, in room M6
In this talk, an optimization method for nonsmooth, locally Lipschitz functions on Riemannian manifolds will be presented. The method approximates the subdifferential of the cost function at every iteration by the convex hull of gradients at randomly generated nearby points, transported from their tangent spaces to the tangent space of the current iterate; it can hence be seen as a generalization of the well-known gradient sampling algorithm to a Riemannian setting.
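In symbols (notation mine, not from the announcement), the sampled model of the subdifferential at the current iterate can be written as below; in gradient sampling methods the minimum-norm element of this set typically provides the negative search direction.

% Notation is illustrative: y_1, ..., y_m are points sampled near the iterate x_k,
% grad f(y_i) is the Riemannian gradient at y_i (defined almost everywhere),
% and T_{y_i -> x_k} denotes the employed vector transport into T_{x_k} M.
\[
  \partial f(x_k) \;\approx\;
  \operatorname{conv}\bigl\{\, \mathcal{T}_{y_i \to x_k}\,\operatorname{grad} f(y_i)
    \;:\; i = 1,\dots,m \,\bigr\}
  \;\subset\; T_{x_k} M .
\]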
A convergence result will be obtained under the assumption that the cost function is continuously differentiable on an open set of full measure, and that the employed vector transport and retraction satisfy certain conditions, which hold for instance for the exponential map and parallel transport. Under these assumptions, with probability one the algorithm is feasible, and either the sequence of function values associated with the constructed iterates is unbounded from below, or each cluster point of the iterates is a Clarke stationary point.
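As a rough illustration of the per-iteration mechanics, here is a minimal sketch of my own, not the speaker's implementation: the manifold is the unit sphere with the projection retraction and projection-based vector transport, and the cost f(x) = max_i |x_i|, the sample radius, and the line search are all illustrative choices.

import numpy as np

def f(x):
    # nonsmooth, locally Lipschitz cost on the sphere (illustrative choice)
    return np.max(np.abs(x))

def egrad(x):
    # a gradient of f in the ambient space; f is differentiable almost
    # everywhere, which is the regime gradient sampling relies on
    g = np.zeros_like(x)
    i = np.argmax(np.abs(x))
    g[i] = np.sign(x[i])
    return g

def proj_tangent(x, v):
    # orthogonal projection onto the tangent space T_x of the unit sphere
    return v - np.dot(x, v) * x

def retract(x, v):
    # projection retraction: step in the ambient space, then renormalize
    y = x + v
    return y / np.linalg.norm(y)

def transport(x, y, v):
    # vector transport of v from T_x to T_y by projection onto T_y
    return proj_tangent(y, v)

def min_norm_convex(G, iters=200):
    # Frank-Wolfe iteration for the minimum-norm element of conv{columns of G}
    m = G.shape[1]
    lam = np.full(m, 1.0 / m)
    for k in range(iters):
        grad = 2.0 * G.T @ (G @ lam)   # gradient of ||G lam||^2
        j = np.argmin(grad)            # best simplex vertex
        step = 2.0 / (k + 2.0)
        lam = (1.0 - step) * lam
        lam[j] += step
    return G @ lam

def gradient_sampling_step(x, m=10, eps=1e-2, step0=1.0):
    # sample m nearby points by retracting small random tangent vectors,
    # transport their Riemannian gradients back to T_x, and take the
    # minimum-norm element of their convex hull as (minus) the direction
    n = x.size
    grads = [proj_tangent(x, egrad(x))]
    for _ in range(m):
        u = proj_tangent(x, np.random.randn(n))
        u *= eps * np.random.rand() / (np.linalg.norm(u) + 1e-16)
        y = retract(x, u)
        grads.append(transport(y, x, proj_tangent(y, egrad(y))))
    g = min_norm_convex(np.column_stack(grads))
    # simple Armijo backtracking along the retraction
    t = step0
    while f(retract(x, -t * g)) > f(x) - 1e-4 * t * np.dot(g, g) and t > 1e-12:
        t *= 0.5
    return retract(x, -t * g)

if __name__ == "__main__":
    np.random.seed(0)
    x = np.random.randn(5)
    x /= np.linalg.norm(x)
    for _ in range(50):
        x = gradient_sampling_step(x)
    print(f(x))  # should approach 1/sqrt(5), the minimum of max|x_i| on S^4

The sphere, the cost, and the Frank-Wolfe solver for the minimum-norm subproblem are stand-ins chosen to keep the sketch self-contained; the talk's algorithm is stated for general Riemannian manifolds with any retraction and vector transport satisfying its assumptions.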