Determines a step size \(s\) such that
\[ F(x) - F(\exp_x(s\,d)) \geq -c\,s\,\langle \nabla F(x), d\rangle_x, \]
where \(d\) is a descent direction, e.g. \(d = -\nabla F(x)\), the search is performed by \(s \leftarrow \rho^k s_0\), \(k = 0, 1, 2, \ldots\), and \(c\) is a constant. For further details see Definition 4.2.2 in [1].
Optional Parameters
Gradient
(\(-\nabla F(x)\)) if this value is not given, we assume that the descent direction \(d\) is the negative gradient. Otherwise you have to specify the gradient here.
InitialStepSize
(1) initial step size \(s_0\) as starting point for the line search
rho
(0.5) decrease factor for the line search, i.e. we update \(s \leftarrow \rho s\)
c
(0.0001) the constant \(c\) in front of the inner product.
Matlab Documentation
% stepSizeArmijo(M,F,x,descentDir) compute the step size by Armijo's rule
%
% INPUT
%   M          : a manifold
%   F          : a functional @(x)
%   x          : the current point
%   descentDir : a descent direction
%
% OPTIONAL
%   Gradient        : (descentDir) gradient direction (if descentDir is
%                     not the gradient)
%   InitialStepSize : (1) initial step size as starting point for the search
%   rho             : (0.5) decrease factor for the line search, s = rho*s
%   c               : (0.0001) the factor in front of the inner product
%
% MVIRT  R. Bergmann  2018-03-15
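The backtracking loop behind this rule can be sketched in a Euclidean setting, where the exponential map \(\exp_x\) reduces to vector addition. This is an illustrative Python sketch, not part of MVIRT; the function name and signature are assumptions chosen to mirror the parameters above:

```python
import numpy as np

def armijo_step_size(F, gradF, x, descent_dir, s0=1.0, rho=0.5, c=1e-4, max_iter=50):
    """Backtracking line search by Armijo's rule (Euclidean sketch).

    Shrinks s = rho^k * s0, k = 0, 1, 2, ..., until
        F(x) - F(x + s*d) >= -c * s * <gradF(x), d>
    holds, mirroring the manifold condition with exp_x replaced by addition.
    """
    d = np.asarray(descent_dir, dtype=float)
    g = np.asarray(gradF(x), dtype=float)
    s = s0
    for _ in range(max_iter):
        # Armijo condition: sufficient decrease relative to the inner product
        if F(x) - F(x + s * d) >= -c * s * np.dot(g, d):
            return s
        s *= rho  # not enough decrease: shrink the step
    return s

# Example: F(x) = ||x||^2 with steepest descent direction d = -gradF(x)
F = lambda x: np.dot(x, x)
gradF = lambda x: 2.0 * x
x = np.array([2.0, 0.0])
s = armijo_step_size(F, gradF, x, -gradF(x))  # s0 = 1 overshoots, so s = 0.5
```

On a manifold, `x + s * d` would be replaced by `M.exp(x, s * descentDir)` and the Euclidean dot product by the Riemannian metric at `x`.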
References

Absil, P.-A., Mahony, R., and Sepulchre, R. (2008). Optimization Algorithms on Matrix Manifolds. Princeton University Press, Princeton and Oxford.
Many problems in the sciences and engineering can be rephrased as optimization problems on matrix search spaces endowed with a so-called manifold structure. This book shows how to exploit the special structure of such problems to develop efficient numerical algorithms. It places careful emphasis on both the numerical formulation of the algorithm and its differential geometric abstraction, illustrating how good algorithms draw equally from the insights of differential geometry, optimization, and numerical analysis. Two more theoretical chapters provide readers with the background in differential geometry necessary to algorithmic development. In the other chapters, several well-known optimization methods such as steepest descent and conjugate gradients are generalized to abstract manifolds. The book provides a generic development of each of these methods, building upon the material of the geometric chapters. It then guides readers through the calculations that turn these geometrically formulated methods into concrete numerical algorithms. The state-of-the-art algorithms given as examples are competitive with the best existing algorithms for a selection of eigenspace problems in numerical linear algebra.

@book{AMS08,
  title     = {Optimization Algorithms on Matrix Manifolds},
  author    = {Absil, P.-A. and Mahony, R. and Sepulchre, R.},
  publisher = {Princeton University Press},
  address   = {Princeton and Oxford},
  year      = {2008},
  isbn      = {9780691132983}
}