This algorithm computes the cyclic or randomized proximal point algorithm on a manifold for given input data $x \in \mathcal M^n$, where $n$ denotes the data domain (usually a signal or pixel grid, though not necessarily of array form), a set of proximal maps, and a stopping criterion.
Given a function $F\colon \mathcal M^n \to \mathbb R$, this algorithm aims to find a minimizer

$$x^{*} = \operatorname*{arg\,min}_{x\in\mathcal M^n} F(x),$$

so in a sense it is similar to a gradient descent method, but it is also applicable if $F$ is not differentiable, and there is no need to determine a step size; the parameters $\lambda_k$ described in the following merely determine the speed of convergence.
If we know for $F$ that there exists a decomposition into atoms

$$F(x) = \sum_{i=1}^{c} f_i(x),$$

where each of the $f_i$ is assumed to have a proximal map that is either known in closed form or otherwise easily (approximately) computable, i.e.

$$\operatorname{prox}_{\lambda f_i}(x) = \operatorname*{arg\,min}_{y\in\mathcal M^n} \Bigl(\tfrac{1}{2}d^2(x,y) + \lambda f_i(y)\Bigr),$$

then the algorithm performs the iterations

$$x_{k+\frac{i}{c}} = \operatorname{prox}_{\lambda_k f_i}\bigl(x_{k+\frac{i-1}{c}}\bigr),\qquad i=1,\ldots,c,$$

where we denote the inner (one cycle) iterations by a fractional index.
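As an illustration of the cycle (independent of the toolbox), here is a minimal sketch on the real line: computing the median of three numbers by cycling through the closed-form proximal maps of the atoms $f_i(x) = \lvert x - d_i\rvert$. All names in this snippet are hypothetical and not part of the toolbox.

```matlab
% Minimal CPP sketch on M = R: minimize F(x) = sum_i |x - d_i|, whose
% minimizer is the median of the d_i. The proximal map of |x - d_i|
% moves x towards d_i by at most lambda.
d = [1, 2, 9];                        % data points, median is 2
proxes = arrayfun(@(di) ...           % one proximal map per atom f_i
    @(x,lambda) x + sign(di - x)*min(lambda, abs(di - x)), ...
    d, 'UniformOutput', false);
x = 0;                                % initial value
for k = 1:1000                        % outer cycles
    lambda = 1/k;                     % square-summable, not summable
    for i = 1:numel(proxes)           % inner (fractional) iterations
        x = proxes{i}(x, lambda);
    end
end
% x is now close to median(d) = 2
```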
`M` denotes a manifold class and `x` denotes the initial data; both remaining parameters are functionals. The simpler one is the stopping criterion

`stoppingCriterion = @(x,xold,lambda,iter)`

with the four parameters $x$, $x_{\mathrm{old}}$, $\lambda$, and the current iteration `iter`.
The iteration runs while the functional returns false, i.e. while the stoppingCriterion is not fulfilled, and stops after a cycle once it returns true.
The easiest is of course one depending only on the iteration number: `@(x,xold,lambda,iter) iter > 1999` will stop after 2000 iterations.
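A criterion can also combine an iteration cap with a tolerance on the change between iterates. A sketch, assuming the manifold class provides a distance function `M.dist(x,y)` (adapt the name if your class differs):

```matlab
% Stop after at most 2000 iterations, or earlier once the iterates
% stagnate, i.e. the maximal pointwise distance to the last iterate
% falls below a tolerance. M.dist is an assumption about the class.
stoppingCriterion = @(x,xold,lambda,iter) ...
    iter > 1999 || max(M.dist(x, xold)) < 1e-9;
```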
Each entry of `proximalMaps` is an anonymous function

`fi = @(x,lambda) ...`

evaluating the corresponding proximal map of $f_i$. For a list, see below.
- `lambdaIterate`: a functional `@(iter)` returning the value of $\lambda_k$ for iteration `iter`
- `lambda`: the easiest way to set the lambda iterate; a value $\lambda$ defines the iterate `@(iter) lambda/iter`
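Both parametrizations yield the same sequence $\lambda_k = \lambda/k$; a sketch, assuming the call signature documented in the function header below:

```matlab
% Two equivalent ways to obtain lambda_k = 0.5/k:
y = cyclicProximalPoint(M, x, proximalMaps, stoppingCriterion, ...
    'lambda', 0.5);                          % shortcut
y = cyclicProximalPoint(M, x, proximalMaps, stoppingCriterion, ...
    'lambdaIterate', @(iter) 0.5/iter);      % explicit functional
```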
- `Order`: if not specified, the above cyclic order is used. Other possibilities are `'permute'`, which randomly permutes the order of the proximal maps in each cycle, or `'random'`, which chooses one proximal map at random per iteration to be evaluated.
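Selecting one of these orders is a matter of passing the corresponding string; a sketch, assuming the call signature documented in the function header below:

```matlab
% Evaluate the proximal maps in a freshly permuted order in each cycle.
y = cyclicProximalPoint(M, x, proximalMaps, stoppingCriterion, ...
    'Order', 'permute');
```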
- `Debug`: if set to an anonymous functional `@(x,xold,iter)`, this is called in each cycle, for example to produce text output or save iteration data to a file.
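A debug functional could, for instance, print the progress per cycle. The use of `M.dist(x,y)` below is an assumption about the manifold class:

```matlab
% Print the iteration number and the maximal change of the iterate.
debug = @(x,xold,iter) fprintf('Iteration %4d, max change %e\n', ...
    iter, max(M.dist(x, xold)));
y = cyclicProximalPoint(M, x, proximalMaps, stoppingCriterion, ...
    'Debug', debug);
```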
- `Record`: if set to a functional `@(x,iter)` returning a column vector, and `cyclicProximalPoint` is called with a second return value, these column vectors are collected in a matrix returned as that second return value.
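As a sketch, one could record the functional value once per cycle; here `F` is assumed to be defined as the sum of the atoms $f_i$ from above, and the call signature follows the function header below:

```matlab
% Record F(x) in each cycle; the recorded columns (here 1x1) are
% returned stacked in recData via the second return value.
record = @(x,iter) F(x);
[y, recData] = cyclicProximalPoint(M, x, proximalMaps, stoppingCriterion, ...
    'Record', record);
% recData then contains one recorded column per cycle
```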
```matlab
% CyclicProximalPoint(M,f,proximalMaps,stoppingCriterion)
%   Compute the cyclic proximal point algorithm starting with data f,
%   cycling through the proximal maps.
%
% INPUT
%   M                 : a manifold
%   x                 : given (initial) data
%   proximalMaps      : cell array of functionals computing the proximal
%                       maps, i.e. each can be called as prox = @(x,lambda)
%   stoppingCriterion : a functional @(x,xold,lambda,iter) as a stopping
%                       criterion based on x, the last iterate xold, lambda,
%                       and iter, that indicates when to stop, i.e., is 1
%                       (true) to stop.
%
% OPTIONAL
%   'lambdaIterate' : (@(iter)) a functional @(iter) computing the
%                     iterate's value of lambda
%   'lambda'        : if no iterate (see above) is given, this value is
%                     used to set the iterate to @(iter) lambda/iter
%   'Order'         : ('') the standard order is ascending in proximalMaps,
%                     'random' takes a random proximal map each iteration
%                       (note that then one iteration is only _one_
%                       proximal map evaluated),
%                     'permute' keeps a cyclic manner but permutes the
%                       order for each cycle anew.
%   'Debug'         : () a function @(x,xold,iter) producing debug output
%   'Record'        : (@(x,iter)) a function returning a column vector; if
%                     there is a second return value and this function is
%                     given, data is recorded in an array and returned
%
% OUTPUT
%   y       : result of the CPP algorithm
%   recData : recorded data array of the record-function
% ---
% Manifold-Valued Image Restoration Toolbox 1.0
% R. Bergmann ~ 2018-01-22 | 2018-03-04
```
- Tutorial on how to compute a median on a manifold
- The CPP for additive first and second order TV
- The CPP for Huber TV
- an easy stopping criterion creator
- Bertsekas, D. P. (2010). Incremental Gradient, Subgradient, and Proximal Methods for Convex Optimization: A Survey. Laboratory for Information and Decision Systems, MIT, Cambridge, MA.
- Bačák, M. (2014). Computing Medians and Means in Hadamard Spaces. SIAM Journal on Optimization, 24(3), 1542–1566.