This function parallel transports a vector along a geodesic (whose uniqueness is determined by the logarithmic map implementation).
This formula is taken from [1,2] and can be interpreted as follows: all components of the vector that share no part with the geodesic's direction are left unchanged; for the remaining part (the second term) a correction has to be performed.
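As a minimal sketch of this idea, the following Python code (an assumption for illustration, not the toolbox implementation, which is in MATLAB) realizes the formula on the unit sphere S², where the logarithmic map has a closed form; the function names `log_map` and `parallel_transport` are hypothetical:

```python
import numpy as np

def log_map(x, y):
    # Riemannian log on the unit sphere: the tangent vector at x pointing
    # towards y, with length equal to the geodesic distance d(x, y).
    d = np.arccos(np.clip(np.dot(x, y), -1.0, 1.0))
    v = y - np.dot(x, y) * x
    n = np.linalg.norm(v)
    return np.zeros_like(x) if n < 1e-12 else d * v / n

def parallel_transport(x, y, xi):
    # Transport xi from T_x S^2 to T_y S^2 along the connecting geodesic:
    # the component of xi orthogonal to the direction log_x(y) is left
    # unchanged; the component along the geodesic is corrected using
    # log_x(y) and log_y(x).
    u = log_map(x, y)            # geodesic direction at x
    d2 = np.dot(u, u)            # squared geodesic distance
    if d2 < 1e-24:               # x == y: nothing to transport
        return xi.copy()
    return xi - (np.dot(u, xi) / d2) * (u + log_map(y, x))
```

For example, transporting the tangent vector e₃ from x = e₁ to y = e₂ leaves it unchanged (it is orthogonal to the geodesic's direction), while the direction vector e₂ itself is mapped to −e₁; in both cases the norm of the vector is preserved.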
Matlab Documentation
% eta = parallelTransport(x,y,xi) transports xi from TxM parallel to TyM
%
% INPUT
% x,y : two (sets of) points on the manifold
% xi : a (set of) vectors from TxM
%
% OUTPUT
% eta : the parallel transported vectors in TyM
% 
% Manifold-valued Image Restoration Toolbox 1.2
% R. Bergmann  2018-03-01
References

[1] Bergmann, R., Fitschen, J. H., Persch, J. and Steidl, G. (2017). Priors with coupled first and second order differences for manifold-valued image processing. arXiv:1709.01343.

[2] Hosseini, S. and Uschmajew, A. (2017). A Riemannian gradient sampling algorithm for nonsmooth optimization on manifolds. SIAM Journal on Optimization, 27(1), 173–189. doi:10.1137/16M1069298.