The manifold $\mathcal{P}(n)$ consists of all symmetric matrices $A \in \mathbb{R}^{n\times n}$ such that for all vectors $x \in \mathbb{R}^n \setminus \{0\}$ the inequality $x^{\mathrm{T}} A x > 0$ holds. ^{1}
In this class we use the affine invariant metric as described, for example, in [1]; an alternative is the Log-Euclidean metric [2].
For visualization we can use the following approach. Since $A$ is symmetric, all eigenvalues are real valued; since $A$ is positive definite, all eigenvalues are strictly positive. This yields a visualization for the cases $n=2$ and $n=3$:
For $n=2$ we take the eigenvectors $v_1, v_2$ of $A$, scaled to unit length, and denote the eigenvalues by $\lambda_1, \lambda_2$. Assume further that the eigenvalues are sorted, i.e. $\lambda_1 \geq \lambda_2 > 0$. We can now interpret the eigenvectors as the axes of an ellipse: $\lambda_1 v_1$ as major axis and $\lambda_2 v_2$ as minor axis. An example is the signal from [3].
Note that the midpoints of the ellipses are placed on a regular grid. This grid has no fixed size and should be set to the same value for images showing comparable data.
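The eigendecomposition behind this ellipse construction can be sketched as follows. The toolbox itself is MATLAB; this is a Python/NumPy illustration, and `spd_ellipse_axes` is a hypothetical helper name, not a toolbox function.

```python
import numpy as np

def spd_ellipse_axes(A):
    # Visualize a 2x2 SPD matrix as an ellipse: unit-length eigenvectors
    # scaled by their eigenvalues give the major and minor axis.
    lam, V = np.linalg.eigh(A)        # real eigenvalues, orthonormal eigenvectors
    order = np.argsort(lam)[::-1]     # sort so that lam[0] >= lam[1] > 0
    lam, V = lam[order], V[:, order]
    return lam[0] * V[:, 0], lam[1] * V[:, 1]   # major axis, minor axis

# A diagonal SPD matrix yields axis-aligned ellipse axes.
major, minor = spd_ellipse_axes(np.diag([4.0, 1.0]))
```

For a diagonal matrix the axes align with the coordinate axes (up to the sign of the eigenvectors), with lengths given by the diagonal entries.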
For $n=3$ one can take the analogous approach with all three eigenvalues and their eigenvectors to visualize an ellipsoid. Furthermore this can be used for image, or even 3D/volumetric, data. The following is an artificial data set from [4].
Here, the grid is also arbitrarily chosen; see the last note. This construction does not yet yield a color. We employ the geometric anisotropy index as presented in [5].
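A sketch of such an anisotropy index, under the assumption that it is the Frobenius distance of $\operatorname{Log}(A)$ to its isotropic part (our reading of [5]; the toolbox is MATLAB, this is a Python/NumPy illustration with a hypothetical function name):

```python
import numpy as np

def geodesic_anisotropy(A):
    # Assumed form: GA(A) = || Log(A) - (tr(Log(A))/n) * I ||_F,
    # i.e. how far A is from being a multiple of the identity.
    lam, V = np.linalg.eigh(A)            # spectral decomposition of SPD A
    L = (V * np.log(lam)) @ V.T           # matrix logarithm via the eigenvalues
    n = A.shape[0]
    return np.linalg.norm(L - (np.trace(L) / n) * np.eye(n), 'fro')
```

Isotropic matrices (multiples of the identity) get index zero, so they can be mapped to a neutral color, while strongly anisotropic matrices receive larger values.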
Footnotes

While this page usually sets vectors in small bold letters and matrices in capital bold letters, we refrain from this notation here to emphasize that the matrices are points on the manifold. This way we keep consistency with pages about general manifold theory. ↩
Functions
 The orthonormal basis in a tangent space

% [Xi,k] = TpMONB(x,y) Compute an ONB in TpM and curvature
This function computes an ONB and corresponding curvature coefficients, belonging to the parallel transported orthonormal frame, and diagonalizes the curvature tensor along the geodesic connecting x and y. The set of vectors in a tangent space is added as the last dimension of the array and consists of $n(n+1)/2$ vectors.
 Add (Rician or Gaussian white) noise

% addNoise(x,sigma) add (Rician or Gaussian) noise to data x
Adds Rician (standard) or Gaussian (set 'Distribution' to 'Gaussian') noise with mean zero (using the tangent space) to the data x.
 The distance on symmetric positive definite matrices

% d = dist(x,y) compute the distance between x,y from P(n).
The distance on the symmetric positive definite matrices is given by
$$d_{\mathcal{P}(n)}(x,y) = \lVert \operatorname{Log}(x^{-1/2} y x^{-1/2}) \rVert_{\mathrm{F}},$$
where $\operatorname{Log}$ denotes the matrix logarithm and $\lVert \cdot \rVert_{\mathrm{F}}$ is the Frobenius norm of the matrix.
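The toolbox function is MATLAB; the formula can be sketched in Python/NumPy as follows, using an eigendecomposition to evaluate matrix functions of symmetric matrices (`dist_spd` and `_sym_fun` are hypothetical names, not toolbox functions):

```python
import numpy as np

def _sym_fun(A, f):
    # Apply f to the eigenvalues of a symmetric matrix (spectral calculus).
    lam, V = np.linalg.eigh(A)
    return (V * f(lam)) @ V.T

def dist_spd(x, y):
    # d(x, y) = || Log(x^{-1/2} y x^{-1/2}) ||_F
    xis = _sym_fun(x, lambda l: 1.0 / np.sqrt(l))   # x^{-1/2}
    return np.linalg.norm(_sym_fun(xis @ y @ xis, np.log), 'fro')
```

Note the symmetry $d(x,y) = d(y,x)$, which is a quick sanity check for an implementation.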
 The inner product on symmetric positive definite matrices

% dot(x,xi,nu) inner product of two tangent vectors in TxP(m)
The affine invariant metric on the symmetric positive definite matrices is defined for $x \in \mathcal{P}(m)$ and $\xi, \nu \in T_x\mathcal{P}(m)$ as
$$\langle \xi, \nu \rangle_x = \operatorname{tr}(x^{-1} \xi x^{-1} \nu),$$
where $\operatorname{tr}$ denotes the trace of a matrix.
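In Python/NumPy this inner product is a one-liner (a sketch with a hypothetical name, not the MATLAB toolbox function):

```python
import numpy as np

def dot_spd(x, xi, nu):
    # <xi, nu>_x = tr(x^{-1} xi x^{-1} nu) for symmetric xi, nu in T_x P(m)
    xinv = np.linalg.inv(x)
    return np.trace(xinv @ xi @ xinv @ nu)
```

At the identity $x = I$ this reduces to the Euclidean (Frobenius) inner product $\operatorname{tr}(\xi\nu)$.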
 The exponential map on symmetric positive definite matrices

% y = exp(x,xi) exponential map at x from P(n) towards xi in TxP(n)
The exponential map on the symmetric positive definite matrices is given for $x \in \mathcal{P}(n)$, $\xi \in T_x\mathcal{P}(n)$ by the formula
$$\exp_x \xi = x^{1/2} \operatorname{Exp}(x^{-1/2} \xi x^{-1/2}) x^{1/2},$$
where $\operatorname{Exp}$ denotes the matrix exponential.
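A Python/NumPy sketch of this map, again evaluating matrix functions through the eigendecomposition (hypothetical names, not toolbox functions):

```python
import numpy as np

def _sym_fun(A, f):
    # Apply f to the eigenvalues of a symmetric matrix (spectral calculus).
    lam, V = np.linalg.eigh(A)
    return (V * f(lam)) @ V.T

def exp_spd(x, xi):
    # exp_x(xi) = x^{1/2} Exp(x^{-1/2} xi x^{-1/2}) x^{1/2}
    xs  = _sym_fun(x, np.sqrt)                      # x^{1/2}
    xis = _sym_fun(x, lambda l: 1.0 / np.sqrt(l))   # x^{-1/2}
    return xs @ _sym_fun(xis @ xi @ xis, np.exp) @ xs
```

At $x = I$ this reduces to the plain matrix exponential of $\xi$.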
 The logarithmic map on symmetric positive definite matrices

% xi = log(x,y) logarithmic map at the point(s) x of points(s) y
The logarithmic map on symmetric positive definite matrices is given by
$$\log_x y = x^{1/2} \operatorname{Log}(x^{-1/2} y x^{-1/2}) x^{1/2},$$
where $\operatorname{Log}$ denotes the matrix logarithm.
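The same spectral-calculus approach gives a Python/NumPy sketch of the logarithmic map, the inverse of the exponential map (hypothetical names, not toolbox functions):

```python
import numpy as np

def _sym_fun(A, f):
    # Apply f to the eigenvalues of a symmetric matrix (spectral calculus).
    lam, V = np.linalg.eigh(A)
    return (V * f(lam)) @ V.T

def log_spd(x, y):
    # log_x(y) = x^{1/2} Log(x^{-1/2} y x^{-1/2}) x^{1/2}
    xs  = _sym_fun(x, np.sqrt)                      # x^{1/2}
    xis = _sym_fun(x, lambda l: 1.0 / np.sqrt(l))   # x^{-1/2}
    return xs @ _sym_fun(xis @ y @ xis, np.log) @ xs
```

By construction $\exp_x(\log_x y) = y$, which is a useful round-trip test for any implementation of the pair.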
 The parallel transport on symmetric positive definite matrices

% eta = parallelTransport(x,y,xi) parallel transport xi along g(.,x,y)
This function parallel transports a tangent vector $\xi \in T_x\mathcal{P}(n)$ along the unique geodesic $g(\cdot;x,y)$ from $x$ to $y$. The transport is given by
$$\eta = E\,\xi\,E^{\mathrm{T}}, \qquad E = (y x^{-1})^{1/2}.$$
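A Python/NumPy sketch of this transport, assuming the standard closed form $\eta = E\xi E^{\mathrm{T}}$ with $E = (yx^{-1})^{1/2}$ for the affine invariant metric, computed here via the symmetric factorization $E = x^{1/2}(x^{-1/2} y x^{-1/2})^{1/2} x^{-1/2}$ (hypothetical names, not toolbox functions):

```python
import numpy as np

def _sym_fun(A, f):
    # Apply f to the eigenvalues of a symmetric matrix (spectral calculus).
    lam, V = np.linalg.eigh(A)
    return (V * f(lam)) @ V.T

def parallel_transport_spd(x, y, xi):
    # eta = E xi E^T, E = (y x^{-1})^{1/2}; the symmetric factorization
    # E = x^{1/2} (x^{-1/2} y x^{-1/2})^{1/2} x^{-1/2} avoids a nonsymmetric sqrt.
    xs  = _sym_fun(x, np.sqrt)                      # x^{1/2}
    xis = _sym_fun(x, lambda l: 1.0 / np.sqrt(l))   # x^{-1/2}
    E = xs @ _sym_fun(xis @ y @ xis, np.sqrt) @ xis
    return E @ xi @ E.T

eta = parallel_transport_spd(np.eye(2), np.diag([4.0, 1.0]),
                             np.array([[0.0, 1.0], [1.0, 0.0]]))
```

The transport is an isometry, $\langle \eta, \eta \rangle_y = \langle \xi, \xi \rangle_x$, which makes a good numerical check.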
See also
References

Sra, S. and Hosseini, R. (2015). Conic Geometric Optimization on the Manifold of Positive Definite Matrices. SIAM Journal on Optimization, 25(1), 713–739. doi:10.1137/140978168, arXiv:1312.1039.

Arsigny, V., Fillard, P., Pennec, X. and Ayache, N. (2005). Fast and simple calculus on tensors in the Log-Euclidean framework. International Conference on Medical Image Computing and Computer-Assisted Intervention, LNCS 3749, 115–122. doi:10.1007/11566465_15.

Bergmann, R., Fitschen, J. H., Persch, J. and Steidl, G. (2017). Priors with coupled first and second order differences for manifold-valued image processing. arXiv:1709.01343.

Bergmann, R., Persch, J. and Steidl, G. (2016). A parallel Douglas–Rachford algorithm for minimizing ROF-like functionals on images with values in symmetric Hadamard manifolds. SIAM Journal on Imaging Sciences, 9(3), 901–937. doi:10.1137/15M1052858, arXiv:1512.02814.
Moakher, M. and Batchelor, P. G. (2006). Symmetric Positive-Definite Matrices: From Geometry to Applications and Visualization. In: Weickert, J. and Hagen, H. (eds.), Visualization and Processing of Tensor Fields. Springer, Berlin, Heidelberg, 285–298. doi:10.1007/3-540-31272-2_17.