In the first part of the talk, we present some statistical elements for modeling curves and surfaces through tensors, inspired by the efficiency of the Tensor Voting framework. We show that a unit-trace tensor can encode the mean and the covariance of a random direction; a tensor field may thus be considered as a discrete encoding of the pdf of an oriented point. Then, we investigate what an oriented point tells about the neighboring surface (the equivalent of the voting field). Empirical evidence led us to propose stochastic curve models that can be partially generalized to surfaces. Very preliminary experiments on the registration of 3D surfaces suggest that these models capture quite well the implicit information available on the surface.
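As a minimal sketch of the encoding idea above (function names and the saliency measure are illustrative assumptions, not the talk's actual implementation): averaging the outer products of unit direction samples yields a unit-trace tensor whose dominant eigenvector is the mean orientation and whose eigenvalue spread reflects the concentration of the direction distribution.

```python
import numpy as np

def direction_tensor(directions):
    """Encode sample unit directions as a unit-trace second-moment tensor.

    Hypothetical helper: T = (1/N) sum_i v_i v_i^T. Since each v_i is a unit
    vector, trace(v v^T) = 1 and T has unit trace by construction.
    """
    V = np.asarray(directions, dtype=float)
    V = V / np.linalg.norm(V, axis=1, keepdims=True)  # re-normalize samples
    T = V.T @ V / len(V)                              # mean of outer products
    return T / np.trace(T)                            # unit trace (up to rounding)

def mean_direction_and_saliency(T):
    """Dominant eigenvector = mean orientation; eigenvalue gap = 'stick' saliency,
    a simple anisotropy measure in the spirit of Tensor Voting."""
    w, U = np.linalg.eigh(T)        # eigenvalues in ascending order
    return U[:, -1], w[-1] - w[-2]  # principal direction, concentration proxy
```

An isotropic direction distribution gives eigenvalues close to 1/d and near-zero saliency, while tightly clustered directions give one dominant eigenvalue, mirroring the mean/covariance interpretation above.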
In the second part of the talk, we present a framework we developed recently to work on the manifold of positive definite symmetric matrices (tensors) in an intrinsic way. The basic idea is to endow the tensor space with an affine-invariant Riemannian metric. We demonstrate that this leads to strong theoretical properties: the cone of positive definite symmetric matrices is replaced by a regular and complete manifold of non-positive curvature without boundaries (null eigenvalues are at infinity), the geodesic between two tensors and the mean of a set of tensors are uniquely defined, etc. On the basis of this metric, and using a few associated differential-geometry tools, we show how to generalize to tensor fields many important geometric data-processing algorithms, such as interpolation, filtering, diffusion, and restoration of missing data. We present applications of this framework to the regularization of Diffusion Tensor MR images and to the modeling of the variability of the brain.
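The uniquely defined geodesics and means mentioned above can be sketched with the standard closed forms for the affine-invariant metric (a minimal numpy illustration; the function names and the fixed-point iteration scheme are assumptions for this sketch, not the talk's code): the distance is the Frobenius norm of the matrix logarithm in the whitened frame, the geodesic follows a matrix power, and the Riemannian (Karcher) mean is found by iteratively averaging log-maps.

```python
import numpy as np

def _spd_fn(A, f):
    """Apply a scalar function to a symmetric matrix via its eigendecomposition."""
    w, U = np.linalg.eigh(A)
    return (U * f(w)) @ U.T

def ai_distance(A, B):
    """Affine-invariant distance d(A, B) = ||log(A^{-1/2} B A^{-1/2})||_F."""
    Ai = _spd_fn(A, lambda w: w ** -0.5)
    return np.linalg.norm(_spd_fn(Ai @ B @ Ai, np.log), 'fro')

def geodesic(A, B, t):
    """Point at parameter t on the unique geodesic from A (t=0) to B (t=1)."""
    Ah = _spd_fn(A, np.sqrt)
    Ai = _spd_fn(A, lambda w: w ** -0.5)
    return Ah @ _spd_fn(Ai @ B @ Ai, lambda w: w ** t) @ Ah

def frechet_mean(tensors, iters=50, tol=1e-12):
    """Riemannian mean via a fixed-point iteration: average the log-maps of all
    tensors at the current estimate, then exponentiate back onto the manifold."""
    X = np.mean(tensors, axis=0)  # initialize with the arithmetic mean
    for _ in range(iters):
        Xh = _spd_fn(X, np.sqrt)
        Xi = _spd_fn(X, lambda w: w ** -0.5)
        S = sum(_spd_fn(Xi @ T @ Xi, np.log) for T in tensors) / len(tensors)
        X = Xh @ _spd_fn(S, np.exp) @ Xh
        if np.linalg.norm(S, 'fro') < tol:  # gradient norm small: converged
            break
    return X
```

Interpolating two tensors with `geodesic(A, B, t)` for t in [0, 1] is exactly the metric-aware interpolation referred to above; unlike linear interpolation of matrix entries, it never leaves the positive definite cone and avoids the swelling of determinants.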