
Learning Theory for Estimation of Animal Motion Submanifolds

IEEE Conference on Decision and Control (CDC), 2020
Abstract

This paper describes the formulation and experimental testing of a novel method for the estimation and approximation of submanifold models of animal motion. It is assumed that the animal motion is supported on a configuration manifold $Q$ that is a smooth, connected, regularly embedded Riemannian submanifold of Euclidean space $X \approx \mathbb{R}^d$ for some $d > 0$, and that the manifold $Q$ is homeomorphic to a known smooth Riemannian manifold $S$. Estimation of the manifold is achieved by finding an unknown mapping $\gamma : S \rightarrow Q \subset X$ that maps the manifold $S$ into $Q$. The overall problem is cast as a distribution-free learning problem over the manifold of measurements $\mathbb{Z} = S \times X$. That is, it is assumed that experiments generate a finite set $\{(s_i, x_i)\}_{i=1}^m \subset \mathbb{Z}^m$ of samples that are drawn according to an unknown probability density $\mu$ on $\mathbb{Z}$. This paper derives approximations $\gamma_{n,m}$ of $\gamma$ that are based on the $m$ samples and are contained in an $N(n)$-dimensional space of approximants. The paper derives sufficient conditions that show that the rates of convergence in $L^2_\mu(S)$ correspond to those known for classical distribution-free learning theory over Euclidean space. Specifically, these conditions guarantee rates of convergence of the form
$$\mathbb{E}\left(\|\gamma_\mu^j - \gamma_{n,m}^j\|_{L^2_\mu(S)}^2\right) \leq C_1 N(n)^{-r} + C_2 \frac{N(n)\log(N(n))}{m}$$
for constants $C_1, C_2$, with $\gamma_\mu := \{\gamma^1_\mu, \ldots, \gamma^d_\mu\}$ the regressor function $\gamma_\mu : S \rightarrow Q \subset X$ and $\gamma_{n,m} := \{\gamma^1_{n,m}, \ldots, \gamma^d_{n,m}\}$.
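The estimation procedure the abstract describes can be illustrated with a minimal numerical sketch: draw $m$ samples $(s_i, x_i)$, then fit each coordinate function $\gamma^j$ by least squares over an $N(n)$-dimensional space of approximants. The sketch below is not the paper's construction; it assumes a hypothetical setting where $S$ is the circle $S^1$, $Q$ is an ellipse embedded in $\mathbb{R}^2$, and the approximant space is a truncated Fourier basis.

```python
import numpy as np

# Hedged sketch (not the paper's implementation): estimate an unknown
# embedding gamma: S -> Q in R^d from m noisy samples by least squares
# over an N(n)-dimensional space of approximants.

rng = np.random.default_rng(0)

def gamma_true(s):
    # Hypothetical ground-truth embedding of S^1 into R^2 (an ellipse).
    return np.stack([2.0 * np.cos(s), np.sin(s)], axis=-1)

def fourier_basis(s, n):
    # N(n) = 2n + 1 Fourier basis functions evaluated at the samples s.
    cols = [np.ones_like(s)]
    for k in range(1, n + 1):
        cols.append(np.cos(k * s))
        cols.append(np.sin(k * s))
    return np.stack(cols, axis=-1)  # shape (m, 2n + 1)

# m samples (s_i, x_i) from Z = S x X, with additive measurement noise.
m, n = 200, 5
s = rng.uniform(0.0, 2.0 * np.pi, size=m)
x = gamma_true(s) + 0.05 * rng.standard_normal((m, 2))

# Least-squares fit of each coordinate gamma^j in the basis span;
# the columns of coef are the d coordinate estimates gamma_{n,m}^j.
Phi = fourier_basis(s, n)
coef, *_ = np.linalg.lstsq(Phi, x, rcond=None)  # shape (2n + 1, d)

# Empirical squared L^2 error of gamma_{n,m} on fresh samples from S.
s_test = rng.uniform(0.0, 2.0 * np.pi, size=1000)
err = np.mean((fourier_basis(s_test, n) @ coef - gamma_true(s_test)) ** 2)
```

As the error bound suggests, growing the basis dimension $N(n)$ shrinks the approximation term while inflating the sampling term $N(n)\log(N(n))/m$, so $n$ must be balanced against the sample count $m$.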
