Abstract: This paper introduces a data-dependent approximation of the forward kinematics map for certain types of animal motion models. It is assumed that motions are supported on a low-dimensional, unknown configuration manifold $Q$ that is regularly embedded in the high-dimensional Euclidean space $X:=\mathbb{R}^d$. The paper introduces a method to estimate the forward kinematics map from the unknown configuration submanifold $Q$ to an $n$-dimensional Euclidean space $Y:=\mathbb{R}^n$ of observations. A reproducing kernel Hilbert space (RKHS) is defined over the ambient space $X$ in terms of a known kernel function, and all computations are carried out using this known kernel. Estimates are constructed from a data-dependent approximation of the Koopman operator defined in terms of the known kernel on $X$; however, the rate of convergence of the approximations is studied in the space of restrictions to the unknown manifold $Q$. Strong rates of convergence are derived in terms of the fill distance of the samples in the unknown configuration manifold, provided that a novel regularity result holds for the Koopman operator. Additionally, we show that the derived rates of convergence apply in some cases to estimates generated by the extended dynamic mode decomposition (EDMD) method. We illustrate characteristics of the estimates for simulated data as well as samples collected during motion capture experiments.
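Although the abstract does not prescribe an implementation, a minimal sketch may help fix ideas: given samples $x_i$ on the unknown manifold $Q$ and observed values $y_i\in Y$ of the map to be estimated, one can build a data-dependent estimate in the RKHS induced by a known kernel on the ambient space $X$. The Gaussian kernel, the width `sigma`, the Tikhonov parameter `lam`, and all names below are illustrative assumptions, not choices made by the paper.

```python
# Illustrative sketch: kernel-based estimate of a map y = F(x), with x sampled
# from an unknown manifold Q embedded in the ambient space X = R^d. The kernel
# is defined on all of X but is only ever evaluated at samples lying in Q.
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise Gaussian kernel matrix between rows of A and rows of B.
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-sq / (2 * sigma**2))

def fit_kernel_estimate(X_samples, Y_samples, sigma=1.0, lam=1e-8):
    # Regularized least squares in the RKHS spanned by k(., x_i):
    # the coefficient matrix C solves (K + lam*m*I) C = Y.
    m = X_samples.shape[0]
    K = gaussian_kernel(X_samples, X_samples, sigma)
    C = np.linalg.solve(K + lam * m * np.eye(m), Y_samples)
    return lambda X_new: gaussian_kernel(X_new, X_samples, sigma) @ C

# Usage: samples on the circle (a 1-D manifold in R^2), scalar observations.
theta = np.random.uniform(0, 2 * np.pi, 200)
Xs = np.stack([np.cos(theta), np.sin(theta)], 1)   # points on Q, a subset of R^2
Ys = np.sin(2 * theta)[:, None]                    # observed values in Y = R^1
F_hat = fit_kernel_estimate(Xs, Ys, sigma=0.5)
print(F_hat(Xs[:5]))                               # approximately Ys[:5]
```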
Abstract: This paper describes the formulation and experimental testing of a novel method for estimating and approximating submanifold models of animal motion. It is assumed that the animal motion is supported on a configuration manifold $Q$ that is a smooth, connected, regularly embedded Riemannian submanifold of Euclidean space $X\approx \mathbb{R}^d$ for some $d>0$, and that the manifold $Q$ is homeomorphic to a known smooth Riemannian manifold $S$. The manifold is estimated by finding an unknown mapping $\gamma:S\rightarrow Q\subset X$ that maps the manifold $S$ into $Q$. The overall problem is cast as a distribution-free learning problem over the manifold of measurements $\mathbb{Z}=S\times X$. That is, it is assumed that experiments generate a finite set $\{(s_i,x_i)\}_{i=1}^m\subset \mathbb{Z}^m$ of samples drawn according to an unknown probability measure $\mu$ on $\mathbb{Z}$. This paper derives approximations $\gamma_{n,m}$ of $\gamma$ that are based on the $m$ samples and are contained in an $N(n)$-dimensional space of approximants. The paper defines sufficient conditions under which the rates of convergence in $L^2_\mu(S)$ correspond to those known from classical distribution-free learning theory over Euclidean space. Specifically, the paper derives sufficient conditions that guarantee rates of convergence of the form $$\mathbb{E}\left(\|\gamma_\mu^j-\gamma_{n,m}^j\|_{L^2_\mu(S)}^2\right)\leq C_1 N(n)^{-r} + C_2 \frac{N(n)\log(N(n))}{m}$$ for constants $C_1,C_2$, where $\gamma_\mu:=\{\gamma^1_\mu,\ldots,\gamma^d_\mu\}$ is the regressor function $\gamma_\mu:S\rightarrow Q\subset X$ and $\gamma_{n,m}:=\{\gamma^1_{n,m},\ldots,\gamma^d_{n,m}\}$.
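As a concrete, hedged illustration of the estimation problem just described, the sketch below fits a componentwise least-squares estimate $\gamma_{n,m}$ over an $N(n)$-dimensional approximation space. Taking $S$ to be the circle $S^1$ and using trigonometric polynomials as approximants are assumptions made here for illustration; the abstract does not fix either choice.

```python
# Illustrative sketch: componentwise least-squares estimate gamma_{n,m} of the
# mapping gamma: S -> Q subset R^d, built from m samples (s_i, x_i). Here S is
# modeled as the circle S^1 and N(n) = 2n + 1 trigonometric basis functions
# serve as the space of approximants.
import numpy as np

def fourier_features(s, n):
    # Basis on S^1: 1, cos(ks), sin(ks) for k = 1..n; shape (m, N(n)).
    cols = [np.ones_like(s)]
    for k in range(1, n + 1):
        cols += [np.cos(k * s), np.sin(k * s)]
    return np.stack(cols, 1)

def fit_gamma(s_samples, x_samples, n):
    # Solve the least-squares problem for every coordinate gamma^j, j = 1..d,
    # simultaneously; coeffs has shape (N(n), d).
    Phi = fourier_features(s_samples, n)
    coeffs, *_ = np.linalg.lstsq(Phi, x_samples, rcond=None)
    return lambda s: fourier_features(s, n) @ coeffs

# Usage: m noisy samples of an embedding gamma: S^1 -> Q subset R^2, with the
# sample sites s_i standing in for draws from the unknown measure mu.
m = 500
s = np.random.uniform(0, 2 * np.pi, m)
x = np.stack([np.cos(s), np.sin(s)], 1) + 0.01 * np.random.randn(m, 2)
gamma_nm = fit_gamma(s, x, n=5)
print(gamma_nm(np.array([0.0, np.pi / 2])))   # approximately [[1, 0], [0, 1]]
```

In this toy setting, increasing $n$ enlarges $N(n)$ and shrinks the approximation-error term $C_1 N(n)^{-r}$ in the bound above, while the sampling-error term $C_2 N(n)\log(N(n))/m$ grows unless $m$ grows accordingly; the stated rate quantifies that trade-off.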