## **Equations of motion from a data series**
*(English)*
Zbl 0675.58026

This excellent article reviews and unifies various techniques developed in recent years for reconstructing dynamical systems from observed data. The guiding philosophy is similar to that of maximum likelihood summarization in parametric statistics (as opposed to mere ML estimation) as outlined in [B. Efron, Ann. Stat. 10, 340-356 (1982; Zbl 0494.62004)]: from a parametrized family of dynamical systems one should choose the one “closest” to the observed data. In the present context “closest” means that it minimizes the one-step prediction error variance \(\sigma^2_{obs}\).
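As a minimal sketch of this criterion (the series and the polynomial model family below are hypothetical stand-ins, not the authors' setup): fit a parametrized one-step map to a scalar series by least squares and take the residual variance as \(\sigma^2_{obs}\).

```python
import numpy as np

# Hypothetical observed series: logistic-map iterates stand in for real measurements.
x = np.empty(500)
x[0] = 0.4
for t in range(499):
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])

# Parametrized family: polynomial one-step maps x_{t+1} = a_0 + a_1 x_t + ... + a_k x_t^k,
# fitted by least squares.
def fit_poly_predictor(series, order):
    X = np.vander(series[:-1], order + 1, increasing=True)  # columns 1, x, x^2, ...
    coeffs, *_ = np.linalg.lstsq(X, series[1:], rcond=None)
    residuals = series[1:] - X @ coeffs
    return coeffs, residuals.var()  # second value: one-step prediction error variance

coeffs, sigma2_obs = fit_poly_predictor(x, order=2)
# For this noise-free quadratic map the quadratic fit is essentially exact,
# so sigma2_obs is numerically zero.
```

Within the family, the “closest” model is then simply the one whose fitted residual variance is smallest.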

Since it is a priori unknown whether quadratic, cubic or higher order terms are necessary for a realistic model of the observed dynamics, and what the state space dimension \(m_{bed}\) of a realistic model should be, and since higher approximation orders and higher state space dimensions usually yield better approximations at the cost of higher model complexity, the above philosophy is modified: one chooses a model which minimizes the model entropy, defined as the sum of \(\sigma^2_{bed}\) and a suitable numerical complexity measure of the model. (Observe that via \(\sigma^2_{bed}\) this quantity is data dependent.)

The steps in this procedure are:
1) Choose a procedure transforming data to points in a state space.
2) Estimate an upper bound for the embedding dimension \(m_{bed}\).
3) Find a model for the dynamics minimizing the model entropy.
4) Use this model to compute (usually by simulation) relevant dynamical quantities such as the Lyapunov spectrum, the information dimension and the metric entropy.
5) Use these quantities and \(\sigma^2_{obs}\) to estimate the extrinsic noise level of the observed dynamics.
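Steps 1), 3) and 4) can be sketched in miniature; the series below is a hypothetical logistic map at \(r = 4\), whose largest Lyapunov exponent is known to equal \(\ln 2\), so the fitted model's exponent can be checked against the true value.

```python
import numpy as np

# Hypothetical "observations": logistic-map iterates at r = 4.
x = np.empty(2000)
x[0] = 0.37
for t in range(1999):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

# Step 1: delay embedding of the scalar series (m = 1 already suffices here).
def delay_embed(series, m, tau=1):
    n = len(series) - (m - 1) * tau
    return np.column_stack([series[i * tau : i * tau + n] for i in range(m)])

s = delay_embed(x, m=1)[:, 0]

# Step 3 (simplified, no order selection): fit a quadratic one-step model.
X = np.vander(s[:-1], 3, increasing=True)      # columns 1, x, x^2
a, *_ = np.linalg.lstsq(X, s[1:], rcond=None)  # model: a0 + a1*x + a2*x^2

# Step 4: largest Lyapunov exponent of the *fitted* model, iterated from a fresh
# initial condition, as the time average of log|derivative| along the orbit.
y, logs = 0.21, []
for _ in range(5000):
    logs.append(np.log(abs(a[1] + 2.0 * a[2] * y)))
    y = a[0] + a[1] * y + a[2] * y * y
lyap = float(np.mean(logs[100:]))  # discard transient; expect roughly ln 2
```

Note that the exponent is computed by simulating the fitted model, not the data, which mirrors step 4) above.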

In the second part of the paper the authors present numerical examples.

Reviewer: G.Keller

### MSC:

| Code | Classification |
| --- | --- |
| 37D45 | Strange attractors, chaotic dynamics of systems with hyperbolic behavior |
| 37C75 | Stability theory for smooth dynamical systems |
| 93E12 | Identification in stochastic control theory |
| 37C70 | Attractors and repellers of smooth dynamical systems and their topological structure |