Lagrangian uncertainty quantification and information inequalities for
stochastic flows
We develop a systematic information-theoretic framework for the quantification and mitigation of error in probabilistic Lagrangian (i.e., trajectory-based) predictions obtained from (Eulerian) vector fields generating the underlying dynamical system, in a way that applies naturally in both deterministic and stochastic settings. This work is motivated by the desire to improve Lagrangian predictions in complex, multi-scale systems based on simplified, data-driven models. Here, discrepancies between the probability measures μ_t and ν_t associated with the true dynamics and its approximation are quantified via so-called φ-divergences, D_φ(μ_t‖ν_t), which are premetrics defined by a class of strictly convex functions φ. We derive general information bounds on the uncertainty in estimates, E^{ν_t}[f], of `true' observables, E^{μ_t}[f], in terms of φ-divergences; we then derive two distinct bounds on D_φ(μ_t‖ν_t) itself. First, an analytically tractable bound on D_φ(μ_t‖ν_t) is derived from differences between the vector fields generating the true dynamics and its approximations. The second bound on D_φ(μ_t‖ν_t) is based on a difference of so-called finite-time divergence rate (FTDR) fields, and it can be exploited within a computational framework to mitigate the error in Lagrangian predictions by tuning the fields of expansion rates obtained from simplified models. This new framework provides a systematic link between Eulerian (field-based) model error and the resulting uncertainty in Lagrangian (trajectory-based) predictions.
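As a concrete illustration of the divergence family the abstract refers to (this sketch is not from the paper itself): for discrete measures, a φ-divergence takes the form D_φ(μ‖ν) = Σ_i ν_i φ(μ_i/ν_i) for a strictly convex φ with φ(1) = 0. The function name `phi_divergence` and the example measures below are hypothetical; choosing φ(x) = x log x recovers the Kullback–Leibler divergence as a special case.

```python
import numpy as np

def phi_divergence(mu, nu, phi):
    """Discrete phi-divergence D_phi(mu || nu) = sum_i nu_i * phi(mu_i / nu_i).

    Assumes mu and nu are probability vectors with nu_i > 0 wherever mu_i > 0.
    """
    mu = np.asarray(mu, dtype=float)
    nu = np.asarray(nu, dtype=float)
    return float(np.sum(nu * phi(mu / nu)))

# phi(x) = x*log(x) is strictly convex with phi(1) = 0; it yields the KL divergence.
kl_phi = lambda x: x * np.log(x)

mu = [0.5, 0.5]   # "true" measure (illustrative)
nu = [0.9, 0.1]   # approximate measure (illustrative)
d = phi_divergence(mu, nu, kl_phi)  # equals KL(mu || nu), positive since mu != nu
```

Since φ is strictly convex with φ(1) = 0, Jensen's inequality gives D_φ(μ‖ν) ≥ 0 with equality iff μ = ν, which is what makes these premetrics usable as error measures for the approximate dynamics.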