ResearchTrend.AI

arXiv:1611.06173

Empirical risk minimization and complexity of dynamical models

18 November 2016
K. McGoff
A. Nobel
Abstract

A dynamical model consists of a continuous self-map $T: \mathcal{X} \to \mathcal{X}$ of a compact state space $\mathcal{X}$ and a continuous observation function $f: \mathcal{X} \to \mathbb{R}$. This paper considers the fitting of a parametrized family of dynamical models to an observed real-valued stochastic process using empirical risk minimization. The limiting behavior of the minimum risk parameters is studied in a general setting. We establish a general convergence theorem for minimum risk estimators and ergodic observations. We then study conditions under which empirical risk minimization can effectively separate the signal from the noise in an additive observational noise model. The key condition in the latter results is that the family of dynamical models has limited complexity, which is quantified through a notion of entropy for families of infinite sequences. Close connections between entropy and limiting average mean widths for stationary processes are established.
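As a rough illustration of the fitting problem the abstract describes, the sketch below selects a parameter of a logistic-map family $T_\theta(x) = \theta x(1-x)$ from noisy observations by minimizing an empirical squared risk over a grid. This is a simplified one-step-prediction variant, not the paper's orbit-based risk formulation, and every specific here (the logistic family, the true parameter, the noise level, the grid) is a hypothetical choice for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def logistic_orbit(theta, x0, n):
    """Iterate the logistic map T_theta(x) = theta * x * (1 - x) for n steps."""
    xs = np.empty(n)
    x = x0
    for t in range(n):
        xs[t] = x
        x = theta * x * (1 - x)
    return xs

# Hypothetical ground truth: a chaotic logistic orbit observed through
# additive Gaussian noise (the abstract's additive observational noise model).
n, theta_true = 2000, 3.7
states = logistic_orbit(theta_true, x0=0.2, n=n)
y = states + 0.02 * rng.standard_normal(n)

def empirical_risk(theta):
    """Average one-step squared prediction error of T_theta on the observations."""
    pred = theta * y[:-1] * (1 - y[:-1])
    return np.mean((y[1:] - pred) ** 2)

# Empirical risk minimization over a parameter grid.
grid = np.linspace(3.0, 4.0, 201)
theta_hat = grid[int(np.argmin([empirical_risk(th) for th in grid]))]
print(theta_hat)  # close to theta_true = 3.7
```

Under these assumptions the minimizer lands near the true parameter; the paper's contribution is a general theory of when such minimum-risk estimates converge for ergodic observations and families of limited entropy.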
