
arXiv:1212.3647

Maximally informative models and diffeomorphic modes in problems of parametric inference

15 December 2012
J. Kinney
G. Atwal
Abstract

Motivated by data-rich experiments in transcriptional regulation and sensory neuroscience, we consider the following general problem in statistical inference. A system of interest, when exposed to a stimulus S, adopts a deterministic response R of which a noisy measurement M is made. Given a large number of measurements and corresponding stimuli, we wish to identify the correct "response function" relating R to S. However, the "noise function" relating M to R is unknown a priori. Here we show that maximizing likelihood over both response functions and noise functions is equivalent to simply identifying maximally informative response functions -- ones that maximize the mutual information I[R;M] between predicted responses and corresponding measurements. Moreover, if the correct response function is in the class of models being explored, maximizing mutual information becomes equivalent to simultaneously maximizing every dependence measure that satisfies the Data Processing Inequality. We note that experiments of the type considered are unable to distinguish between parametrized response functions lying along certain "diffeomorphic modes" in parameter space. We show how to derive these diffeomorphic modes and observe, fortunately, that such modes typically span a very low-dimensional subspace of parameter space. Therefore, given sufficient data, maximizing mutual information can pinpoint nearly all response function parameters without requiring any model of experimental noise.
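The central claim -- that maximally informative response functions can be identified with no model of the measurement noise -- can be sketched numerically. The sigmoidal response family, the receptive-field angle `phi`, and the noise level below are illustrative assumptions, not the paper's actual experimental setup; mutual information is estimated with a simple histogram. Note that the overall gain and offset of the sigmoid are examples of diffeomorphic modes (monotone reparametrizations leave I[R;M] unchanged), so only the direction parameter is searched over.

```python
import numpy as np

def mutual_information(r, m, bins=20):
    """Histogram estimate of the mutual information I[R; M] in nats."""
    joint, _, _ = np.histogram2d(r, m, bins=bins)
    p = joint / joint.sum()
    pr = p.sum(axis=1, keepdims=True)   # marginal distribution of R
    pm = p.sum(axis=0, keepdims=True)   # marginal distribution of M
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / (pr @ pm)[mask])))

# Hypothetical model class: sigmoidal response to a 2D stimulus whose
# receptive-field direction phi is the parameter of interest. Gain and
# offset are diffeomorphic modes, so they are fixed to 1 and 0.
def response(s, phi):
    return 1.0 / (1.0 + np.exp(-(np.cos(phi) * s[:, 0] + np.sin(phi) * s[:, 1])))

rng = np.random.default_rng(0)
s = rng.normal(size=(5000, 2))                   # stimuli
true_phi = 0.8
m = response(s, true_phi) + 0.1 * rng.normal(size=len(s))  # noisy measurements

# Identify the maximally informative response function by grid search,
# without any model of the measurement noise. The grid covers a half
# circle because phi and phi + pi are related by the monotone map
# r -> 1 - r, another diffeomorphic mode.
grid = np.linspace(0.0, np.pi, 60, endpoint=False)
best_phi = max(grid, key=lambda phi: mutual_information(response(s, phi), m))
```

Under these assumptions `best_phi` lands near `true_phi`, while a response function pointed orthogonally to the true receptive field carries almost no information about the measurements.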
