

Active Learning of Spin Network Models

25 March 2019
Jialong Jiang
David A. Sivak
Matt Thomson
arXiv: 1903.10474
Abstract

The inverse statistical problem of finding direct interactions in complex networks is difficult. In the experimental sciences, well-controlled perturbations can be applied to a system, probing the internal structure of the network. We therefore propose a general mathematical framework for inference with iteratively applied perturbations to a network. Formulating active learning in the language of information geometry, our framework quantifies both the difficulty of inference and the information gained from perturbations through the curvature of the underlying parameter manifold, as measured by the empirical Fisher information. Perturbations are then chosen that most reduce the variance of the Bayesian posterior. We apply this framework to a specific probabilistic graphical model in which the nodes of the network are modeled as binary variables, "spins" with Ising-form pairwise interactions. With this strategy, we significantly improve the accuracy and efficiency of inference from a reasonable number of experimental queries for medium-sized networks. Our active learning framework could be powerful for the analysis of complex networks as well as for the rational design of experiments.
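
The selection step described in the abstract (pick the next perturbation expected to shrink the Bayesian posterior the most, with informativeness measured through the Fisher information of the Ising parameters) can be illustrated with a small sketch. The code below is not the authors' implementation: the network size, the single-spin field perturbations used as candidates, and the use of the Fisher-information trace over the couplings as the selection score are illustrative assumptions only.

import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 5  # number of spins, kept small so the distribution can be enumerated exactly

def boltzmann(J, h):
    """Exact Boltzmann distribution over all 2^n configurations of +/-1 spins."""
    states = np.array(list(itertools.product([-1, 1], repeat=n)))
    energies = -0.5 * np.einsum('ki,ij,kj->k', states, J, states) - states @ h
    weights = np.exp(-energies)
    return states, weights / weights.sum()

def coupling_fisher_trace(J, h):
    """Trace of the Fisher information for the couplings J_ij (i < j).
    For the Ising exponential family this is the sum over pairs of Var[s_i s_j]."""
    states, p = boltzmann(J, h)
    total = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            mean = p @ (states[:, i] * states[:, j])
            total += 1.0 - mean ** 2  # s_i s_j is +/-1, so Var = 1 - mean^2
    return total

# Current (imperfect) estimate of the couplings, e.g. from earlier queries.
J_est = np.triu(rng.normal(0.0, 0.5, (n, n)), 1)
J_est = J_est + J_est.T

# Candidate perturbations: a strong external field applied to one spin at a time.
candidates = [2.0 * np.eye(n)[i] for i in range(n)]

# Select the perturbation whose perturbed model is most informative about J.
scores = [coupling_fisher_trace(J_est, dh) for dh in candidates]
best = int(np.argmax(scores))
print(f"next query: perturb spin {best} (Fisher trace = {scores[best]:.3f})")

In a full active-learning loop, the chosen perturbation would be applied experimentally, the resulting samples used to update the posterior over the couplings, and the selection step repeated with the refined estimate.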
