ResearchTrend.AI

arXiv:1911.07142 (v2, latest)

Bayesian Model Selection for High-Dimensional Ising Models, With Applications to Educational Data

17 November 2019
Jaewoo Park
Ick Hoon Jin
M. Schweinberger
Abstract

Doubly-intractable posterior distributions arise in many applications of statistics concerned with discrete and dependent data, including physics, spatial statistics, machine learning, the social sciences, and other fields. A specific example is psychometrics, which has adapted high-dimensional Ising models from machine learning, with a view to studying the interactions among binary item responses in educational assessments. To estimate high-dimensional Ising models from educational assessment data, ℓ1-penalized nodewise logistic regressions have been used. Theoretical results in high-dimensional statistics show that ℓ1-penalized nodewise logistic regressions can recover the true interaction structure with high probability, provided that certain assumptions are satisfied. Those assumptions are hard to verify in practice and may be violated, and quantifying the uncertainty about the estimated interaction structure and parameter estimators is challenging. We propose a Bayesian approach that helps quantify the uncertainty about the interaction structure and parameters without requiring strong assumptions, and can be applied to Ising models with thousands of parameters. We demonstrate the advantages of the proposed Bayesian approach compared with ℓ1-penalized nodewise logistic regressions by simulation studies and applications to small and large educational data sets with up to 2,485 parameters. Among other things, the simulation studies suggest that the Bayesian approach is more robust against model misspecification due to omitted covariates than ℓ1-penalized nodewise logistic regressions.
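The abstract contrasts the proposed Bayesian approach with ℓ1-penalized nodewise logistic regressions. As background, the following is a minimal sketch of how that frequentist baseline recovers an Ising model's interaction structure: each binary node is regressed on all other nodes with an ℓ1 penalty, and nonzero coefficients are read as edges (neighborhood selection in the Meinshausen-Bühlmann style). This sketch uses scikit-learn; the function name, regularization constant, and the AND rule for combining the two directed selections are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def nodewise_ising_selection(X, C=0.05):
    """Estimate the interaction structure of an Ising model by
    l1-penalized nodewise logistic regression.

    X : (n, p) binary {0,1} data matrix (e.g. item responses).
    C : inverse regularization strength (smaller -> sparser graph).
    Returns a (p, p) boolean adjacency matrix, combined by the
    AND rule: an edge (j, k) is kept only if node j selects k
    AND node k selects j.
    """
    n, p = X.shape
    neighbors = np.zeros((p, p), dtype=bool)
    for j in range(p):
        y = X[:, j]                     # current node as response
        Z = np.delete(X, j, axis=1)     # all other nodes as predictors
        if y.min() == y.max():          # degenerate item: no variation
            continue
        clf = LogisticRegression(penalty="l1", solver="liblinear", C=C)
        clf.fit(Z, y)
        idx = np.delete(np.arange(p), j)
        neighbors[j, idx] = clf.coef_.ravel() != 0
    return neighbors & neighbors.T      # symmetrize with the AND rule

# Toy usage on independent binary data (no true edges).
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(200, 6))
A = nodewise_ising_selection(X)
```

The Bayesian approach in the paper targets exactly the uncertainty this point estimate leaves unquantified: a single adjacency matrix carries no measure of confidence in the selected edges.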
