
arXiv:2007.10612

Backfitting for large scale crossed random effects regressions

21 July 2020
Swarnadip Ghosh
Trevor Hastie
Art B. Owen
Abstract

Regression models with crossed random effect errors can be very expensive to compute. The cost of both generalized least squares and Gibbs sampling can easily grow as $N^{3/2}$ (or worse) for $N$ observations. Papaspiliopoulos et al. (2020) present a collapsed Gibbs sampler that costs $O(N)$, but under an extremely stringent sampling model. We propose a backfitting algorithm to compute a generalized least squares estimate and prove that it costs $O(N)$. A critical part of the proof is in ensuring that the number of iterations required is $O(1)$, which follows from keeping a certain matrix norm below $1-\delta$ for some $\delta>0$. Our conditions are greatly relaxed compared to those for the collapsed Gibbs sampler, though still strict. Empirically, the backfitting algorithm has a norm below $1-\delta$ under conditions that are less strict than those in our assumptions. We illustrate the new algorithm on a ratings data set from Stitch Fix.
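To give a concrete feel for why backfitting can cost $O(N)$ per sweep, the following is a minimal illustrative sketch, not the paper's actual algorithm. It fits a crossed random effects model $y_{ij} = a_i + b_j + e_{ij}$ by alternating shrunken-mean updates of the row effects and column effects; each update is a single pass over the $N$ observations via `np.bincount`. The function name `backfit_crossed`, the variance-component arguments, and the convergence tolerance are all assumptions made for this sketch.

```python
import numpy as np

def backfit_crossed(y, row, col, n_a, n_b, sigma2_a, sigma2_b, sigma2_e,
                    n_iter=100, tol=1e-8):
    """Illustrative backfitting sketch (not the paper's exact method)
    for y_ij = a_row + b_col + noise with crossed random effects.

    Alternates ridge/shrunken-mean updates of the row effects a and the
    column effects b. Each half-sweep is one O(N) pass over the data.
    """
    a = np.zeros(n_a)
    b = np.zeros(n_b)
    n_per_row = np.bincount(row, minlength=n_a)
    n_per_col = np.bincount(col, minlength=n_b)
    lam_a = sigma2_e / sigma2_a  # shrinkage from assumed variance components
    lam_b = sigma2_e / sigma2_b
    for _ in range(n_iter):
        a_old, b_old = a.copy(), b.copy()
        # Update row effects given current column effects: shrunken
        # per-row means of the partial residuals y - b[col].
        resid = y - b[col]
        a = np.bincount(row, weights=resid, minlength=n_a) / (n_per_row + lam_a)
        # Update column effects given current row effects.
        resid = y - a[row]
        b = np.bincount(col, weights=resid, minlength=n_b) / (n_per_col + lam_b)
        if max(np.abs(a - a_old).max(), np.abs(b - b_old).max()) < tol:
            break
    return a, b
```

Because each iteration only touches the data through `bincount` and indexed subtraction, the total cost is (number of iterations) × $O(N)$; the paper's contribution is proving that, under its conditions, the iteration count stays $O(1)$ as $N$ grows.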
