Transformers Handle Endogeneity in In-Context Linear Regression

2 October 2024
Haodong Liang
Krishnakumar Balasubramanian
Lifeng Lai
Abstract

We explore the capability of transformers to address endogeneity in in-context linear regression. Our main finding is that transformers inherently possess a mechanism to handle endogeneity effectively using instrumental variables (IV). First, we demonstrate that the transformer architecture can emulate a gradient-based bi-level optimization procedure that converges to the widely used two-stage least squares (\textsf{2SLS}) solution at an exponential rate. Next, we propose an in-context pretraining scheme and provide theoretical guarantees showing that the global minimizer of the pretraining loss achieves a small excess loss. Our extensive experiments validate these theoretical findings, showing that the trained transformer provides more robust and reliable in-context predictions and coefficient estimates than the \textsf{2SLS} method in the presence of endogeneity.
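The \textsf{2SLS} solution that the abstract refers to has a simple closed form: project the endogenous regressors onto the instrument space, then run ordinary least squares on the fitted values. A minimal numpy sketch on synthetic data (the data-generating coefficients below are illustrative assumptions, not from the paper) shows why OLS is biased under endogeneity while 2SLS recovers the true coefficient:

```python
import numpy as np

def two_stage_least_squares(X, Z, y):
    """Closed-form 2SLS: project X onto the column space of the
    instruments Z (stage 1), then regress y on the fitted values (stage 2)."""
    # Stage 1: X_hat = P_Z X, the projection of X onto col(Z)
    X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]
    # Stage 2: OLS of y on the stage-1 fitted values
    beta, *_ = np.linalg.lstsq(X_hat, y, rcond=None)
    return beta

# Synthetic endogenous design: the regressor x is correlated with the
# structural error u, so OLS is inconsistent; z is a valid instrument
# (correlated with x, independent of u).
rng = np.random.default_rng(0)
n = 20_000
z = rng.normal(size=(n, 1))                              # instrument
u = rng.normal(size=(n, 1))                              # structural error
x = 0.8 * z + 0.6 * u + 0.1 * rng.normal(size=(n, 1))   # endogenous regressor
y = 2.0 * x + u                                          # true coefficient: 2.0

beta_ols, *_ = np.linalg.lstsq(x, y, rcond=None)
beta_2sls = two_stage_least_squares(x, z, y)
print(beta_ols.item())   # biased upward, well above 2.0
print(beta_2sls.item())  # close to the true value 2.0
```

In this one-regressor, one-instrument case, 2SLS coincides with the classical IV estimator $(z^\top y)/(z^\top x)$; the bi-level optimization procedure the paper analyzes converges to the same solution iteratively rather than in closed form.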

@article{liang2025_2410.01265,
  title={Transformers Handle Endogeneity in In-Context Linear Regression},
  author={Haodong Liang and Krishnakumar Balasubramanian and Lifeng Lai},
  journal={arXiv preprint arXiv:2410.01265},
  year={2025}
}