
arXiv:1810.08264
Quantile Regression Under Memory Constraint

18 October 2018
Xi Chen
Weidong Liu
Yichen Zhang
Abstract

This paper studies the inference problem in quantile regression (QR) for a large sample size n under a limited memory constraint, where the memory can only store a small batch of data of size m. A natural method is the naïve divide-and-conquer approach, which splits the data into batches of size m, computes the local QR estimator for each batch, and then aggregates the estimators via averaging. However, this method works only when n = o(m^2) and is computationally expensive. This paper proposes a computationally efficient method that requires only an initial QR estimator on a small batch of data and then successively refines the estimator via multiple rounds of aggregation. Theoretically, as long as n grows polynomially in m, we establish the asymptotic normality of the obtained estimator and show that our estimator, with only a few rounds of aggregation, achieves the same efficiency as the QR estimator computed on all the data. Moreover, our result allows the dimensionality p to go to infinity. The proposed method can also be applied to the QR problem in a distributed computing environment (e.g., a large-scale sensor network) or for real-time streaming data.
