Boosting hazard regression with time-varying covariates

27 January 2017
Donald K. K. Lee, Ningyuan Chen
arXiv:1701.07926 (abs · PDF · HTML)
Abstract

Consider a left-truncated right-censored survival process whose evolution depends on time-varying covariates. Given functional data samples from the process, we propose a gradient boosting procedure for estimating its log-intensity function in a flexible manner to capture time-covariate interactions. The estimator is shown to be consistent if the model is correctly specified. Alternatively, an oracle inequality can be demonstrated for tree-based models. We use the procedure to shed new light on a question from the operations literature concerning the effect of workload on service rates in an emergency department. To avoid overfitting, boosting employs several regularization devices. One of them is step-size restriction, but the rationale for this is somewhat mysterious from the viewpoint of consistency: In theoretical treatments of classification and regression problems, unrestricted greedy step-sizes appear to suffice. Given that the partial log-likelihood functional for hazard regression has unbounded curvature, our study suggests that step-size restriction might be a mechanism for preventing the curvature of the risk from derailing convergence.
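
The abstract describes the procedure only at a high level. As a rough illustration of the mechanics, and not the authors' algorithm, the sketch below boosts a log-intensity F(t, x) on counting-process data using the standard piecewise-exponential (Poisson) representation: each at-risk interval contributes exposure·exp(F) − event·F to the negative log-likelihood, a shallow regression tree is fit to the negative gradient each round, and every update is scaled by a restricted step size ν < 1. The toy data, feature construction, and all names (nu, n_rounds, max_depth) are assumptions made for this example; the paper's estimator, theory, and implementation details are in the linked arXiv article.

```python
# Illustrative sketch only (not the authors' implementation): functional
# gradient boosting of a log-intensity F(t, x) on counting-process data,
# written in the standard piecewise-exponential (Poisson) form. Each row is
# one at-risk interval (start, stop] with covariates held constant on it and
# an indicator for whether an event occurred at `stop`.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# --- toy counting-process data (purely synthetic) -------------------------
n = 2000
start = rng.uniform(0.0, 1.0, n)            # left-truncation / interval start times
stop = start + rng.uniform(0.05, 1.0, n)    # end of each at-risk interval
event = rng.integers(0, 2, n)               # 1 = event at `stop`, 0 = censored
x = rng.normal(size=(n, 2))                 # covariate values on the interval

t_mid = 0.5 * (start + stop)                # representative time for the interval
exposure = stop - start                     # at-risk time on the interval
features = np.column_stack([t_mid, x])      # learn F jointly in (t, x) to allow interactions

# --- gradient boosting with step-size restriction -------------------------
nu = 0.1          # restricted (shrunken) step size
n_rounds = 200
F = np.zeros(n)   # current fit of the log-intensity at each interval

for _ in range(n_rounds):
    # Negative gradient of the negative log-likelihood
    #   sum_i [ exposure_i * exp(F_i) - event_i * F_i ]
    # with respect to F_i.
    residual = event - exposure * np.exp(F)

    # Weak learner: shallow regression tree fit to the pseudo-residuals.
    tree = DecisionTreeRegressor(max_depth=3)
    tree.fit(features, residual)

    # Restricted step: scale the greedy update by nu < 1.
    F += nu * tree.predict(features)

# Estimated intensity on each interval: lambda_hat = exp(F_hat)
print("mean estimated intensity:", np.exp(F).mean())
```

In this toy loop the restricted step size keeps exp(F) from changing too sharply between rounds, which is one informal way to read the abstract's suggestion that shrinkage counteracts the unbounded curvature of the hazard-regression risk.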
