
arXiv:1701.07926

Boosted nonparametric hazards with time-dependent covariates

27 January 2017
Donald K. K. Lee, Ningyuan Chen, H. Ishwaran
Abstract

Given functional data samples from a survival process with time-dependent covariates, we propose a functional gradient boosting procedure for estimating its hazard function nonparametrically. The estimator is consistent if the model is correctly specified; alternatively, an oracle inequality can be demonstrated for tree-based models. To avoid overfitting, boosting employs several regularization devices. One of them is step-size restriction, but the rationale for this is somewhat mysterious from the viewpoint of consistency. Our convergence bounds bring some clarity to this issue by revealing that step-size restriction is a mechanism for preventing the curvature of the risk from derailing convergence.
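The step-size restriction the abstract refers to is the shrinkage factor applied to each boosting update. The sketch below is not the authors' hazard estimator; it is a minimal, generic illustration of functional gradient boosting with regression stumps under squared-error loss, showing where the damping factor `nu` enters (the names `fit_stump`, `boost`, and the choices `nu=0.1`, `n_rounds=200` are illustrative assumptions).

```python
import numpy as np

def fit_stump(x, r):
    # Best single-split regression stump fit to the residuals r:
    # scan candidate split points and keep the one with least squared error.
    best_err, best_stump = np.inf, None
    for s in np.unique(x):
        left, right = r[x <= s], r[x > s]
        if len(left) == 0 or len(right) == 0:
            continue
        lm, rm = left.mean(), right.mean()
        err = ((left - lm) ** 2).sum() + ((right - rm) ** 2).sum()
        if err < best_err:
            best_err, best_stump = err, (s, lm, rm)
    return best_stump

def boost(x, y, n_rounds=200, nu=0.1):
    """Gradient boosting under squared-error loss.

    nu is the step-size restriction (shrinkage): each weak learner's
    contribution is damped by nu before being added to the ensemble.
    """
    f = np.zeros_like(y, dtype=float)
    stumps = []
    for _ in range(n_rounds):
        r = y - f                        # negative gradient of squared error
        s, lm, rm = fit_stump(x, r)
        step = np.where(x <= s, lm, rm)  # weak learner's prediction
        f += nu * step                   # damped (step-size-restricted) update
        stumps.append((s, lm, rm))
    return f, stumps
```

With `nu = 1` each round fits the residuals greedily; shrinking `nu` toward zero trades per-round progress for a smoother, better-regularized ensemble, which is the mechanism the paper's convergence bounds analyze.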
