Optimal Prediction in an Additive Functional Model

The functional generalized additive model (FGAM) provides a more flexible nonlinear functional regression model than the well-studied functional linear model. This paper restricts attention to the FGAM with identity link and additive errors, which we call the additive functional model; it generalizes the functional linear model. We study the minimax rate of convergence of predictions from the additive functional model in the framework of reproducing kernel Hilbert spaces. The optimal rate is shown to be determined by the decay rate of the eigenvalues of a specific kernel function, which is in turn determined by the reproducing kernel and the joint distribution of any two points of the random predictor function. In the special case of the functional linear model, this kernel function is jointly determined by the covariance function of the predictor function and the reproducing kernel. We show that an easily implementable roughness-regularized predictor achieves the optimal rate of convergence. Numerical studies illustrate the merits of the predictor; our simulations and real-data examples demonstrate competitive performance against an existing approach.
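To make the model class concrete, a standard formulation of the FGAM, and of the additive functional model obtained under an identity link with additive errors, can be sketched as follows (the symbols $F$, $\beta$, and $\varepsilon$ are generic notation, not necessarily those used in the paper):

```latex
% FGAM: a scalar response Y linked to a bivariate surface F
% integrated over the random predictor function X(t), t in [0,1].
\mathbb{E}\bigl[Y \mid X\bigr]
  = g\!\left( \int_0^1 F\bigl(X(t),\, t\bigr)\, dt \right)

% Additive functional model: identity link g and additive error.
Y = \int_0^1 F\bigl(X(t),\, t\bigr)\, dt + \varepsilon

% Special case: the functional linear model, recovered when
% F(x, t) = x\,\beta(t) for a coefficient function \beta.
Y = \int_0^1 X(t)\,\beta(t)\, dt + \varepsilon
```

Taking $F(x,t) = x\,\beta(t)$ shows directly how the additive functional model nests the functional linear model as a special case.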