One-parameter statistical model for linear stochastic differential equation with time delay

Abstract

Assume that we observe a stochastic process $(X(t))_{t\in[-r,T]}$ which satisfies the linear stochastic delay differential equation \[ \mathrm{d} X(t) = \vartheta \int_{[-r,0]} X(t + u) \, a(\mathrm{d} u) \, \mathrm{d} t + \mathrm{d} W(t) , \qquad t \geq 0 , \] where $a$ is a finite signed measure on $[-r, 0]$. The local asymptotic properties of the likelihood function are studied. Local asymptotic normality is proved in the case $v_\vartheta^* < 0$, local asymptotic quadraticity is shown if $v_\vartheta^* = 0$, and, under some additional conditions, local asymptotic mixed normality or periodic local asymptotic mixed normality holds if $v_\vartheta^* > 0$, where $v_\vartheta^*$ is an appropriately defined quantity. As an application, the asymptotic behaviour of the maximum likelihood estimator $\widehat{\vartheta}_T$ of $\vartheta$ based on $(X(t))_{t\in[-r,T]}$ can be derived as $T \to \infty$.
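To make the model concrete, the following minimal Euler–Maruyama sketch simulates a sample path of the equation in the special case where the measure $a$ is a unit point mass at $-r$, so that the drift reduces to $\vartheta X(t-r)$. This special case, the zero initial segment, and all parameter values are illustrative assumptions, not taken from the paper, which allows an arbitrary finite signed measure $a$ on $[-r,0]$.

```python
import numpy as np

def simulate_delay_sde(theta=-0.5, r=1.0, T=10.0, dt=0.01, seed=0):
    """Euler-Maruyama sketch of dX(t) = theta * X(t - r) dt + dW(t),
    i.e. the special case where a is a unit point mass at -r."""
    rng = np.random.default_rng(seed)
    n_hist = int(round(r / dt))        # grid points covering [-r, 0)
    n = int(round(T / dt))             # grid points covering (0, T]
    x = np.zeros(n_hist + n + 1)       # X on the grid -r, ..., 0, ..., T
    # initial segment X(t) = 0 for t in [-r, 0] (a simple illustrative choice)
    for k in range(n_hist, n_hist + n):
        drift = theta * x[k - n_hist]  # theta * X(t - r)
        x[k + 1] = x[k] + drift * dt + np.sqrt(dt) * rng.standard_normal()
    return x

path = simulate_delay_sde()
print(path.shape)  # one value per grid point on [-r, T]
```

With $\vartheta < 0$ (as in the local asymptotic normality regime $v_\vartheta^* < 0$), simulated paths tend to remain stable around zero; positive $\vartheta$ can produce explosive behaviour, in line with the different asymptotic regimes described above.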
