Restricted distance-type Gaussian estimators based on density power divergence and their applications in hypothesis testing

Zhang (2019) presented a general estimation approach based on the Gaussian distribution for parametric models in which the likelihood of the data is difficult to obtain or unknown, but the mean and variance-covariance matrix are known. Castilla and Zografos (2021) extended the method to density power divergence-based estimators, which are more robust than the likelihood-based Gaussian estimator against data contamination. Here, we present the restricted minimum density power divergence Gaussian estimator (MDPDGE) and study its asymptotic and robustness properties through its asymptotic distribution and influence function, respectively. Restricted estimators are required in many practical situations, and here they yield estimators constrained to satisfy inherent restrictions of the underlying model. Further, we derive robust Rao-type test statistics based on the MDPDGE for testing composite null hypotheses, and we deduce explicit expressions for some important distributions. Finally, we empirically evaluate the efficiency and robustness of the method through a simulation study.
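To illustrate the idea behind density power divergence-based Gaussian estimation, the following is a minimal sketch of a minimum density power divergence estimator for a univariate normal model. It is a simplified illustration only, not the paper's restricted multivariate MDPDGE: the model, the tuning parameter `alpha`, and the optimizer choice (`scipy.optimize.minimize` with Nelder-Mead) are assumptions made here for demonstration.

```python
import numpy as np
from scipy.optimize import minimize

def dpd_gaussian_loss(params, x, alpha):
    """DPD objective (up to an additive constant) for a univariate N(mu, sigma^2)."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)  # parameterize via log-sigma so sigma stays positive
    # Closed-form integral term: int f^(1+alpha) dx = (2*pi*sigma^2)^(-alpha/2) / sqrt(1+alpha)
    integral = (2.0 * np.pi * sigma**2) ** (-alpha / 2) / np.sqrt(1.0 + alpha)
    # Empirical term: (1 + 1/alpha) * mean of f(x_i)^alpha
    dens_pow = (2.0 * np.pi * sigma**2) ** (-alpha / 2) * np.exp(
        -alpha * (x - mu) ** 2 / (2.0 * sigma**2)
    )
    return integral - (1.0 + 1.0 / alpha) * np.mean(dens_pow)

def mdpde_normal(x, alpha=0.5):
    """Minimize the DPD loss; alpha controls the robustness/efficiency trade-off."""
    # Robust starting values (median, log of sample std) to aid convergence
    res = minimize(dpd_gaussian_loss,
                   x0=[np.median(x), np.log(np.std(x))],
                   args=(x, alpha), method="Nelder-Mead")
    mu_hat, log_sigma_hat = res.x
    return mu_hat, np.exp(log_sigma_hat)
```

On a contaminated sample, larger `alpha` downweights outlying observations, so the fitted location stays closer to the bulk of the data than the (non-robust) sample mean; `alpha -> 0` recovers the likelihood-based Gaussian estimator.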