Estimators derived from a divergence criterion, such as $\varphi$-divergences, are generally more robust than maximum likelihood estimators. We are particularly interested in the so-called MD$\varphi$DE, an estimator built using a dual representation of $\varphi$-divergences. We present in this paper an iterative proximal-point algorithm for computing this estimator; by construction, the algorithm contains the well-known EM algorithm as a special case. Our work is based on the paper of \citet{Tseng} on the likelihood function. We provide several convergence properties of the sequence generated by the algorithm, and we improve existing results by relaxing the identifiability condition on the proximal term, a condition which fails for most mixture models and is hard to verify for non-mixture ones. Since the convergence analysis uses regularity conditions (continuity and differentiability) of the objective function, which has a supremal form, we find it useful to present some analytical approaches for studying such functions. Convergence of the EM algorithm is discussed again for Gaussian and Weibull mixtures in the spirit of our approach. Simulations are provided to confirm the validity of our work and the robustness of the resulting estimators against outliers.
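As a concrete illustration of the proximal-point view mentioned above (a minimal sketch of the classical EM special case the algorithm contains, not the paper's MD$\varphi$DE procedure itself), the iteration $\theta^{k+1} = \arg\max_\theta \{L(\theta) - D(\theta,\theta^k)\}$ with $D$ the Kullback-Leibler divergence between conditional class distributions and unit step size reduces to EM for a mixture model. The two-component Gaussian mixture below is a hypothetical example; the monotone increase of the log-likelihood asserted at the end is exactly the ascent property of proximal-point methods:

```python
import numpy as np

rng = np.random.default_rng(0)

def normal_pdf(x, mu, sigma):
    """Univariate Gaussian density."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)

def em_step(x, w, mu, sigma):
    """One EM iteration for a two-component Gaussian mixture.

    Equivalently, one proximal-point step with the KL proximal term
    between the conditional class distributions and unit step size.
    """
    # E-step: posterior responsibilities of component 1
    p1 = w * normal_pdf(x, mu[0], sigma[0])
    p2 = (1 - w) * normal_pdf(x, mu[1], sigma[1])
    r = p1 / (p1 + p2)
    # M-step: responsibility-weighted maximum-likelihood updates
    w_new = r.mean()
    mu_new = np.array([np.sum(r * x) / r.sum(),
                       np.sum((1 - r) * x) / (1 - r).sum()])
    sigma_new = np.array([
        np.sqrt(np.sum(r * (x - mu_new[0]) ** 2) / r.sum()),
        np.sqrt(np.sum((1 - r) * (x - mu_new[1]) ** 2) / (1 - r).sum()),
    ])
    return w_new, mu_new, sigma_new

def log_lik(x, w, mu, sigma):
    """Observed-data log-likelihood of the mixture."""
    return np.sum(np.log(w * normal_pdf(x, mu[0], sigma[0])
                         + (1 - w) * normal_pdf(x, mu[1], sigma[1])))

# Synthetic sample from a two-component mixture (means -2 and 3)
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 1.0, 200)])

w, mu, sigma = 0.5, np.array([-1.0, 1.0]), np.array([1.0, 1.0])
lls = [log_lik(x, w, mu, sigma)]
for _ in range(50):
    w, mu, sigma = em_step(x, w, mu, sigma)
    lls.append(log_lik(x, w, mu, sigma))

# Ascent property: the proximal-point (EM) iteration never decreases L
assert all(b >= a - 1e-9 for a, b in zip(lls, lls[1:]))
```

Replacing the log-likelihood $L$ in this iteration by the dual estimate of the $\varphi$-divergence is, roughly, what distinguishes the algorithm studied in the paper from plain EM.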