Statistical inference based on divergence measures has a long history. Recently, Maji, Ghosh and Basu (2014) introduced a general family of divergences called the logarithmic super divergence (LSD) family. This family acts as a superfamily for both the logarithmic power divergence (LPD) family (e.g., Rényi, 1961) and the logarithmic density power divergence (LDPD) family introduced by Jones et al. (2001). In this paper we describe the asymptotic properties of the inference procedures resulting from this divergence in discrete models. The theoretical properties are illustrated with real data examples.