Minimum-divergence procedures based on the density power divergence and the logarithmic density power divergence are very popular and have been successful in producing inference procedures that combine high model efficiency with strong outlier stability. In real-world applications such procedures are preferred over those that sacrifice substantial efficiency for robustness, or those that are highly efficient but lack robustness. Two prominent examples are the Density Power Divergence (DPD) family of Basu et al. (1998) and the Logarithmic Density Power Divergence (LDPD) family of Jones et al. The usefulness of these two families in statistical inference makes it meaningful to search for other related families of divergences in the same spirit. The DPD family is a member of the class of Bregman divergences, and the LDPD family is obtained by a logarithmic transformation of the different segments of the divergences within the DPD family. Both the DPD and LDPD families reduce to the Kullback-Leibler divergence in the limiting case as the tuning parameter $\alpha \rightarrow 0$. In this paper, we study this relationship in detail and show that such logarithmic transformations are meaningful only in the context of the DPD (or the convex generating function of the DPD) within the general class of Bregman divergences. This indicates the extent to which a search for other useful divergences through such transformations is likely to succeed.
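The limiting behavior described above can be checked numerically. The sketch below (my own illustration, not code from the paper) evaluates the standard DPD formula of Basu et al. (1998) and the logarithmic DPD in its γ-divergence form between two Gaussian densities on a grid, and shows that both tend to the Kullback-Leibler divergence as the tuning parameter α shrinks toward 0; the grid, the choice of densities, and the function names are all assumptions made for the demonstration.

```python
import numpy as np

def gaussian(x, mu, sigma=1.0):
    """Density of N(mu, sigma^2) evaluated on the grid x."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Grid and two densities chosen for illustration only.
x = np.linspace(-10.0, 10.0, 20001)
g = gaussian(x, 0.0)   # plays the role of the true density
f = gaussian(x, 0.5)   # plays the role of the model density

def integrate(y):
    """Trapezoidal rule over the fixed grid x."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def dpd(g, f, alpha):
    """DPD of Basu et al. (1998):
    int f^{1+a} - (1 + 1/a) int f^a g + (1/a) int g^{1+a}."""
    return (integrate(f ** (1 + alpha))
            - (1 + 1 / alpha) * integrate(f ** alpha * g)
            + (1 / alpha) * integrate(g ** (1 + alpha)))

def ldpd(g, f, alpha):
    """Logarithmic DPD, written in its gamma-divergence form:
    log(int g^{1+a})/(a(1+a)) - log(int g f^a)/a + log(int f^{1+a})/(1+a)."""
    return (np.log(integrate(g ** (1 + alpha))) / (alpha * (1 + alpha))
            - np.log(integrate(g * f ** alpha)) / alpha
            + np.log(integrate(f ** (1 + alpha))) / (1 + alpha))

# Kullback-Leibler divergence; for these two unit-variance Gaussians the
# closed form is (0.5)^2 / 2 = 0.125.
kl = integrate(g * np.log(g / f))

for a in (0.5, 0.1, 0.01):
    print(f"alpha={a}: DPD={dpd(g, f, a):.5f}, LDPD={ldpd(g, f, a):.5f}, KL={kl:.5f}")
```

Both columns move toward the KL value 0.125 as α decreases, consistent with the limiting case stated in the abstract.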
