Some Results of Extropy-Based Measure of Divergence for Record Statistics

Article Type:
Research/Original Article (accredited journal ranking)
Abstract:
The Shannon entropy measure admits a complementary dual that shares several of its properties. This measure of uncertainty was introduced by Lad et al. (2015) and is known as extropy. Although the two measures are mathematically analogous in some respects, extropy typically has different uses and interpretations than entropy. In view of the importance of the extropy measure and its various generalizations, in the present communication we study a Kullback-Leibler-based "divergence-extropy" measure between the distributions of the nth and mth upper k-record values. Characterization problems for the proposed "divergence-extropy" measure are investigated. Further, some specific lifetime distributions used in life testing, the physical sciences, survival analysis, and reliability engineering are examined through the proposed measure. Finally, we study the proposed "divergence-extropy" measure between the distributions of k-record values and order statistics.
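As a brief reminder of the underlying quantity (a sketch for context, not the authors' exact formulation of the divergence), the extropy of Lad et al. (2015) for an absolutely continuous random variable X with density f is
\[
J(X) \;=\; -\tfrac{1}{2}\int_{-\infty}^{\infty} f^{2}(x)\,dx ,
\]
the complementary dual of the Shannon entropy H(X) = -\int f(x)\log f(x)\,dx. The "divergence-extropy" studied in the paper is a Kullback-Leibler-type analogue of this quantity built from the densities of the nth and mth upper k-record values; its precise form is given in the full text.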
Language:
English
Published:
Journal of Iranian Statistical Society, Volume: 23, Issue: 1, Spring 2024
Pages:
83 to 98
https://www.magiran.com/p2794102