
Extropy: Characterizations and dynamic versions

Published online by Cambridge University Press: 02 June 2023

Abdolsaeed Toomaj*
Affiliation: Gonbad Kavous University

Majid Hashempour**
Affiliation: University of Hormozgan

Narayanaswamy Balakrishnan***
Affiliation: McMaster University

*Postal address: Faculty of Basic Sciences and Engineering, Department of Mathematics and Statistics, Gonbad Kavous University, Gonbad Kavous, Iran. Emails: ab.toomaj@gonbad.ac.ir, ab.toomaj@gmail.com
**Postal address: Department of Statistics, Faculty of Basic Sciences, University of Hormozgan, Bandar Abbas, Iran. Email: ma.hashempour@hormozgan.ac.ir
***Postal address: Department of Mathematics and Statistics, McMaster University, Hamilton, ON L8S 4K1, Canada. Email: bala@mcmaster.ca

Abstract

Several information measures have been proposed and studied in the literature. One such measure is extropy, the complementary dual of entropy. Its meaning and the associated aging notions have not yet been studied in great detail. In this paper, we first show that extropy ranks the uniformity of a wide array of absolutely continuous families. We then discuss several theoretical merits of extropy and provide a closed-form expression for it in the case of finite mixture distributions. Finally, we discuss the dynamic versions of extropy, namely the residual extropy and past extropy measures.
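
For background: in the extropy literature, for an absolutely continuous random variable X with density f, distribution function F, and survival function \bar{F} = 1 - F, the extropy is defined as

    J(X) = -\frac{1}{2} \int_{-\infty}^{\infty} f^{2}(x)\,dx,

and the dynamic versions referred to in the abstract are

    J(X; t) = -\frac{1}{2\bar{F}^{2}(t)} \int_{t}^{\infty} f^{2}(x)\,dx   (residual extropy),

    \bar{J}(X; t) = -\frac{1}{2F^{2}(t)} \int_{-\infty}^{t} f^{2}(x)\,dx   (past extropy).

As a minimal numerical sketch of the uniformity-ranking idea (assuming SciPy is available; the helper name extropy is ours, not the authors'): among densities supported on [0, 1], the uniform density attains the maximum extropy of -1/2, so smaller values indicate a departure from uniformity.

    # Sketch: evaluate J(X) = -(1/2) * integral of f(x)^2 dx numerically.
    from scipy import integrate
    from scipy.stats import beta, uniform

    def extropy(pdf, lower, upper):
        """Numerically evaluate J(X) = -(1/2) * int pdf(x)^2 dx."""
        value, _ = integrate.quad(lambda x: pdf(x) ** 2, lower, upper)
        return -0.5 * value

    print(extropy(uniform(0, 1).pdf, 0, 1))  # -0.5: the uniform attains the maximum on [0, 1]
    print(extropy(beta(2, 2).pdf, 0, 1))     # -0.6: Beta(2, 2) is more peaked, hence less uniform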

Type: Original Article
Copyright: © The Author(s), 2023. Published by Cambridge University Press on behalf of Applied Probability Trust
