
ANALYSIS AND APPLICATIONS OF THE RESIDUAL VARENTROPY OF RANDOM LIFETIMES

Published online by Cambridge University Press: 18 March 2020

Antonio Di Crescenzo
Affiliation:
Dipartimento di Matematica, Università degli Studi di Salerno, Via Giovanni Paolo II n. 132, I-84084 Fisciano (SA), Italy E-mail: adicrescenzo@unisa.it
Luca Paolillo
Affiliation:
Dipartimento di Matematica, Università degli Studi di Salerno, Via Giovanni Paolo II n. 132, I-84084 Fisciano (SA), Italy

Abstract


In reliability theory and survival analysis, the residual entropy is known as a measure suitable for describing the dynamic information content of stochastic systems conditional on survival. Aiming to analyze the variability of such information content, in this paper we introduce the variance of the information content of residual lifetimes, the "residual varentropy" in short. After a theoretical investigation of some properties of the residual varentropy, we illustrate certain applications related to the proportional hazards model and to the first-passage times of an Ornstein–Uhlenbeck jump-diffusion process.
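
To fix ideas, here is a minimal sketch of the quantities involved, written in generic notation that is not necessarily the paper's own: for an absolutely continuous random lifetime X with density f and survival function F̄, the residual entropy at time t is the differential entropy of the residual lifetime [X − t | X > t], and the residual varentropy is the variance of the corresponding information content:

H(X;t) = -\int_t^{\infty} \frac{f(x)}{\bar F(t)} \log\frac{f(x)}{\bar F(t)}\,\mathrm{d}x,
\qquad
V(X;t) = \int_t^{\infty} \frac{f(x)}{\bar F(t)} \left[\log\frac{f(x)}{\bar F(t)}\right]^{2} \mathrm{d}x - \bigl[H(X;t)\bigr]^{2}.

For instance, for an exponential lifetime with rate \lambda, the lack of memory gives H(X;t) = 1 - \log\lambda and V(X;t) = 1 for all t \ge 0, so the residual varentropy is constant in time.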

Type
Research Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
© The Author(s), 2020. Published by Cambridge University Press.
