
Relationships between cumulative entropy/extropy, Gini mean difference and probability weighted moments

Published online by Cambridge University Press:  18 January 2023

Sudheesh K. Kattumannil
Affiliation:
Indian Statistical Institute, Chennai, Tamil Nadu, India. E-mail: skkattu@isichennai.res.in
E. P. Sreedevi
Affiliation:
Maharaja's College, Eranakulam, Kerala, India
N. Balakrishnan
Affiliation:
McMaster University, Hamilton, ON L8S 4L8, Canada

Abstract

In this work, we establish a connection between the cumulative residual entropy and the Gini mean difference (GMD). Some relationships between the extropy and the GMD, and the truncated GMD and dynamic versions of the cumulative past extropy are also established. We then show that several entropy and extropy measures discussed here can be brought into the framework of probability weighted moments, which would facilitate finding estimators of these measures.

Type
Research Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
Copyright © The Author(s), 2023. Published by Cambridge University Press

1. Introduction

The Gini mean difference (GMD) is a prominent measure that is used extensively in economics. Let $X$ be a non-negative random variable with absolutely continuous distribution function $F$ and finite mean $\mu$, and $X_1$ and $X_2$ be two independent random variables from $F$. Then, the GMD is defined as

$${\rm GMD}=E|X_1-X_2|.$$

For more details on GMD and other measures derived from it, interested readers may refer to Yitzhaki and Schechtman [Reference Yitzhaki and Schechtman33].
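As a simple numerical illustration (a sketch of our own, not part of the development above), the GMD can be estimated by the U-statistic that averages all pairwise absolute differences in a sample; for the standard exponential distribution, $E|X_1-X_2|=1$.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=2000)  # Exp(1): E|X1 - X2| = 1

# U-statistic estimator of the GMD: average of |x_i - x_j| over all pairs i < j
diffs = np.abs(x[:, None] - x[None, :])
gmd_hat = diffs[np.triu_indices_from(diffs, k=1)].mean()
print(gmd_hat)  # close to 1
```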

The right-truncated version of the GMD has also been discussed for examining inequality between the poor and affluent groups, while the left-truncated version has found application in reliability analysis. For more details, see Nair et al. [Reference Nair, Sankaran and Vineshkumar21] and Behdani et al. [Reference Behdani, Mohtashami Borzadaran and Sadeghpour Gildeh5]. Recently, Nair and Vineshkumar [Reference Nair and Vineshkumar22] discussed some relationships between the cumulative past entropy and income gap ratio, Lorenz curve, Gini index, Bonferroni curve and Zenga curve.

In many practical situations, measuring the uncertainty associated with a random variable is quite important, and many measures have been introduced for this purpose. The seminal work on information theory started with the concept of Shannon entropy, or differential entropy, introduced by Shannon [Reference Shannon26]. Since then, different measures of entropy have been discussed, each suitable for some specific situation. Some of the widely used measures of entropy are the cumulative residual entropy [Reference Rao, Chen, Vemuri and Wang24], the cumulative past entropy [Reference Di Crescenzo and Longobardi12], the corresponding weighted measures of Mirali et al. [Reference Mirali, Baratpour and Fakoor19] and Mirali and Baratpour [Reference Mirali and Baratpour18], and a general measure of cumulative residual entropy [Reference Sudheesh, Sreedevi and Balakrishnan29].

One important generalization of Shannon entropy is due to Tsallis [Reference Tsallis31], known as generalized Tsallis entropy of order $\alpha$. Many extensions and modifications have been developed for it as well. For example, Rajesh and Sunoj [Reference Rajesh and Sunoj23] proposed cumulative residual Tsallis entropy of order $\alpha$, while Chakraborty and Pradhan [Reference Chakraborty and Pradhan7] defined weighted cumulative residual Tsallis entropy (WCRTE) of order $\alpha$ and its dynamic version. Calì et al. [Reference Calì, Longobardi and Ahmadi6] introduced cumulative past Tsallis entropy of order $\alpha$. Chakraborty and Pradhan [Reference Chakraborty and Pradhan8] introduced weighted cumulative past Tsallis entropy (WCTE) of order $\alpha$, and also studied its dynamic version.

An alternative measure of uncertainty, called extropy, was introduced by Lad et al. [Reference Lad, Sanfilippo and Agro16] as a complementary dual of entropy. Jahanshahi et al. [Reference Jahanshahi, Zarei and Khammar14] and Tahmasebi and Toomaj [Reference Tahmasebi and Toomaj30] studied cumulative residual extropy and negative cumulative extropy, while Balakrishnan et al. [Reference Balakrishnan, Buono and Longobardi4] and Chakraborty and Pradhan [Reference Chakraborty and Pradhan9] discussed different weighted versions of extropy. Sudheesh and Sreedevi [Reference Sudheesh and Sreedevi28] established some relationships between different extropy measures and reliability concepts, and Sudheesh et al. [Reference Sudheesh, Sreedevi and Balakrishnan29] established some relationships between entropy and extropy measures.

The probability weighted moments (PWMs) generalize the concept of moments of a probability distribution, and they have been used effectively for estimating the parameters of different distributions. PWMs were introduced by Greenwood et al. [Reference Greenwood, Landwehr, Matalas and Wallis13]. Here, we show that many of the entropy and extropy measures mentioned above can be expressed in terms of PWMs.

The rest of this article is organized as follows. In Section 2, using a generalized cumulative residual entropy, we derive connections between some entropy measures and GMD. In Section 3, we establish some relationships between GMD and extropy measures. In Section 4, we show that several extropy measures can be brought into the framework of PWM. Finally, we make some concluding remarks in Section 5.

2. Connection between entropy and GMD

Let $X$ be a non-negative random variable with absolutely continuous distribution function $F$ and let $\bar {F}(x)=P(X \gt x)$ denote the survival function. Let us further assume that the mean $\mu =E(X) \lt \infty$.

2.1. Cumulative residual entropy and GMD

We now consider the generalized cumulative residual entropy measure introduced recently by Sudheesh et al. [Reference Sudheesh, Sreedevi and Balakrishnan29].

Definition 1. Let $X$ be a non-negative random variable with absolutely continuous distribution function $F$. Furthermore, let $\phi (\cdot)$ be a function of $X$ and $w(\cdot )$ be a weight function. Then, the generalized cumulative residual entropy is defined as

(1) \begin{equation} \mathcal{GE}(X)=\int_{0}^{\infty} w(u) E[\phi(X)-\phi(u)\,|\, X \gt u]\,dF(u). \end{equation}

Sudheesh et al. [Reference Sudheesh, Sreedevi and Balakrishnan29] showed that several measures of entropy available in the literature can be deduced from (1) with different choices of $w(\cdot )$ and $\phi (\cdot )$. In particular, for the choice of $w(u) = 1$ and $\phi (x) = x$, (1) reduces to the cumulative residual entropy [Reference Asadi and Zohrevand3].

We now show that GMD is a special case of $\mathcal {GE}(X)$.

Theorem 1. For the choice of $w(u)=\bar {F}(u)$ and $\phi (x)=2x$, $\mathcal {GE}(X)={\rm GMD}$.

Proof. Let $X_1$ and $X_2$ be independent and identically distributed random variables having the distribution function $F$. Since $|X_1-X_2|=\max (X_1,X_2)-\min (X_1,X_2)$, the GMD can alternatively be expressed as

$${\rm GMD}=E(\max(X_1,X_2)-\min(X_1,X_2)).$$

Now, for the choice of $w(u)=\bar {F}(u)$ and $\phi (x)=2x$, the expression in (1) becomes

\begin{align*} \mathcal{GE}(X)& =2\int_{0}^{\infty}\bar{F}(u)E(X-u\,|\,X \gt u)\,dF(u)\\ & =2\int_{0}^{\infty}\bar{F}(u)\left(\frac{1}{\bar{F}(u)}\int_{u}^{\infty}(y-u)\,dF(y)\right)dF(u)\\ & =2\int_{0}^{\infty}\int_{u}^{\infty}y \,dF(y)\,dF(u)-2\int_{0}^{\infty}u\int_{u}^{\infty}dF(y)\,dF(u) \\ & =\int_{0}^{\infty}y\cdot2F(y)\,dF(y)-\int_{0}^{\infty}y\cdot2\bar{F}(y)\,dF(y)\\ & =E(\max(X_1,X_2))-E(\min(X_1,X_2)), \end{align*}

which is precisely the GMD.

Remark 1. In general, for the choices of $w(u)=\bar {F}(u)$ and $\phi (x)=2x^v$ for some positive integer $v$, we have

$$\mathcal{GE}(X)=E(\max(X_1,X_2)^v-\min(X_1,X_2)^v).$$

Also, for $w(u)=\bar {F}^{k-1}(u)$, $k=2,3,\ldots$, and $\phi (x)=kx^v$, $v =1,2,\ldots$, we get

$$\mathcal{GE}(X)=E(\max(X_1,\ldots,X_{k})^v-\min(X_1,\ldots,X_{k})^v).$$

In the particular case of $v=1$, we have

\begin{align*} \mathcal{GE}(X)& =E(\max(X_1,\ldots,X_{k})-\min(X_1,\ldots,X_{k})) \\ & =E(\max(X_1,\ldots,X_{k}))-E(X_1)+E(X_1)-E(\min(X_1,\ldots,X_{k})) \\ & =EG_k({-}X)-EG_k(X), \end{align*}

where $EG_k(X)$ and $EG_k(-X)$ are the risk-premium and the gain-premium of $X$ of order $k$, respectively. These two quantities have found key applications in bid and ask prices in finance; see Agouram and Lakhnati [Reference Agouram and Lakhnati2]. The above expression can alternatively be expressed as

\begin{align*} \mathcal{GE}(X)& = E(\max(X_1,\ldots,X_{k}))-E(X_1)+E(X_1)-E(\min(X_1,\ldots,X_{k})) \\ & =k\,{\rm Cov}(X,F^{k-1}(X))-k\,{\rm Cov}(X,\bar F^{k-1}(X)). \end{align*}

In particular, when $k=2$, we obtain

$$\mathcal{GE}(X)=4{\rm Cov}(X,F(X)),$$

which is yet another representation of GMD [Reference Lerman and Yitzhaki17].
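This covariance form is computationally convenient: plugging in the empirical distribution function via ranks gives a simple estimator of the GMD. A minimal sketch (ours, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(size=5000)  # Exp(1): GMD = 1

# Empirical CDF evaluated at the data via ranks: F_n(x_i) = rank(x_i)/n
F_hat = (np.argsort(np.argsort(x)) + 1) / len(x)

# GMD = 4 Cov(X, F(X)); np.cov returns the 2 x 2 sample covariance matrix
print(4 * np.cov(x, F_hat)[0, 1])  # approximately 1
```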

2.2. Generalized cumulative past entropy and GMD

We now consider the generalized cumulative past entropy and discuss its relationship with GMD.

Definition 2. [Reference Sudheesh, Sreedevi and Balakrishnan29]

Let $X$ be a non-negative random variable with absolutely continuous distribution function $F$. Furthermore, let $\phi (\cdot )$ be a function of $X$ and $w(\cdot )$ be a weight function. Then, the generalized cumulative past entropy is defined as

(2) \begin{equation} \mathcal{GCE}(X)=\int_{0}^{\infty}w(u)E[\phi(u)-\phi(X)\,|\,X\le u]\,dF(u). \end{equation}

In particular, for the choice of $w(u)=1$ and $\phi (x)=x$, (2) reduces to the cumulative past entropy $\mathcal {CE} (X)$ [Reference Di Crescenzo and Longobardi12].

Theorem 2. For the choice of $w(u)=F(u)$ and $\phi (x)=2x$, $\mathcal {GCE}(X)$ reduces to the GMD.

Proof. Consider

\begin{align*} \mathcal{GCE}(X)& =2\int_{0}^{\infty}{F}(u)E(u-X\,|\,X \le u)\,dF(u)\\ & =2\int_{0}^{\infty}{F}(u)\left(\frac{1}{{F}(u)}\int_{0}^{u}(u-y)\,dF(y)\right)dF(u)\\ & =2\int_{0}^{\infty}\int_{0}^{u}u\,dF(y)\,dF(u)-2\int_{0}^{\infty}\int_{0}^{u}y\,dF(y)\,dF(u) \\ & =E(\max(X_1,X_2))-\int_{0}^{\infty}y\cdot 2\bar{F}(y)\,dF(y)\\ & =E(\max(X_1,X_2)-\min(X_1,X_2)), \end{align*}

as required.

Furthermore, for the choice of $w(u)=F^k(u)$ and $\phi (x)=(k+1)x$, where $k$ is a positive integer, we have

$$\mathcal{GCE}(X)=E(\max(X_1,\ldots,X_{k+1})-\min(X_1,\ldots,X_{k+1})).$$

We now show that GMD is a special case of the cumulative residual Tsallis entropy of order $\alpha$ defined by Rajesh and Sunoj [Reference Rajesh and Sunoj23] as

$${\rm CRT}_{\alpha}(X)=\frac{1}{\alpha-1}\int_{0}^{\infty} (\bar{F}(x)-\bar F^{\alpha}(x))\,dx, \quad \alpha \gt 0,\ \alpha\ne 1.$$

For $\alpha =2$, the above expression becomes

$${\rm CRT}_{2}(X)=\int_{0}^{\infty} (\bar{F}(x)-\bar F^{2}(x))\,dx.$$

For a non-negative random variable $X$, we have $\mu =E(X)=\int _{0}^{\infty }\bar F(x)\,dx$. Noting that $\bar F^{2}(x)$ is the survival function of $\min (X_1,X_2)$, we simply obtain

$${\rm CRT}_{2}(X)=E(X_1)-E(\min(X_1,X_2)).$$

Now, because $|X_1-X_2|=X_1+X_2-2\min (X_1,X_2)$, we obtain ${\rm GMD}=2 {\rm CRT}_{2}(X)$.
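For instance, if $X\sim {\rm Exp}(\lambda)$, then $\bar F(x)=e^{-\lambda x}$ and

$${\rm CRT}_{2}(X)=\int_{0}^{\infty} (e^{-\lambda x}-e^{-2\lambda x})\,dx=\frac{1}{\lambda}-\frac{1}{2\lambda}=\frac{1}{2\lambda},$$

so that ${\rm GMD}=2\,{\rm CRT}_{2}(X)=1/\lambda$, in agreement with a direct computation of $E|X_1-X_2|$ for the exponential distribution.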

3. Connection between extropy and GMD

Tahmasebi and Toomaj [Reference Tahmasebi and Toomaj30] discussed a relationship between the GMD and the cumulative past extropy. Using the generalized entropy measures introduced by Sudheesh et al. [Reference Sudheesh, Sreedevi and Balakrishnan29], we now establish some relationships between different dynamic extropy measures and the right- and left-truncated GMD.

First, we express the GMD in terms of cumulative residual extropy and cumulative past extropy. For a non-negative random variable $X$, Jahanshahi et al. [Reference Jahanshahi, Zarei and Khammar14] defined the cumulative residual extropy as

(3) \begin{equation} \mathcal{CRJ}(X)={-}\frac{1}{2}\int_{0}^{\infty}\bar{F}^2(x)\,dx, \end{equation}

while the cumulative past extropy, introduced by Tahmasebi and Toomaj [Reference Tahmasebi and Toomaj30], is defined as

$$\mathcal{CJ}(X)={-}\frac{1}{2}\int_{0}^{\infty}(1-{F}^2(x))\,dx.$$

Various applications of $\mathcal {CRJ}(X)$ and $\mathcal {CJ}(X)$ have been discussed by Jahanshahi et al. [Reference Jahanshahi, Zarei and Khammar14] and Tahmasebi and Toomaj [Reference Tahmasebi and Toomaj30].

Using the survival functions of $\max (X_1,X_2)$ and $\min (X_1,X_2)$, we easily find that

\begin{align*} 2\mathcal{CRJ}(X)-2\mathcal{CJ}(X)& =\int_{0}^{\infty}(1-{F}^2(x))\,dx-\int_{0}^{\infty}\bar{F}^2(x)\,dx\\ & =E(\max(X_1,X_2)-\min(X_1,X_2))\\ & ={\rm GMD}. \end{align*}

Sudheesh and Sreedevi [Reference Sudheesh and Sreedevi28] discussed the non-parametric estimation of $\mathcal {CRJ}(X)$ and $\mathcal {CJ}(X)$ for right censored data, and so using the above relationship, we can readily obtain an estimator of the GMD based on right censored data.
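For complete (uncensored) data, this relationship already yields a simple plug-in estimator: integrate the empirical survival and distribution functions in place of $\bar F$ and $F$ in the definitions of $\mathcal{CRJ}(X)$ and $\mathcal{CJ}(X)$. The following sketch is our own and is not the censored-data estimator of Sudheesh and Sreedevi [Reference Sudheesh and Sreedevi28]:

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.sort(rng.exponential(size=5000))  # Exp(1): GMD = 1
n = len(x)

# Widths of the steps [x_(i-1), x_(i)) of the empirical process, with x_(0) := 0
widths = np.diff(np.concatenate(([0.0], x)))
Sbar = 1.0 - np.arange(n) / n  # empirical survival function on each step
F = np.arange(n) / n           # empirical CDF on each step

crj = -0.5 * np.sum(Sbar**2 * widths)      # CRJ(X) = -(1/2) int Fbar^2 dx
cj = -0.5 * np.sum((1.0 - F**2) * widths)  # CJ(X) = -(1/2) int (1 - F^2) dx
print(2 * crj - 2 * cj)                    # GMD, approximately 1
```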

Sathar and Nair [Reference Sathar and Nair25] defined dynamic survival extropy as

(4) \begin{equation} J_t(X)={-}\frac{1}{2\bar{F}^2(t)}\int_{t}^{\infty}\bar{F}^2(x)\,dx, \end{equation}

while Kundu [Reference Kundu15] introduced dynamic cumulative extropy as

$$H_t(X)={-}\frac{1}{2{F}^2(t)}\int_{0}^{t}{F}^2(x)\,dx.$$

Sudheesh and Sreedevi [Reference Sudheesh and Sreedevi28] proposed simple alternative expressions for different extropy measures, and then used them to establish some relationships between different dynamic and weighted extropy measures and reliability concepts.

The left-truncated GMD is given by

$${\rm GMD}_L=\frac{1}{\bar{F}^2(t)}\int_{t}^{\infty} (\bar{F}(t)-2\bar{F}(x))x\,dF(x)$$

while the right-truncated GMD is defined as

$${\rm GMD}_R=\frac{1}{{F}^2(t)}\int_{0}^{t} (2{F}(x)-{F}(t))x\,dF(x).$$

Theorem 3. Let $m(t)=E(X-t\,|\,X \gt t)$ and $r(t)=E(t-X\,|\,X\le t)$ denote the mean residual life and the mean past life, respectively. Then, the truncated GMDs are related to the dynamic survival extropy and the dynamic cumulative extropy as follows:

  (i)

    (5) \begin{equation} {\rm GMD}_L=m(t)+2J_t(X), \end{equation}
  (ii)

    (6) \begin{equation} {\rm GMD}_R=r(t)+2H_t(X). \end{equation}

Proof. Consider the left-truncated GMD given by

(7) \begin{align} {\rm GMD}_L& =\frac{1}{\bar{F}^2(t)}\int_{t}^{\infty} (\bar{F}(t)-2\bar{F}(x))x\,dF(x)\nonumber\\ & =\frac{1}{\bar{F}^2(t)}\int_{t}^{\infty} (\bar{F}(t)-2\bar{F}(x))(x-t)\,dF(x)\nonumber\\ & =\frac{1}{\bar{F}(t)}\int_{t}^{\infty} (x-t)\,dF(x)-\frac{1}{\bar{F}^2(t)}\int_{t}^{\infty} (x-t)2\bar{F}(x)\,dF(x)\nonumber\\ & =m(t)-E(\min(X_1,X_2)-t\,|\,\min(X_1,X_2) \gt t), \end{align}

where the second equality holds because $\int_{t}^{\infty}(\bar{F}(t)-2\bar{F}(x))\,dF(x)=0$, so that $x$ may be replaced by $x-t$.

Sudheesh and Sreedevi [Reference Sudheesh and Sreedevi28] expressed $J_t(X)$ in (4) as

(8) \begin{equation} J_t(X)={-}\tfrac{1}{2}E(\min(X_1,X_2)-t\,|\min(X_1,X_2) \gt t). \end{equation}

Substituting (8) in (7), we obtain the relationship in (5).

Next, consider the right-truncated GMD given by

(9) \begin{equation} {\rm GMD}_R=\frac{1}{{F}^2(t)}\int_{0}^{t} (2{F}(x)-{F}(t))x\,dF(x). \end{equation}

Observing that $\int_{0}^{t}(2F(x)-F(t))\,dF(x)=0$, so that $x$ may again be replaced by $x-t$, we obtain from (9) that

(10) \begin{equation} {\rm GMD}_R=r(t)-E(t-\max(X_1,X_2)\,|\,\max(X_1,X_2)\le t). \end{equation}

Sudheesh and Sreedevi [Reference Sudheesh and Sreedevi28] expressed $H_t(X)$ as

(11) \begin{equation} H_t(X)={-}\tfrac{1}{2}E(t-\max(X_1,X_2)\,|\max(X_1,X_2)\le t). \end{equation}

Substituting (11) in (10), we obtain the relationship in (6).
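A Monte Carlo sanity check of (5) (a sketch of our own, under the assumption of standard exponential data, for which the memoryless property gives ${\rm GMD}_L=1/2$ at every $t$):

```python
import numpy as np

rng = np.random.default_rng(3)
n, t = 400_000, 0.7
x1 = rng.exponential(size=n)
x2 = rng.exponential(size=n)

# m(t) = E(X - t | X > t), the mean residual life
m_t = x1[x1 > t].mean() - t

# J_t(X) = -(1/2) E(min(X1, X2) - t | min(X1, X2) > t), as in (8)
mn = np.minimum(x1, x2)
J_t = -0.5 * (mn[mn > t].mean() - t)

print(m_t + 2 * J_t)  # GMD_L, approximately 0.5 for Exp(1) at any t
```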

Theorem 4. Let $X_1$ and $X_2$ be two independent and identically distributed random variables having distribution function $F$, and let $Z=\min (X_1,X_2)$ be the lifetime of a series system comprising two components. Then, the generalized residual entropy associated with $Z$ is the weighted average of the difference between the left-truncated GMD and the mean residual life.

Proof. Using (1), we define the generalized residual entropy associated with $Z$ as

(12) \begin{equation} \mathcal{GE}(Z)=\int_{0}^{\infty} 2w(u) E[\phi(Z)-\phi(u)\,|\, Z \gt u]\bar{F}(u)\,dF(u), \end{equation}

where $\phi (\cdot )$ is a function of $Z$ and $w(\cdot )$ is a weight function. Now, for the choice of $w(u)=-\frac {1}{2}$ and $\phi (z)=z$, from (12), we obtain

$$\mathcal{GE}(Z)=\int_{0}^{\infty} ({\rm GMD}_L-m(u))\bar{F}(u)\,dF(u),$$ where ${\rm GMD}_L$ is evaluated at the truncation point $u$, which proves the claim.

Theorem 5. Let $X_1$ and $X_2$ be two independent and identically distributed random variables having distribution function $F$, and let $Z=\max (X_1,X_2)$ be the lifetime of a parallel system comprising two components. Then, the generalized cumulative past entropy associated with $Z$ is the weighted average of the difference between the mean past life and the right-truncated GMD.

Proof. The generalized cumulative past entropy associated with $Z$ is defined as

(13) \begin{equation} \mathcal{GCE}(Z)=\int_{0}^{\infty}2w(u)E[\phi(u)-\phi(Z)\,|\,Z\le u]F(u)\,dF(u). \end{equation}

Again, for the choice of $w(u)=\frac {1}{2}$ and $\phi (z)=z$, from (13), (11) and (6), we obtain

$$\mathcal{GCE}(Z)=\int_{0}^{\infty}(r(u)-{\rm GMD}_R)F(u)\,dF(u),$$

where $r$ and ${\rm GMD}_R$ are evaluated at $u$. So, the generalized cumulative past entropy associated with $Z$ is the weighted average of the difference between the mean past life and the right-truncated GMD.

4. Connections with PWMs

We now show that many of the measures discussed in the preceding sections can be represented in terms of PWM.

The PWM of a random variable $X$ with distribution function $F$ is defined as [Reference Greenwood, Landwehr, Matalas and Wallis13]

(14) \begin{equation} \mathcal{M}_{p,r,s}=\mathbb{E}\{ X^pF^r(X)(1-F(X))^s\}, \end{equation}

where $p$, $r$ and $s$ are any real numbers for which the expectation involved exists.
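Sample versions of (14) are straightforward to compute. One common plug-in device from the PWM literature replaces $F(X_{(i)})$ by a plotting position such as $(i-0.5)/n$; the function below is a minimal sketch of this idea (the name pwm_hat and the implementation are ours):

```python
import numpy as np

def pwm_hat(x, p, r, s):
    """Plug-in estimate of M_{p,r,s} = E[X^p F(X)^r (1 - F(X))^s],
    with F(x_(i)) replaced by the plotting position (i - 0.5)/n."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    u = (np.arange(1, n + 1) - 0.5) / n
    return float(np.mean(x**p * u**r * (1.0 - u)**s))

# Example, using the GMD representation derived below:
rng = np.random.default_rng(4)
x = rng.exponential(size=10_000)  # Exp(1): GMD = 1
print(2 * pwm_hat(x, 1, 1, 0) - 2 * pwm_hat(x, 1, 0, 1))  # approximately 1
```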

First, we consider the GMD and related measures. From the proof of Theorem 1, we can express GMD as

$${\rm GMD}=\int_{0}^{\infty}2yF(y)\,dF(y)-\int_{0}^{\infty}2y\bar{F}(y)\,dF(y),$$

and so we have

$${\rm GMD}=2E(XF(X))-2E(X\bar{F}(X)),$$

which, when compared with (14), yields

$${\rm GMD}=2\mathcal{M}_{1,1,0}-2\mathcal{M}_{1,0,1}.$$

Several income inequality measures are derived from the GMD by choosing different weights in the expectation; one among them is the S-Gini family of indices [Reference Yitzhaki and Schechtman33]. The absolute S-Gini index is defined as

$$S_v={-}{\rm Cov}(X,\bar{F}^{v-1}(X)),\quad v \gt 0 \ \text{and}\ v\ne 1,$$

which can be rewritten as

$$S_v={-}E(X\bar{F}^{v-1}(X))+\frac{1}{v}E(X).$$

Hence, the absolute S-Gini index can be represented as

$$S_v=\frac{1}{v}\mathcal{M}_{1,0,0}-\mathcal{M}_{1,0,v-1}.$$

For $v=2$, $S_v$ reduces to one-fourth of the GMD, and hence the GMD can also be represented in terms of PWMs as

$${\rm GMD}=2\mathcal{M}_{1,0,0}-4\mathcal{M}_{1,0,1},$$

which is consistent with the representation given above, since $\mathcal{M}_{1,1,0}=\mathcal{M}_{1,0,0}-\mathcal{M}_{1,0,1}$.
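As a check, for $X\sim U(0,1)$ we have $\mathcal{M}_{1,0,0}=\frac{1}{2}$ and $\mathcal{M}_{1,0,1}=E(X(1-X))=\frac{1}{6}$, so that $2\mathcal{M}_{1,0,0}-4\mathcal{M}_{1,0,1}=1-\frac{2}{3}=\frac{1}{3}$, which is indeed the GMD of the standard uniform distribution.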

Next, we consider different extropy measures. The cumulative residual extropy and the weighted cumulative residual extropy of $X$ can be expressed as [Reference Sudheesh and Sreedevi28]

\begin{align*} \mathcal{CRJ}(X)& ={-}\tfrac{1}{2}E(\min(X_1,X_2)),\\ \mathcal{CRJW}(X)& ={-}\tfrac{1}{4}E(\min(X_1,X_2)^2). \end{align*}

Observing that $2\bar{F}(x)f(x)$ is the density function of $\min(X_1,X_2)$, the extropy measures given above can be rewritten as

\begin{align*} \mathcal{CRJ}(X)& ={-}E(X\bar{F}(X)),\\ \mathcal{CRJW}(X)& ={-}\tfrac{1}{2}E(X^2\bar{F}(X)), \end{align*}

so that

$$\mathcal{CRJ}(X)={-}\mathcal{M}_{1,0,1}\quad \text{and}\quad \mathcal{CRJW}(X)={-}\tfrac{1}{2}\mathcal{M}_{2,0,1}.$$

The cumulative past extropy and the weighted cumulative past extropy of $X$ can be expressed as [Reference Sudheesh and Sreedevi28]

\begin{align*} \mathcal{CJ}(X)& ={-}\tfrac{1}{2}E(\max(X_1,X_2)),\\ \mathcal{WCJ}(X)& ={-}\tfrac{1}{4}E(\max(X_1,X_2)^2). \end{align*}

Here again, we observe that $2{F}(x)f(x)$ is the density function of $\max(X_1,X_2)$, and so

\begin{align*} \mathcal{CJ}(X)& ={-}E(X{F}(X)),\\ \mathcal{WCJ}(X)& ={-}\tfrac{1}{2}E(X^2{F}(X)), \end{align*}

from which we readily obtain

$$\mathcal{CJ}(X)={-}\mathcal{M}_{1,1,0}\quad \text{and}\quad \mathcal{WCJ}(X)={-}\tfrac{1}{2}\mathcal{M}_{2,1,0}.$$

Next, we show that the cumulative residual Tsallis entropy of order $\alpha$ can be written in terms of PWM. Let us assume that $\alpha$ is a positive integer with $\alpha \ne 1$. Recall that

(15) \begin{equation} {\rm CRT}_{\alpha}(X)=\frac{1}{\alpha-1}\int_{0}^{\infty} (\bar{F}(x)-\bar F^{\alpha}(x))\,dx. \end{equation}

We can show that

$$\int_{0}^{\infty}x \alpha \bar F^{\alpha-1}(x)\,dF(x) =\int_{0}^{\infty}\bar F^{\alpha}(x)\,dx,$$

and so we can write from (15) that

$${\rm CRT}_{\alpha}(X)=\frac{1}{\alpha-1}E(X)-\frac{\alpha}{\alpha-1}E(X\bar F^{\alpha-1}(X)),$$

which yields

$${\rm CRT}_{\alpha}(X)=\frac{1}{\alpha-1} \mathcal{M}_{1,0,0}-\frac{\alpha}{\alpha-1}\mathcal{M}_{1,0,(\alpha-1)}.$$
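As an example, for $X\sim U(0,1)$ and $\alpha=3$, direct integration gives ${\rm CRT}_{3}(X)=\frac{1}{2}\int_{0}^{1}((1-x)-(1-x)^{3})\,dx=\frac{1}{8}$, while the PWM representation gives $\frac{1}{2}\mathcal{M}_{1,0,0}-\frac{3}{2}\mathcal{M}_{1,0,2}=\frac{1}{2}\cdot\frac{1}{2}-\frac{3}{2}\cdot\frac{1}{12}=\frac{1}{8}$ as well, since $\mathcal{M}_{1,0,2}=E(X(1-X)^2)=\frac{1}{12}$.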

For a non-negative continuous random variable $X$, the weighted cumulative residual Tsallis entropy (WCRTE) of order $\alpha$ is defined as [Reference Chakraborty and Pradhan7]

$${\rm WCRT}_{\alpha}(X)=\frac{1}{\alpha-1}\int_{0}^{\infty}x(\bar{F}(x)-\bar{F}^{\alpha}(x))\,dx.$$

After some simple algebra, we can show that

$$\int_{0}^{\infty}x^2 \alpha \bar F^{\alpha-1}(x)\,dF(x) =\int_{0}^{\infty}2x\bar F^{\alpha}(x)\,dx,$$

so that

$${\rm WCRT}_{\alpha}(X)=\frac{1}{2(\alpha-1)}E(X^2)-\frac{\alpha}{2(\alpha-1)}E(X^2\bar{F}^{\alpha-1}(X))$$

from which we readily obtain

$${\rm WCRT}_{\alpha}(X)=\frac{1}{2(\alpha-1)} \mathcal{M}_{2,0,0}-\frac{\alpha}{2(\alpha-1)}\mathcal{M}_{2,0,(\alpha-1)}.$$

Calì et al. [Reference Calì, Longobardi and Ahmadi6] introduced the cumulative past Tsallis entropy of $X$ as

$$CT_{\alpha}(X)=\frac{1}{\alpha-1}\int_{0}^{\infty} ({F}(x)- F^{\alpha}(x))\,dx,\quad \alpha \gt 0, \ \alpha\ne1,$$

while Chakraborty and Pradhan [Reference Chakraborty and Pradhan8] defined the weighted cumulative past Tsallis entropy (WCTE) of order $\alpha$ as

$$WCT_{\alpha}(X)=\frac{1}{\alpha-1}\int_{0}^{\infty}x(F(x)-F^{\alpha}(x))\,dx,\quad \alpha \gt 0,\ \alpha\ne1.$$

As in the case of CRTE and WCRTE, we obtain

\begin{align*} CT_{\alpha}(X)& =\frac{\alpha}{(\alpha-1)}\mathcal{M}_{1,(\alpha-1),0}-\frac{1}{(\alpha-1)} \mathcal{M}_{1,0,0},\\ {\rm WCT}_{\alpha}(X)& =\frac{\alpha}{2(\alpha-1)}\mathcal{M}_{2,(\alpha-1),0}-\frac{1}{2(\alpha-1)} \mathcal{M}_{2,0,0}. \end{align*}

Aaberge [Reference Aaberge1] introduced a family of inequality measures based on the moments of the Lorenz curve as

$$D_k(F)=\frac{1}{k\mu}\int_{0}^{\infty}F(x)(1-F^k(x))\,dx,\quad k=1,2,\ldots.$$

We note that $D_k(F)$ can be expressed in terms of the cumulative past Tsallis entropy as $CT_{k+1}(X)/\mu$. Hence, we can express

$$D_k(F)=\frac{k+1}{k\mu}\mathcal{M}_{1,k,0}-\frac{1}{k\mu} \mathcal{M}_{1,0,0}.$$
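In particular, for $X\sim U(0,1)$ and $k=1$, we have $\mu=\frac{1}{2}$ and $\mathcal{M}_{1,1,0}=E(XF(X))=\frac{1}{3}$, so that $D_1(F)=\frac{2}{\mu}\mathcal{M}_{1,1,0}-\frac{1}{\mu}\mathcal{M}_{1,0,0}=\frac{4}{3}-1=\frac{1}{3}$, the well-known Gini index of the standard uniform distribution.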

Sharma and Taneja [Reference Sharma and Taneja27] and Mittal [Reference Mittal20] independently introduced the STM entropy of $X$ as

(16) \begin{equation} S_{\alpha,\beta}=\frac{1}{\beta-\alpha}\int_{0}^{\infty}(f^{\alpha}(x)-f^{\beta}(x))\,dx. \end{equation}

Some entropy measures discussed in the literature can be derived from (16) by making different choices for $\alpha$ and $\beta$.

Sudheesh et al. [Reference Sudheesh, Sreedevi and Balakrishnan29] introduced a cumulative residual STM entropy of $X$ as

$$SR_{\alpha,\beta}=\frac{1}{\beta-\alpha}\int_{0}^{\infty}(\bar F^{\alpha}(x)-\bar F^{\beta}(x))\,dx,$$

and the cumulative past STM entropy of $X$ as

$$SP_{\alpha,\beta}=\frac{1}{\beta-\alpha}\int_{0}^{\infty}(F^{\alpha}(x)-F^{\beta}(x))\,dx.$$

They also introduced weighted versions of these measures as

$$SRW_{\alpha,\beta}=\frac{1}{\beta-\alpha}\int_{0}^{\infty}x(\bar F^{\alpha}(x)-\bar F^{\beta}(x))\,dx$$

and

$$SPW_{\alpha,\beta}=\frac{1}{\beta-\alpha}\int_{0}^{\infty}x(F^{\alpha}(x)-F^{\beta}(x))\,dx.$$

Sudheesh et al. [Reference Sudheesh, Sreedevi and Balakrishnan29] showed that $SR_{\alpha,\beta }$ and $SRW_{\alpha,\beta }$ are special cases of (1), while $SP_{\alpha,\beta }$ and $SPW_{\alpha,\beta }$ are special cases of (2).

We now express these measures in terms of PWMs, omitting the proofs for the sake of conciseness:

\begin{align*} SR_{\alpha,\beta}& =\frac{1}{\beta-\alpha}(\alpha\mathcal{M}_{1,0,(\alpha-1)}-\beta\mathcal{M}_{1,0,(\beta-1)}),\\ SP_{\alpha,\beta}& =\frac{1}{\beta-\alpha}(\beta\mathcal{M}_{1,(\beta-1),0}-\alpha\mathcal{M}_{1,(\alpha-1),0}),\\ SRW_{\alpha,\beta}& =\frac{1}{2(\beta-\alpha)}(\alpha\mathcal{M}_{2,0,(\alpha-1)}-\beta\mathcal{M}_{2,0,(\beta-1)}),\\ SPW_{\alpha,\beta}& =\frac{1}{2(\beta-\alpha)}(\beta\mathcal{M}_{2,(\beta-1),0}-\alpha\mathcal{M}_{2,(\alpha-1),0}). \end{align*}
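As a quick check of the first of these representations, take $X\sim U(0,1)$, $\alpha=1$ and $\beta=2$: then $SR_{1,2}=\int_{0}^{1}((1-x)-(1-x)^{2})\,dx=\frac{1}{6}$, and $\alpha\mathcal{M}_{1,0,0}-\beta\mathcal{M}_{1,0,1}=\frac{1}{2}-2\cdot\frac{1}{6}=\frac{1}{6}$ as well.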

These representations in terms of PWM would enable us to develop inferential methods for all these measures using the known results on PWM.

5. Concluding remarks

The GMD is a well-known measure of dispersion that is used extensively in economics. We have established here some relationships between the information measures and the GMD. We have shown that the GMD is a special case of the generalized cumulative residual entropy proposed recently by Sudheesh et al. [Reference Sudheesh, Sreedevi and Balakrishnan29]. We have also established some relationships between the GMD and extropy measures. We have further presented some relationships between the dynamic versions of the cumulative residual/past extropy and the truncated GMD.

The PWMs generalize the concept of moments of a probability distribution. The estimates derived using PWMs are often superior to the standard moment-based estimates. For discussion on different methods of estimation of the PWM, we refer to David and Nagaraja [Reference David and Nagaraja10], Vexler et al. [Reference Vexler, Zou and Hutson32] and Deepesh et al. [Reference Deepesh, Sudheesh and Sreelakshmi11] and the references therein. We have shown here that many of the information measures can be expressed in terms of PWM, which would facilitate the development of inferential methods for these measures using the well-known results on PWM.

Acknowledgments

Our sincere thanks go to the Editor-in-Chief and the anonymous reviewers for their useful comments and suggestions on an earlier version of this manuscript, which led to this improved version.

References

Aaberge, R. (2007). Gini's nuclear family. The Journal of Economic Inequality 5(3): 305–322.
Agouram, J. & Lakhnati, G. (2016). Mean-Gini and mean-extended Gini portfolio selection: An empirical analysis. Risk Governance and Control: Financial Markets & Institutions 6(3-1): 59–66.
Asadi, M. & Zohrevand, Y. (2007). On the dynamic cumulative residual entropy. Journal of Statistical Planning and Inference 137(6): 1931–1941.
Balakrishnan, N., Buono, F., & Longobardi, M. (2022). On weighted extropies. Communications in Statistics - Theory and Methods 51(18): 6250–6267.
Behdani, Z., Mohtashami Borzadaran, G.R., & Sadeghpour Gildeh, B. (2020). Some properties of double truncated distributions and their application in view of income inequality. Computational Statistics 35(1): 359–378.
Calì, C., Longobardi, M., & Ahmadi, J. (2017). Some properties of cumulative Tsallis entropy. Physica A: Statistical Mechanics and its Applications 486: 1012–1021.
Chakraborty, S. & Pradhan, B. (2021a). On weighted cumulative Tsallis residual and past entropy measures. Communications in Statistics - Simulation and Computation 1–15. doi:10.1080/03610918.2021.1897623
Chakraborty, S. & Pradhan, B. (2021b). Generalized weighted survival and failure entropies and their dynamic versions. Communications in Statistics - Theory and Methods 1–21. doi:10.1080/03610926.2021.1921803
Chakraborty, S. & Pradhan, B. (2022). Some properties of weighted survival extropy and its extended measures. Communications in Statistics - Theory and Methods 1–24. doi:10.1080/03610926.2022.2076118
David, H.A. & Nagaraja, H.N. (2003). Order statistics, 3rd ed. New Jersey: John Wiley and Sons, Inc.
Deepesh, B., Sudheesh, K.K., & Sreelakshmi, N. (2021). Jackknife empirical likelihood inference for probability weighted moments. Journal of the Korean Statistical Society 50(1): 98–116.
Di Crescenzo, A. & Longobardi, M. (2009). On cumulative entropies. Journal of Statistical Planning and Inference 139(12): 4072–4087.
Greenwood, J.A., Landwehr, J.M., Matalas, N.C., & Wallis, J.R. (1979). Probability weighted moments: Definition and relation to parameters of several distributions expressible in inverse form. Water Resources Research 15(5): 1055–1064.
Jahanshahi, S.M.A., Zarei, H., & Khammar, A.H. (2020). On cumulative residual extropy. Probability in the Engineering and Informational Sciences 34(4): 605–625.
Kundu, C. (2021). On cumulative residual (past) extropy of extreme order statistics. Communications in Statistics - Theory and Methods 1–18. doi:10.1080/03610926.2021.2021238
Lad, F., Sanfilippo, G., & Agro, G. (2015). Extropy: Complementary dual of entropy. Statistical Science 30(1): 40–58.
Lerman, R.I. & Yitzhaki, S. (1984). A note on the calculation and interpretation of the Gini index. Economics Letters 15(3–4): 363–368.
Mirali, M. & Baratpour, S. (2017). Some results on weighted cumulative entropy. Journal of the Iranian Statistical Society 16(2): 21–32.
Mirali, M., Baratpour, S., & Fakoor, V. (2016). On weighted cumulative residual entropy. Communications in Statistics - Theory and Methods 46(6): 2857–2869.
Mittal, D.P. (1975). On some functional equations concerning entropy, directed divergence and inaccuracy. Metrika 22(1): 35–45.
Nair, N.U., Sankaran, P.G., & Vineshkumar, B. (2012). Characterization of distributions by properties of truncated Gini index and mean difference. Metron 70(2): 173–191.
Nair, N.U. & Vineshkumar, B. (2022). Cumulative entropy and income analysis. Stochastics and Quality Control 1–15. doi:10.1515/eqc-2022-0012
Rajesh, G. & Sunoj, S.M. (2019). Some properties of cumulative Tsallis entropy of order $\alpha$. Statistical Papers 60(3): 933–943.
Rao, M., Chen, Y., Vemuri, B., & Wang, F. (2004). Cumulative residual entropy: A new measure of information. IEEE Transactions on Information Theory 50(6): 1220–1228.
Sathar, E.A. & Nair, R.D. (2021). On dynamic survival extropy. Communications in Statistics - Theory and Methods 50(6): 1295–1313.
Shannon, C.E. (1948). A mathematical theory of communication. The Bell System Technical Journal 27(3): 379–423.
Sharma, B.D. & Taneja, I.J. (1975). Entropy of type $(\alpha, \beta )$ and other generalized measures in information theory. Metrika 22(1): 205–215.
Sudheesh, K.K. & Sreedevi, E.P. (2022). Non-parametric estimation of cumulative (residual) extropy. Statistics & Probability Letters 185: 109434.
Sudheesh, K.K., Sreedevi, E.P., & Balakrishnan, N. (2022). A generalized measure of cumulative residual entropy. Entropy 24(4): 444.
Tahmasebi, S. & Toomaj, A. (2022). On negative cumulative extropy with applications. Communications in Statistics - Theory and Methods 51(15): 5025–5047.
Tsallis, C. (1988). Possible generalization of Boltzmann-Gibbs statistics. Journal of Statistical Physics 52(1): 479–487.
Vexler, A., Zou, L., & Hutson, A.D. (2017). An extension to empirical likelihood for evaluating probability weighted moments. Journal of Statistical Planning and Inference 182: 50–60.
Yitzhaki, S. & Schechtman, E. (2013). The Gini methodology: A primer on a statistical methodology. New York: Springer.