A new generalized varentropy and its properties. (English) Zbl 1450.62006

Summary: The variance of the Shannon information content of a random variable \(X\), called the varentropy, is a measure of how the information content of \(X\) is scattered around its entropy; it has various applications in information theory, computer science, and statistics. In this paper, we introduce a new generalized varentropy based on the Tsallis entropy and obtain some results and bounds for it. We compare the varentropy with the Tsallis varentropy. Moreover, we study the Tsallis varentropy of order statistics, analyse this concept for residual (past) lifetime distributions, and then use these notions to introduce two new classes of distributions.
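The classical quantities named in the summary can be illustrated numerically. The sketch below, for a discrete distribution, computes the Shannon entropy, the varentropy (the variance of the information content \(-\log p(X)\)), and the Tsallis entropy \(S_q = (1 - \sum_i p_i^q)/(q-1)\); the paper's own generalized (Tsallis) varentropy is not reproduced here, as its exact definition is not given in this summary.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(X) = -sum_i p_i log p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def varentropy(p):
    """Varentropy: variance of the information content -log p(X),
    computed as E[(log p)^2] - H(X)^2."""
    h = shannon_entropy(p)
    second_moment = sum(pi * math.log(pi) ** 2 for pi in p if pi > 0)
    return second_moment - h * h

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1) for q != 1;
    it recovers the Shannon entropy in the limit q -> 1."""
    if q == 1:
        return shannon_entropy(p)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)
```

For a uniform distribution the information content \(-\log p\) is constant, so the varentropy is exactly zero; non-uniform distributions give a strictly positive value.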


62B10 Statistical aspects of information-theoretic topics
94A17 Measures of information, entropy
62E15 Exact distribution theory in statistics
60E05 Probability distributions: general theory
Full Text: DOI MNR


[1] Abbasnejad M., Arghami N. R., “Rényi entropy properties of order statistics”, Comm. Statist. Theory Methods, 40:1 (2010), 40-52 · Zbl 1208.62005
[2] Afhami B., Madadi M., Rezapour M., “Goodness-of-fit test based on Shannon entropy of \(k\)-record values from the generalized”, J. Stat. Sci., 9:1 (2015), 43-60
[3] Arikan E., “Varentropy decreases under the polar transform”, IEEE Trans. Inform. Theory, 62:6 (2016), 3390-3400 · Zbl 1359.94292
[4] Arnold B. C., Balakrishnan N., Nagaraja H. N., A First Course in Order Statistics, Classics Appl. Math., 54, SIAM, Philadelphia, 2008, 279 pp. · Zbl 1172.62017
[5] Baratpour S., Ahmadi J., Arghami N. R., “Characterizations based on Rényi entropy of order statistics and record values”, J. Statist. Plann. Inference, 138:8 (2008), 2544-2551 · Zbl 1173.62040
[6] Baratpour S., Khammar A., “Tsallis entropy properties of order statistics and some stochastic comparisons”, J. Statist. Res. Iran, 13:1 (2016), 25-41
[7] Bobkov S., Madiman M., “Concentration of the information in data with log-concave distributions”, Ann. Probab., 39:4 (2011), 1528-1543 · Zbl 1227.60043
[8] David H. A., Nagaraja H. N., Order Statistics, Wiley Ser. Probab. Stat., 3rd ed., John Wiley & Sons, Inc., Hoboken, New Jersey, 2003, 458 pp. · Zbl 1053.62060
[9] Di Crescenzo A., Longobardi M., “Stochastic comparisons of cumulative entropies”, Stochastic Orders in Reliability and Risk, v. 208, eds. H. Li, X. Li, Springer, New York, 2013, 167-182 · Zbl 1312.62011
[10] Di Crescenzo A., Paolillo L., “Analysis and applications of the residual varentropy of random lifetimes”, Probab. Engrg. Inform. Sci., 2020, 1-19
[11] Ebrahimi N., Kirmani S. N. U. A., “Some results on ordering of survival functions through uncertainty”, Statist. Probab. Lett., 29:2 (1996), 167-176 · Zbl 1007.62527
[12] Ebrahimi N., Soofi E. S., Zahedi H., “Information properties of order statistics and spacing”, IEEE Trans. Inform. Theory, 50:1 (2004), 177-183 · Zbl 1296.94055
[13] Enomoto R., Okamoto N., Seo T., “On the asymptotic normality of test statistics using Song's kurtosis”, J. Stat. Theory Pract., 7:1 (2013), 102-119 · Zbl 1425.62015
[14] Gupta R. C., Taneja H. C., Thapliyal R., “Stochastic comparisons based on residual entropy of order statistics and some characterization results”, J. Stat. Theory Appl., 13:1 (2014), 27-37
[15] Kontoyiannis I., Verdú S., “Optimal lossless compression: Source varentropy and dispersion”, IEEE Trans. Inform. Theory, 60:2 (2014), 777-795 · Zbl 1364.94132
[16] Liu J., Information Theoretic Content and Probability, Ph.D. Thesis, University of Florida, 2007
[17] Nanda A. K., Paul P., “Some results on generalized residual entropy”, Inform. Sci., 176:1 (2006), 27-47 · Zbl 1093.94016
[18] Park S., “The entropy of consecutive order statistics”, IEEE Trans. Inform. Theory, 41:6 (1995), 2003-2007 · Zbl 0852.62007
[19] Psarrakos G., Navarro J., “Generalized cumulative residual entropy and record values”, Metrika, 76 (2013), 623-640 · Zbl 1307.62011
[20] Raqab M. Z., Amin W. A., “Some ordering results on order statistics and record values”, IAPQR Trans., 21:1 (1996), 1-8 · Zbl 0899.62070
[21] Shannon C. E., “A mathematical theory of communication”, Bell System Technical J., 27:3 (1948), 379-423 · Zbl 1154.94303
[22] Song K.-S., “Rényi information, log likelihood and an intrinsic distribution measure”, J. Statist. Plann. Inference, 93:1-2 (2001), 51-69 · Zbl 0997.62003
[23] Tsallis C., “Possible generalization of Boltzmann-Gibbs statistics”, J. Stat. Phys., 52 (1988), 479-487 · Zbl 1082.82501
[24] Kumar V., Taneja H. C., “A generalized entropy-based residual lifetime distributions”, Int. J. Biomath., 4:2 (2011), 171-184 · Zbl 1297.62015
[25] Wilk G., Włodarczyk Z., “Example of a possible interpretation of Tsallis entropy”, Phys. A: Stat. Mech. Appl., 387:19-20 (2008), 4809-4813
[26] Wong K. M., Chen S., “The entropy of ordered sequences and order statistics”, IEEE Trans. Inform. Theory, 36:2 (1990), 276-284 · Zbl 0699.62004
[27] Zarezadeh S., Asadi M., “Results on residual Rényi entropy of order statistics and record values”, Inform. Sci., 180:21 (2010), 4195-4206 · Zbl 1204.94054
[28] Zhang Z., “Uniform estimates on the Tsallis entropies”, Lett. Math. Phys., 80 (2007), 171-181 · Zbl 1208.94035
[29] Zografos K., “On Mardia's and Song's measures of kurtosis in elliptical distributions”, J. Multivariate Anal., 99:5 (2008), 858-879 · Zbl 1133.62329
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.