Acharya, Jayadev and Orlitsky, Alon and Suresh, Ananda Theertha and Tyagi, Himanshu (2017) Estimating Renyi Entropy of Discrete Distributions. In: IEEE Transactions on Information Theory, 63 (1). pp. 38-56. ISSN 0018-9448
Abstract
It was shown recently that estimating the Shannon entropy H(p) of a discrete k-symbol distribution p requires Θ(k/log k) samples, a number that grows near-linearly in the support size. In many applications, H(p) can be replaced by the more general Rényi entropy of order α, H_α(p). We determine the number of samples needed to estimate H_α(p) for all α, showing that α < 1 requires a super-linear, roughly k^(1/α), number of samples; noninteger α > 1 requires a near-linear k samples; but, perhaps surprisingly, integer α > 1 requires only Θ(k^(1-1/α)) samples. Furthermore, building on a recently established connection between polynomial approximation and estimation of additive functions of the form Σ_x f(p_x), we reduce the sample complexity for noninteger values of α by a factor of log k compared with the empirical estimator. The estimators achieving these bounds are simple and run in time linear in the number of samples. Our lower bounds provide explicit constructions of distributions with different Rényi entropies that are hard to distinguish.
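To make the quantities in the abstract concrete, the sketch below computes H_α(p) = log(Σ_x p_x^α)/(1−α), the plug-in (empirical) estimate from samples, and, for integer α ≥ 2, a bias-corrected estimate of the power sum Σ_x p_x^α via falling-factorial moments of the symbol counts (a standard unbiasing device; this is an illustration of the idea, not the paper's exact estimator, and all function names here are invented for the example).

```python
import math
from collections import Counter

def renyi_entropy(p, alpha):
    """Exact Rényi entropy H_alpha(p) = log(sum_x p_x^alpha) / (1 - alpha)."""
    return math.log(sum(px ** alpha for px in p)) / (1 - alpha)

def empirical_estimate(samples, alpha):
    """Plug-in estimator: H_alpha of the empirical distribution of the samples."""
    n = len(samples)
    counts = Counter(samples)
    return renyi_entropy([c / n for c in counts.values()], alpha)

def falling_factorial(m, a):
    """m (m-1) ... (m-a+1)."""
    out = 1
    for i in range(a):
        out *= m - i
    return out

def corrected_estimate(samples, alpha):
    """For integer alpha >= 2: sum_x N_x(N_x-1)...(N_x-alpha+1) / n(n-1)...(n-alpha+1)
    is an unbiased estimate of the power sum sum_x p_x^alpha, since the falling
    factorial of a binomial count N_x has expectation n^(alpha) * p_x^alpha."""
    n = len(samples)
    counts = Counter(samples)
    power_sum = sum(falling_factorial(c, alpha) for c in counts.values())
    power_sum /= falling_factorial(n, alpha)
    return math.log(power_sum) / (1 - alpha)
```

For a uniform distribution over k symbols, H_α = log k for every α, which gives a quick sanity check of all three functions.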
| Item Type: | Journal Article |
| --- | --- |
| Publication: | IEEE Transactions on Information Theory |
| Publisher: | Institute of Electrical and Electronics Engineers Inc. |
| Additional Information: | The copyright for this article belongs to the Authors. |
| Keywords: | Entropy estimation; minimax lower bounds; sample complexity; sublinear algorithms |
| Department/Centre: | Division of Electrical Sciences > Electrical Communication Engineering |
| Date Deposited: | 14 Jun 2022 04:55 |
| Last Modified: | 14 Jun 2022 04:55 |
| URI: | https://eprints.iisc.ac.in/id/eprint/73413 |