Abstract
In 1994, Jim Massey proposed the guessing entropy as a measure of the difficulty an attacker faces in guessing a secret used in a cryptographic system, and established a well-known inequality between entropy and guessing entropy. More than 15 years earlier, in an unpublished work, he had also established a well-known inequality for the entropy of an integer-valued random variable of given variance. In this paper, we establish a link between these two works by Massey in the more general framework of the relationship between discrete (absolute) entropy and continuous (differential) entropy. Two approaches are given in which the discrete entropy (or Rényi entropy) of an integer-valued variable can be upper bounded using the differential (Rényi) entropy of a suitably chosen continuous random variable. As an application, lower bounds on guessing entropy and guessing moments are derived in terms of entropy or Rényi entropy (without side information) and conditional entropy or Arimoto conditional entropy (when side information is available).
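To make the abstract's central quantities concrete, the following minimal Python sketch computes the guessing entropy G(X) (the expected number of guesses under the optimal strategy of guessing values in decreasing order of probability) and checks it against the standard statement of Massey's 1994 lower bound, G(X) ≥ 2^(H(X)−2) + 1, which holds when H(X) ≥ 2 bits. The example distribution is an illustrative assumption, not taken from the paper.

```python
import math

def guessing_entropy(probs):
    # Optimal guessing: try values in decreasing order of probability.
    # G(X) = sum over ranks i (starting at 1) of i * p_(i).
    p = sorted(probs, reverse=True)
    return sum(i * pi for i, pi in enumerate(p, start=1))

def shannon_entropy(probs):
    # Shannon entropy H(X) in bits.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative example: uniform distribution on 16 values, so H(X) = 4 bits.
probs = [1 / 16] * 16
G = guessing_entropy(probs)        # (16 + 1) / 2 = 8.5
H = shannon_entropy(probs)         # 4.0
massey_bound = 2 ** (H - 2) + 1    # 5.0, valid since H >= 2 bits
print(G, H, massey_bound)          # 8.5 4.0 5.0
assert G >= massey_bound
```

For the uniform distribution the bound is loose (8.5 vs. 5.0); the paper's contribution is a family of sharper bounds of this type, including Rényi-entropy and side-information variants.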
| Original language | English |
|---|---|
| Pages (from-to) | 2813-2828 |
| Number of pages | 16 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 68 |
| Issue number | 5 |
| DOIs | |
| Publication status | Published - 1 May 2022 |
Keywords
- Arikan's inequality
- Kullback's inequality
- Massey's inequality
- Poisson summation formula
- Rényi entropies
- Rényi-Arimoto conditional entropies
- discrete vs differential entropies
- generalized Gaussian densities
- generalized exponential densities
- guessing entropy
- guessing moments
- guessing with side information