What is Randomness? The Interplay between Alpha Entropies, Total Variation and Guessing


Venue/Location: C101, VIASM

Speaker: Olivier Rioul


In many areas of computer science, it is of primary importance to assess the randomness of a certain variable. Many different criteria can be used to evaluate randomness, possibly after observing some disclosed data. A “sufficiently random” variable is often described as “entropic”. Indeed, Shannon’s entropy is known to provide a resistance criterion against modeling attacks. More generally, one may consider the Rényi α-entropy, of which Shannon’s entropy, collision entropy, and min-entropy are recovered as the particular cases α = 1, 2, and +∞, respectively. Guesswork, or guessing entropy, is also of great interest in relation to α-entropy.
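For concreteness, the quantities named above admit short closed-form computations. The sketch below (illustrative only, not taken from the talk; the function names are ours) computes the Rényi α-entropy on a finite distribution, treating α = 1 and α = +∞ as the Shannon and min-entropy limits, together with the guessing entropy:

```python
import math

def renyi_entropy(p, alpha):
    """Rényi α-entropy in bits: (1/(1-α)) · log2(Σ p_i^α).
    The cases α=1 (Shannon) and α=inf (min-entropy) are taken as limits."""
    p = [x for x in p if x > 0]
    if alpha == 1:
        return -sum(x * math.log2(x) for x in p)   # Shannon entropy
    if alpha == math.inf:
        return -math.log2(max(p))                  # min-entropy
    return math.log2(sum(x ** alpha for x in p)) / (1 - alpha)

def guessing_entropy(p):
    """Expected number of guesses when candidates are tried in
    decreasing order of probability (optimal guessing strategy)."""
    return sum(i * x for i, x in enumerate(sorted(p, reverse=True), start=1))

p = [0.5, 0.25, 0.25]
print(renyi_entropy(p, 1))         # Shannon entropy: 1.5 bits
print(renyi_entropy(p, 2))         # collision entropy: log2(8/3) bits
print(renyi_entropy(p, math.inf))  # min-entropy: 1.0 bit
print(guessing_entropy(p))         # guessing entropy: 1.75 guesses
```

On this example one sees the general ordering H₁ ≥ H₂ ≥ H∞, which holds for every distribution since Hα is nonincreasing in α.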

On the other hand, many applications rely instead on the “statistical distance”, also known as the total variation distance to the uniform distribution. This criterion is particularly important because a very small distance ensures that no statistical test can effectively distinguish the actual distribution from the uniform one.
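The distinguishing interpretation of total variation can be checked numerically. The sketch below (again illustrative, not from the talk) computes the total variation distance to the uniform distribution and shows that it equals the advantage of the best possible statistical test, namely the one that accepts exactly the outcomes whose probability exceeds 1/n:

```python
def tv_to_uniform(p):
    """Total variation (statistical) distance from p to the uniform
    distribution on its support: (1/2) · Σ |p_i − 1/n|."""
    n = len(p)
    return 0.5 * sum(abs(x - 1 / n) for x in p)

def best_test_advantage(p):
    """Advantage of the optimal distinguisher, which accepts precisely
    the outcomes with p_i > 1/n; this equals the total variation distance."""
    n = len(p)
    return sum(x - 1 / n for x in p if x > 1 / n)

p = [0.5, 0.25, 0.25]
print(tv_to_uniform(p))       # ≈ 1/6
print(best_test_advantage(p)) # same value: no test can do better
```

Thus a distribution at total variation distance ε from uniform cannot be told apart from uniform with advantage better than ε, by any test whatsoever.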
We establish optimal lower and upper bounds between α-entropy and guessing entropy on the one hand, and error probability and total variation distance to the uniform on the other. In this context, it turns out that the best known “Pinsker inequality” and the recent “reverse Pinsker inequalities” are not necessarily optimal. We recover or improve previous Fano-type and Pinsker-type inequalities used in several applications.

Bio: Olivier Rioul is a full Professor in the Department of Communication and Electronics at Télécom Paris, Institut Polytechnique de Paris, France. He graduated from École Polytechnique, Paris, France in 1987 and from École Nationale Supérieure des Télécommunications, Paris, France in 1989, and obtained his PhD degree from École Nationale Supérieure des Télécommunications in 1993. His research interests are in applied mathematics and include various, sometimes unconventional, applications of information theory, such as inequalities in statistics, hardware security, and experimental psychology. He has taught information theory at various universities for twenty years and has published a textbook that has become a classic French reference in the field.