In this notebook, I want to study the Rényi entropy, using Python. I will define a function implementing $H_{\alpha}(X)$ from its formula, for discrete random variables, and check the influence of the parameter $\alpha$: $$ H_{\alpha}(X) := \frac{1}{1-\alpha} \log_2\left(\sum_{i=1}^n p_i^{\alpha}\right),$$ where $X$ has $n$ possible values, and the $i$-th outcome has probability $p_i\in[0,1]$, with $\sum_{i=1}^n p_i = 1$.
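As a quick sanity check of the formula, for the uniform distribution $p_i = 1/n$ and any $\alpha \neq 1$, we get $$ H_{\alpha}(X) = \frac{1}{1-\alpha} \log_2\left(\sum_{i=1}^n n^{-\alpha}\right) = \frac{1}{1-\alpha} \log_2\left(n^{1-\alpha}\right) = \log_2(n),$$ independently of $\alpha$: we will observe this flat curve on the plots below, for the uniform examples $X_4$ and $X_5$.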
!pip install watermark matplotlib numpy
%load_ext watermark
%watermark -v -m -a "Lilian Besson" -g -p matplotlib,numpy
import numpy as np
import matplotlib.pyplot as plt
We start by giving a few examples of such vectors $X=(p_i)_{1\leq i \leq n}$, i.e., discrete probability distributions on $n$ values.
X1 = [0.25, 0.5, 0.25]
X2 = [0.1, 0.25, 0.3, 0.35]
X3 = [0, 0.5, 0.5]
X4 = np.full(100, 1/100)
X5 = np.full(1000, 1/1000)
X6 = np.arange(100, dtype=float)
X6 /= np.sum(X6)
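As a quick sanity check (my own addition, not in the original notebook), we can verify that each example is a valid distribution, i.e., sums to $1$:
for X in [X1, X2, X3, X4, X5, X6]:
    assert np.isclose(np.sum(X), 1.0), "Invalid distribution: probabilities do not sum to 1!"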
We need a function to safely compute $x \mapsto x \log_2(x)$, with special care for the case $x=0$ (using the convention $0 \log_2(0) = 0$, justified by continuity). This one will accept a numpy array or a single value as argument:
np.seterr(all="ignore")  # silence numpy warnings, e.g., from log2(0)
def x_log2_x(x):
    """Return x * log2(x), with the convention that 0 * log2(0) = 0."""
    results = x * np.log2(x)  # gives nan (not 0) where x == 0
    if np.size(x) == 1:
        if np.isclose(x, 0.0):
            results = 0.0
    else:
        results[np.isclose(x, 0.0)] = 0.0
    return results
For example, with single values:
x_log2_x(0)
x_log2_x(0.5)
x_log2_x(1)
x_log2_x(2)
x_log2_x(10)
and with vectors, entries with $p_i=0$ are handled without error:
x_log2_x(X1)
x_log2_x(X2)
x_log2_x(X3)
x_log2_x(X4)[:10]
x_log2_x(X5)[:10]
x_log2_x(X6)[:10]
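As a small sanity check (my own addition), this helper already lets us recover the Shannon entropy of $X_1$: $-\sum_i p_i \log_2(p_i) = 0.5 + 0.5 + 0.5 = 1.5$ bits.
print(- np.sum(x_log2_x(X1)))  # Shannon entropy of X1: 1.5 bits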
From the mathematical definition, an issue arises if $\alpha=1$ (division by zero in $\frac{1}{1-\alpha}$) or $\alpha=+\infty$, so we deal with the special cases manually, using the limiting values: the Shannon entropy $-\sum_i p_i \log_2(p_i)$ for $\alpha \to 1$, the min-entropy $-\log_2(\max_i p_i)$ for $\alpha \to +\infty$, and the max-entropy (Hartley entropy) $\log_2 |\{i : p_i > 0\}|$ for $\alpha \to 0$ (counting only the outcomes with nonzero probability). $X$ is here given as the vector of $(p_i)_{1\leq i \leq n}$.
def renyi_entropy(alpha, X):
    assert alpha >= 0, "Error: renyi_entropy only accepts values of alpha >= 0, but alpha = {}.".format(alpha)  # DEBUG
    X = np.asarray(X, dtype=float)  # also accept plain lists as input
    if np.isinf(alpha):
        # XXX Min-entropy!
        return - np.log2(np.max(X))
    elif np.isclose(alpha, 0):
        # XXX Max-entropy (Hartley): log2 of the size of the support of X!
        return np.log2(np.count_nonzero(X))
    elif np.isclose(alpha, 1):
        # XXX Shannon entropy!
        return - np.sum(x_log2_x(X))
    else:
        return (1.0 / (1.0 - alpha)) * np.log2(np.sum(X ** alpha))
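A quick numerical check (my own addition) that the special-case branches agree with the limits of the generic formula, on $X_1$:
print(renyi_entropy(1, X1))       # Shannon branch: 1.5
print(renyi_entropy(1.001, X1))   # generic formula, should be very close to 1.5
print(renyi_entropy(np.inf, X1))  # min-entropy branch: -log2(0.5) = 1.0
print(renyi_entropy(100, X1))     # generic formula, should be close to 1.0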
# Curried version, fixing alpha first
def renyi_entropy_2(alpha):
    def re(X):
        return renyi_entropy(alpha, X)
    return re
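For example (my own addition), fixing $\alpha=2$ gives the collision entropy $H_2(X) = -\log_2(\sum_i p_i^2)$:
collision_entropy = renyi_entropy_2(2)
print(collision_entropy(X1))  # -log2(0.375) ≈ 1.415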
# Vectorized version, accepting a vector of values of alpha
def renyi_entropy_3(alphas, X):
    res = np.zeros_like(alphas)
    for i, alpha in enumerate(alphas):
        res[i] = renyi_entropy(alpha, X)
    return res
alphas = np.linspace(0, 10, 1000)
renyi_entropy_3(alphas, X1)[:10]
def plot_renyi_entropy(alphas, X):
    fig = plt.figure()
    plt.plot(alphas, renyi_entropy_3(alphas, X))
    plt.xlabel(r"Value for $\alpha$")
    plt.ylabel(r"Value for $H_{\alpha}(X)$")
    plt.title(r"Rényi entropy for $X={}$".format(X[:10]))
    plt.show()
    # return fig
plot_renyi_entropy(alphas, X1)
plot_renyi_entropy(alphas, X2)
plot_renyi_entropy(alphas, X3)
plot_renyi_entropy(alphas, X4)
plot_renyi_entropy(alphas, X5)
plot_renyi_entropy(alphas, X6)
It is not surprising to observe on these plots that $H_{\alpha}(X)$ is continuous as a function of $\alpha$: one can easily verify this from the definition, since the special cases $\alpha = 0$, $\alpha = 1$ and $\alpha = +\infty$ were precisely defined as limits of the generic formula.
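To support this visual observation, here is a small numerical check (my own addition) around the special point $\alpha=1$, where the code switches to the Shannon branch:
for eps in [1e-2, 1e-3, 1e-4]:
    # both sides use the generic formula, and converge to the Shannon value
    print(eps, renyi_entropy(1 - eps, X1), renyi_entropy(1 + eps, X1))
print("at alpha = 1:", renyi_entropy(1, X1))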