
Mathlib.Analysis.SpecialFunctions.BinaryEntropy

Properties of Shannon q-ary entropy and binary entropy functions #

The binary entropy function binEntropy p := - p * log p - (1 - p) * log (1 - p) is the Shannon entropy of a Bernoulli random variable with success probability p.
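
Concretely (an illustrative check, not a statement from this file): a Bernoulli random variable with success probability p takes its two values with probabilities p and 1 - p, so the usual Shannon entropy formula gives
-(p * log p + (1 - p) * log (1 - p)) = - p * log p - (1 - p) * log (1 - p) = binEntropy p.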

More generally, the q-ary entropy function is the Shannon entropy of the random variable with possible outcomes {1, ..., q}, where outcome 1 has probability 1 - p and all other outcomes are equally likely.

qaryEntropy (q : โ„•) (p : โ„) := p * log (q - 1) - p * log p - (1 - p) * log (1 - p)
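
To connect this formula with the description above (an illustrative computation, not taken from the file; assume 0 < p < 1 and 2 โ‰ค q so the logarithms split): outcome 1 has probability 1 - p and each of the other q - 1 outcomes has probability p / (q - 1), so the Shannon entropy of this distribution is
- (1 - p) * log (1 - p) - (q - 1) * (p / (q - 1)) * log (p / (q - 1))
  = - (1 - p) * log (1 - p) - p * (log p - log (q - 1))
  = p * log (q - 1) - p * log p - (1 - p) * log (1 - p),
which is exactly qaryEntropy q p.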

This file assumes that entropy is measured in Nats, hence the use of natural logarithms. Most lemmas are also valid using a logarithm in a different base.
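
For example, to convert a value to bits one divides by log 2. Using Real.logb (defined elsewhere in Mathlib as logb b x = log x / log b, and not used in this file), one has
binEntropy p / log 2 = - p * logb 2 p - (1 - p) * logb 2 (1 - p),
which is the binary entropy measured in base 2.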

Main declarations #

Real.binEntropy: the binary entropy function
Real.qaryEntropy: the q-ary entropy function

Main results #

Strict monotonicity, strict antitonicity and strict concavity of binEntropy and qaryEntropy on the appropriate intervals.

The functions are also defined outside the interval Icc 0 1 due to log x = log |x|.

Tags #

entropy, Shannon, binary, nit, nepit

Binary entropy #

noncomputable def Real.binEntropy (p : โ„) :

The binary entropy function binEntropy p := - p * log p - (1-p) * log (1 - p) is the Shannon entropy of a Bernoulli random variable with success probability p.

@[simp]

binEntropy is symmetric about 1/2.

theorem Real.binEntropy_pos {p : โ„} (hpโ‚€ : 0 < p) (hpโ‚ : p < 1) :
0 < binEntropy p
theorem Real.binEntropy_nonneg {p : โ„} (hpโ‚€ : 0 โ‰ค p) (hpโ‚ : p โ‰ค 1) :
0 โ‰ค binEntropy p
theorem Real.binEntropy_neg_of_neg {p : โ„} (hp : p < 0) :
binEntropy p < 0

Outside the usual range of binEntropy, it is negative. This is due to log p = log |p|.

theorem Real.binEntropy_neg_of_one_lt {p : โ„} (hp : 1 < p) :
binEntropy p < 0

Outside the usual range of binEntropy, it is negative. This is due to log p = log |p|.

For a probability p โ‰  1/2, binEntropy p < log 2.
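
At p = 1/2 itself the bound log 2 is attained; a direct computation from the definition (shown here only for illustration) gives
binEntropy (1/2) = - (1/2) * log (1/2) - (1/2) * log (1/2) = - log (1/2) = log 2,
so log 2 is the maximum value of binEntropy on [0, 1].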

Binary entropy is continuous everywhere. This is due to the definition of Real.log for negative numbers.

Binary entropy has derivative log (1 - p) - log p. It's not differentiable at 0 or 1 but the junk values of deriv and log coincide there.
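
As a sketch of where this expression comes from (for p โ‰  0 and p โ‰  1; this is not the proof used in the file), differentiate the two summands separately:
d/dp (- p * log p) = - log p - 1
d/dp (- (1 - p) * log (1 - p)) = log (1 - p) + 1
and adding the two gives deriv binEntropy p = log (1 - p) - log p.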

q-ary entropy #

noncomputable def Real.qaryEntropy (q : โ„•) (p : โ„) : โ„

Shannon q-ary entropy function (measured in Nats, i.e., using natural logs).

It's the Shannon entropy of a random variable with possible outcomes {1, ..., q} where outcome 1 has probability 1 - p and all other outcomes are equally likely.

The usual domain of definition is p โˆˆ [0,1], i.e., the input is a probability.

This is a generalization of the binary entropy function binEntropy.
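
In particular, for q = 2 the leading term vanishes because log (2 - 1) = log 1 = 0, so qaryEntropy 2 p = - p * log p - (1 - p) * log (1 - p) = binEntropy p (an immediate check from the two defining formulas).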

@[simp]
theorem Real.qaryEntropy_one (q : โ„•) :
qaryEntropy q 1 = log (โ†‘q - 1)
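
This is a direct substitution into the defining formula (shown for illustration): at p = 1 the second and third terms vanish, since log 1 = 0 and (1 - 1) * log (1 - 1) = 0, leaving
qaryEntropy q 1 = 1 * log (โ†‘q - 1) = log (โ†‘q - 1).
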
theorem Real.qaryEntropy_pos {q : โ„•} {p : โ„} (hpโ‚€ : 0 < p) (hpโ‚ : p < 1) :
0 < qaryEntropy q p
theorem Real.qaryEntropy_nonneg {q : โ„•} {p : โ„} (hpโ‚€ : 0 โ‰ค p) (hpโ‚ : p โ‰ค 1) :
0 โ‰ค qaryEntropy q p
theorem Real.qaryEntropy_neg_of_neg {q : โ„•} {p : โ„} (hp : p < 0) :
qaryEntropy q p < 0

Outside the usual range of qaryEntropy, it is negative. This is due to log p = log |p|.

The q-ary entropy function is continuous everywhere. This is due to the definition of Real.log for negative numbers.

theorem Real.deriv_qaryEntropy {q : โ„•} {p : โ„} (hpโ‚€ : p โ‰  0) (hpโ‚ : p โ‰  1) :
deriv (qaryEntropy q) p = log (โ†‘q - 1) + log (1 - p) - log p
theorem Real.hasDerivAt_binEntropy {p : โ„} (hpโ‚€ : p โ‰  0) (hpโ‚ : p โ‰  1) :
HasDerivAt binEntropy (log (1 - p) - log p) p

Binary entropy has derivative log (1 - p) - log p.

theorem Real.hasDerivAt_qaryEntropy {q : โ„•} {p : โ„} (hpโ‚€ : p โ‰  0) (hpโ‚ : p โ‰  1) :
HasDerivAt (qaryEntropy q) (log (โ†‘q - 1) + log (1 - p) - log p) p
theorem Real.deriv2_qaryEntropy {q : โ„•} {p : โ„} :
deriv^[2] (qaryEntropy q) p = -1 / (p * (1 - p))

Second derivative of q-ary entropy.
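
This can be read off from deriv_qaryEntropy above (a sketch, valid for p โ‰  0 and p โ‰  1): differentiating log (โ†‘q - 1) + log (1 - p) - log p once more gives
- 1 / (1 - p) - 1 / p = - 1 / (p * (1 - p)).
The constant term log (โ†‘q - 1) drops out, which is why the second derivative does not depend on q.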

Strict monotonicity of entropy #

theorem Real.qaryEntropy_strictMonoOn {q : โ„•} (qLe2 : 2 โ‰ค q) :
StrictMonoOn (qaryEntropy q) (Set.Icc 0 (1 - 1 / โ†‘q))

The q-ary entropy function is strictly increasing in the interval [0, 1 - qโปยน].
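
The endpoint 1 - qโปยน is exactly where the derivative changes sign (an illustrative computation from deriv_qaryEntropy, for 0 < p < 1 and 2 โ‰ค q):
log (โ†‘q - 1) + log (1 - p) - log p = log ((q - 1) * (1 - p) / p),
which is positive precisely when (q - 1) * (1 - p) > p, i.e. when p < 1 - 1/q, and vanishes at p = 1 - 1/q.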

theorem Real.qaryEntropy_strictAntiOn {q : โ„•} (qLe2 : 2 โ‰ค q) :
StrictAntiOn (qaryEntropy q) (Set.Icc (1 - 1 / โ†‘q) 1)

The q-ary entropy function is strictly decreasing in the interval [1 - qโปยน, 1].
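
At the common endpoint p = 1 - 1/q the underlying distribution is uniform: outcome 1 has probability 1 - p = 1/q and each other outcome has probability p / (q - 1) = 1/q. A direct computation from the definition (for illustration only, with 2 โ‰ค q) gives the maximum value
qaryEntropy q (1 - 1/q)
  = (1 - 1/q) * log (q - 1) - (1 - 1/q) * log ((q - 1) / q) - (1/q) * log (1/q)
  = (1 - 1/q) * log q + (1/q) * log q
  = log q.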

Binary entropy is strictly increasing in the interval [0, 1/2].

Binary entropy is strictly decreasing in the interval [1/2, 1].

Strict concavity of entropy #