Uniform Distribution
Among probability distributions $p(x)$ which are nonzero over a finite range of values $x \in [a,b]$, the maximum-entropy distribution is the uniform distribution.
To show this, we must maximize the entropy,
$$H(p) = -\int_a^b p(x) \ln p(x)\, dx,$$
with respect to $p(x)$, subject to the constraint
$$\int_a^b p(x)\, dx = 1.$$
Using the method of Lagrange multipliers for optimization in the presence of constraints, we may form the objective function
$$J(p) = -\int_a^b p(x) \ln p(x)\, dx + \lambda_0 \left( \int_a^b p(x)\, dx - 1 \right)$$
and differentiate with respect to $p(x)$ to obtain
$$\frac{\partial J}{\partial p(x)} = -\ln p(x) - 1 + \lambda_0.$$
Setting this to zero and solving for $p(x)$ gives
$$p(x) = e^{\lambda_0 - 1},$$
which is independent of $x$; that is, $p(x)$ is constant over $[a,b]$.
(Setting the partial derivative with respect to $\lambda_0$ to zero merely restates the unit-area constraint.)
Choosing $\lambda_0$ to satisfy the constraint gives $\lambda_0 = 1 - \ln(b-a)$, yielding
$$p(x) = \frac{1}{b-a}, \qquad a \le x \le b.$$
That this solution is a maximum rather than a minimum or inflection point can be verified by checking that the second partial derivative is negative for all $p(x)$:
$$\frac{\partial^2 J}{\partial p(x)^2} = -\frac{1}{p(x)}.$$
Since the solution spontaneously satisfies $p(x) > 0$, it is a maximum.
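As a quick numerical sanity check (not part of the derivation above), the discrete analogue can be verified directly: maximizing $-\sum_i p_i \ln p_i$ subject to $\sum_i p_i = 1$ should return the uniform pmf. A minimal sketch using scipy's SLSQP solver; the number of support points and the solver are arbitrary choices for illustration:

```python
# Numerical check: maximize discrete entropy H(p) = -sum(p_i ln p_i)
# over a pmf on n points, subject only to sum(p_i) = 1.
# The maximizer should be the uniform pmf p_i = 1/n.
import numpy as np
from scipy.optimize import minimize

n = 8  # number of support points (arbitrary)

def neg_entropy(p):
    return np.sum(p * np.log(p))  # negative of H(p), to be minimized

result = minimize(
    neg_entropy,
    x0=np.random.dirichlet(np.ones(n)),       # random starting pmf
    bounds=[(1e-9, 1.0)] * n,                 # keep p_i > 0
    constraints={"type": "eq", "fun": lambda p: p.sum() - 1.0},
    method="SLSQP",
)

print(result.x)                 # ~ [0.125, 0.125, ..., 0.125] = 1/n
print(-result.fun, np.log(n))   # maximum entropy ~ ln(n)
```

The recovered maximum, $\ln n$, is the discrete counterpart of the $\ln(b-a)$ entropy of the continuous uniform density.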
Exponential Distribution
Among probability distributions $p(x)$ which are nonzero over a semi-infinite range of values $x \in [0, \infty)$ and having a finite mean $\mu$, the exponential distribution
$$p(x) = \frac{1}{\mu}\, e^{-x/\mu}$$
has maximum entropy.
To the previous case, we add the new constraint
$$\int_0^\infty x\, p(x)\, dx = \mu,$$
resulting in the objective function
$$J(p) = -\int_0^\infty p(x) \ln p(x)\, dx + \lambda_0 \left( \int_0^\infty p(x)\, dx - 1 \right) + \lambda_1 \left( \int_0^\infty x\, p(x)\, dx - \mu \right).$$
Now the partial derivative with respect to $p(x)$ is
$$\frac{\partial J}{\partial p(x)} = -\ln p(x) - 1 + \lambda_0 + \lambda_1 x,$$
and setting it to zero gives
$$p(x) = e^{\lambda_0 - 1 + \lambda_1 x},$$
which is of the form $c\, e^{-a x}$ (with $a = -\lambda_1 > 0$ so that the integrals converge). The unit-area and finite-mean constraints then give $a = 1/\mu$ and $c = 1/\mu$, yielding
$$p(x) = \frac{1}{\mu}\, e^{-x/\mu}, \qquad x \ge 0.$$
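To illustrate numerically (a check of my own, not from the text): among densities supported on $[0,\infty)$ with the same mean $\mu$, the exponential should have the largest differential entropy. The sketch below compares it against a half-normal density whose scale is chosen so that both have mean $\mu$:

```python
# Compare differential entropies h(p) = -integral p(x) ln p(x) dx
# of two nonnegative densities with equal mean mu.
import numpy as np
from scipy import integrate

mu = 2.0

def diff_entropy(pdf):
    # Integrand -p ln p, with p = 0 treated as contributing 0.
    f = lambda x: -pdf(x) * np.log(pdf(x)) if pdf(x) > 0 else 0.0
    val, _ = integrate.quad(f, 0.0, np.inf)
    return val

exp_pdf = lambda x: np.exp(-x / mu) / mu      # exponential, mean mu

s = mu / np.sqrt(2 / np.pi)                   # half-normal scale giving mean mu
halfnorm_pdf = lambda x: np.sqrt(2 / (np.pi * s**2)) * np.exp(-x**2 / (2 * s**2))

print(diff_entropy(exp_pdf))       # 1 + ln(mu) ~ 1.693
print(diff_entropy(halfnorm_pdf))  # strictly smaller (~1.645 for mu = 2)
```

The closed form $h = 1 + \ln \mu$ for the exponential falls out of the quadrature, and the mean-matched half-normal indeed comes in below it.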
Gaussian Distribution
The Gaussian distribution has maximum entropy relative to all probability distributions covering the entire real line $x \in (-\infty, \infty)$ but having a finite mean $\mu$ and finite variance $\sigma^2$.
Proceeding as before, we obtain the objective function
$$J(p) = -\int_{-\infty}^{\infty} p(x) \ln p(x)\, dx + \lambda_0 \left( \int_{-\infty}^{\infty} p(x)\, dx - 1 \right) + \lambda_1 \left( \int_{-\infty}^{\infty} x\, p(x)\, dx - \mu \right) + \lambda_2 \left( \int_{-\infty}^{\infty} (x-\mu)^2\, p(x)\, dx - \sigma^2 \right)$$
and partial derivative
$$\frac{\partial J}{\partial p(x)} = -\ln p(x) - 1 + \lambda_0 + \lambda_1 x + \lambda_2 (x - \mu)^2,$$
Setting this to zero gives
$$p(x) = e^{\lambda_0 - 1 + \lambda_1 x + \lambda_2 (x-\mu)^2},$$
which is Gaussian in shape whenever $\lambda_2 < 0$. Choosing the multipliers to satisfy the unit-area, mean, and variance constraints ($\lambda_1 = 0$ and $\lambda_2 = -1/(2\sigma^2)$) leads to
$$p(x) = \frac{1}{\sigma \sqrt{2\pi}}\, e^{-(x-\mu)^2 / (2\sigma^2)}.$$
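A final sketch in the same spirit (again an illustration, not part of the derivation): on the whole real line, a Gaussian should beat any equal-variance competitor, here a Laplace density. scipy's frozen distributions expose an entropy() method, so no quadrature is needed:

```python
# Compare differential entropies of a Gaussian and a Laplace density
# with the same variance sigma^2.
import numpy as np
from scipy import stats

sigma = 1.5

gauss = stats.norm(loc=0.0, scale=sigma)                     # variance sigma^2
laplace = stats.laplace(loc=0.0, scale=sigma / np.sqrt(2))   # variance also sigma^2

print(gauss.entropy())    # 0.5 * ln(2*pi*e*sigma^2), the known closed form
print(laplace.entropy())  # strictly smaller for equal variance
```

The Laplace scale $b = \sigma/\sqrt{2}$ follows from its variance $2b^2$; its entropy $1 + \ln(2b)$ is strictly below the Gaussian's $\tfrac{1}{2}\ln(2\pi e \sigma^2)$ for every $\sigma$.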
For more on entropy and maximum-entropy distributions, see Cover and Thomas (1991).