Among probability distributions $p(x)$ which are nonzero over a finite range of values $[a,b]$, the maximum-entropy distribution is the uniform distribution. To show this, we must maximize the entropy,

$\displaystyle H(p) \isdef -\int_a^b p(x)\, \lg p(x)\, dx$   (D.33)

with respect to $p(x)$, subject to the constraints

$\displaystyle p(x) \geq 0$

$\displaystyle \int_a^b p(x)\,dx = 1.$
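As a numerical sketch (not from the text), the entropy integral can be approximated by a midpoint Riemann sum. For the uniform density $p(x) = 1/(b-a)$ on an arbitrarily chosen example interval $[0,4]$, the result should come out to $\lg(b-a) = \lg 4 = 2$ bits:

```python
import math

# Approximate H(p) = -∫_a^b p(x) lg p(x) dx by a midpoint Riemann sum.
# The interval [a, b] = [0, 4] is an arbitrary example choice.
a, b = 0.0, 4.0
n = 100_000                      # number of Riemann-sum panels
dx = (b - a) / n

def entropy(p):
    """Midpoint-rule approximation of -∫_a^b p(x) lg p(x) dx."""
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * dx   # panel midpoint
        px = p(x)
        if px > 0:               # 0·lg 0 is taken as 0
            total -= px * math.log2(px) * dx
    return total

uniform = lambda x: 1.0 / (b - a)
print(entropy(uniform))          # ≈ lg(b-a) = lg 4 = 2 bits
```

The base-2 logarithm matches the $\lg$ in (D.33), so the entropy is measured in bits.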
Using the method of Lagrange multipliers for optimization in the presence of constraints [86], we may form the objective function

$\displaystyle J(p) \isdef -\int_a^b p(x) \, \ln p(x) \,dx + \lambda_0\left(\int_a^b p(x)\,dx - 1\right)$   (D.34)

and differentiate with respect to $p(x)$ (and renormalize by dropping the $\lg(e)$ factor multiplying all terms) to obtain

$\displaystyle \frac{\partial}{\partial p(x)\,dx} J(p) = - \ln p(x) - 1 + \lambda_0.$   (D.35)
Setting this to zero and solving for $p(x)$ gives

$\displaystyle p(x) = e^{\lambda_0-1}.$   (D.36)

(Setting the partial derivative with respect to $\lambda_0$ to zero merely restates the normalization constraint.)
Choosing $\lambda_0$ to satisfy the normalization constraint gives $\lambda_0 = 1-\ln(b-a)$, yielding

$\displaystyle p(x) = \left\{\begin{array}{ll} \frac{1}{b-a}, & a\leq x \leq b \\ [5pt] 0, & \hbox{otherwise}. \\ \end{array} \right.$   (D.37)
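A quick sanity check (illustrative, with an assumed example interval): substituting $\lambda_0 = 1-\ln(b-a)$ into the stationary point $p(x) = e^{\lambda_0-1}$ of (D.36) should reduce to the constant $1/(b-a)$ and satisfy the normalization constraint:

```python
import math

# With λ0 = 1 - ln(b-a), the stationary point p(x) = e^{λ0 - 1}
# reduces to 1/(b-a), and ∫_a^b p(x) dx = p·(b-a) = 1.
a, b = 2.0, 7.0                  # arbitrary example interval
lam0 = 1.0 - math.log(b - a)     # λ0 chosen to satisfy the constraint
p = math.exp(lam0 - 1.0)         # the constant density of (D.36)
print(p, 1.0 / (b - a))          # both ≈ 0.2
print(p * (b - a))               # the integral over [a,b], ≈ 1
```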
That this solution is a maximum rather than a minimum or inflection point can be verified by ensuring the sign of the second partial derivative is negative for all $x$:

$\displaystyle \frac{\partial^2}{\partial p(x)^2\,dx} J(p) = - \frac{1}{p(x)}.$   (D.38)

Since the solution spontaneously satisfied $p(x)>0$, it is a maximum.
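The conclusion can also be illustrated numerically (a sketch, not from the text): any valid non-uniform density on $[a,b]$ should have strictly lower entropy than the uniform density. Here the uniform density on $[0,1]$ (entropy $\lg 1 = 0$ bits) is compared with an assumed example density $p(x) = 2x$, which also integrates to 1 on $[0,1]$:

```python
import math

# Compare the entropy of the uniform density on [0,1] with that of
# the tilted density p(x) = 2x (also normalized on [0,1]), using a
# midpoint Riemann sum for -∫ p(x) lg p(x) dx.
a, b, n = 0.0, 1.0, 200_000
dx = (b - a) / n

def entropy(p):
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * dx   # panel midpoint
        px = p(x)
        if px > 0:               # 0·lg 0 is taken as 0
            total -= px * math.log2(px) * dx
    return total

h_uniform = entropy(lambda x: 1.0)      # lg(b-a) = lg 1 = 0 bits
h_tilted  = entropy(lambda x: 2.0 * x)  # valid density: ∫ 2x dx = 1
print(h_uniform, h_tilted)
assert h_tilted < h_uniform             # the uniform density wins
```

The tilted density concentrates probability near $x=1$, so its (differential) entropy is negative while the uniform density's is zero, consistent with the uniform distribution being the maximizer.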