The entropy of a probability density function (PDF) $p(x)$ is
defined as [48]

$\displaystyle h(p) \isdef \int_x p(x) \cdot \lg\left[\frac{1}{p(x)}\right] dx$   (D.29)
where $\lg$ denotes the logarithm base 2. The entropy of $p$ can be
interpreted as the average number of bits needed to specify random
variables $x$ drawn at random according to $p(x)$:

$\displaystyle h(p) = {\cal E}_p\left\{\lg\left[\frac{1}{p(x)}\right]\right\}$   (D.30)
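The expectation form (D.30) suggests a direct numerical check: draw samples from $p$ and average $\lg[1/p(x)]$ over them. The sketch below (an illustration, not from the original text) does this for a standard Gaussian, whose differential entropy has the known closed form $\frac{1}{2}\lg(2\pi e\sigma^2)$ bits; the PDF and sampler names are chosen here for the example.

```python
import math
import random

def gaussian_pdf(x, sigma=1.0):
    """PDF of a zero-mean Gaussian with standard deviation sigma."""
    return math.exp(-x * x / (2.0 * sigma * sigma)) / (sigma * math.sqrt(2.0 * math.pi))

def entropy_mc(pdf, sampler, n=200_000, seed=0):
    """Monte Carlo estimate of h(p) = E_p{ lg[1/p(x)] } in bits (Eq. D.30)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = sampler(rng)
        total += math.log2(1.0 / pdf(x))
    return total / n

# Closed form for N(0, sigma^2): (1/2) lg(2*pi*e*sigma^2) bits; here sigma = 1.
h_exact = 0.5 * math.log2(2.0 * math.pi * math.e)
h_est = entropy_mc(gaussian_pdf, lambda rng: rng.gauss(0.0, 1.0))
```

With 200,000 samples the Monte Carlo average lands within a few thousandths of a bit of the exact value (about 2.047 bits for $\sigma = 1$).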
The term $\lg[1/p(x)]$ can be viewed as the number of bits that should
be assigned to the value $x$. (The most common values of $x$ should be
assigned the fewest bits, while rare values can be assigned many
bits.)
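The bits-per-value interpretation is easiest to see in the discrete analog of (D.29), where $\lg[1/p(x)]$ is an ideal code length. A small sketch (the three-symbol distribution is an invented example, not from the text):

```python
import math

# A toy discrete distribution: "A" is the most common symbol.
probs = {"A": 0.5, "B": 0.25, "C": 0.25}

# lg[1/p(x)] gives the ideal code length in bits for each value:
# the common symbol "A" gets 1 bit, the rarer "B" and "C" get 2 bits each.
code_bits = {sym: math.log2(1.0 / p) for sym, p in probs.items()}

# The entropy is the average of these code lengths under p:
# H = 0.5*1 + 0.25*2 + 0.25*2 = 1.5 bits per symbol.
H = sum(p * code_bits[sym] for sym, p in probs.items())
```

These lengths are exactly achievable by a prefix code here (e.g. A=0, B=10, C=11), so the entropy of 1.5 bits is also the average length of an optimal code for this source.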