Entropy of a Probability Distribution

The entropy of a probability density function (PDF) $p(x)$ is defined as

$\displaystyle h(p) \triangleq \int_x p(x)\,\lg\!\left[\frac{1}{p(x)}\right]\,dx$

where lg denotes the base-2 logarithm. The entropy of $p(x)$ can be interpreted as the average number of bits needed to specify values of the random variable $x$ drawn according to $p(x)$:

$\displaystyle h(p) = \mathcal{E}_p\left\{\lg\!\left[\frac{1}{p(x)}\right]\right\}$
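As a quick numerical check of this definition, the following sketch approximates $h(p)$ for a unit-variance Gaussian PDF by a Riemann sum and compares the result with the known closed form $\frac{1}{2}\lg(2\pi e)$ bits. The grid limits and spacing are illustrative choices, not part of the original text.

    import numpy as np

    # Approximate h(p) = integral of p(x) * lg[1/p(x)] dx by a Riemann sum
    # for a unit-variance Gaussian PDF (grid choices are illustrative).
    x = np.linspace(-10.0, 10.0, 200001)        # fine grid covering the support
    dx = x[1] - x[0]
    p = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)  # N(0,1) density on the grid

    mask = p > 0                                # guard against lg(1/0) if p underflows
    h = np.sum(p[mask] * np.log2(1.0 / p[mask])) * dx

    print(h)                                # ~2.047 bits
    print(0.5 * np.log2(2 * np.pi * np.e))  # closed form: (1/2) lg(2*pi*e)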

The term $\lg[1/p(x)]$ can be viewed as the number of bits that should be assigned to the value $x$. (The most common values of $x$ should be assigned the fewest bits, while rare values can be assigned many bits.)
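To make this code-length reading concrete, here is a small sketch for a discrete distribution (the probabilities are hypothetical, chosen only for illustration): each symbol is charged $\lg[1/p(x)]$ bits, and the entropy is the average charge per symbol.

    from math import log2

    # Code-length reading of lg[1/p(x)] for a hypothetical discrete
    # distribution: common symbols get few bits, rare symbols many.
    p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

    bits = {sym: log2(1.0 / prob) for sym, prob in p.items()}
    print(bits)  # {'a': 1.0, 'b': 2.0, 'c': 3.0, 'd': 3.0}

    avg = sum(prob * log2(1.0 / prob) for prob in p.values())
    print(avg)   # 1.75 bits per symbol on average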

