
Entropy of a Probability Distribution

The entropy of a probability density function (PDF) $ p(x)$ is defined as [48]

$\displaystyle \zbox {h(p) \isdef \int_x p(x) \cdot \lg\left[\frac{1}{p(x)}\right] dx}$ (D.29)

where $\lg$ denotes the base-2 logarithm. The entropy of $p(x)$ can be interpreted as the average number of bits needed to specify values of $x$ drawn at random according to $p(x)$:

$\displaystyle h(p) = {\cal E}_p\left\{\lg \left[\frac{1}{p(x)}\right]\right\}$ (D.30)

The term $\lg[1/p(x)]$ can be viewed as the number of bits that should be assigned to the value $x$. (The most common values of $x$ should be assigned the fewest bits, while rare values can be assigned many bits.)
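For a discrete distribution, the integral above becomes a sum, and the entropy can be computed directly. The following sketch (the function name `entropy_bits` is an illustrative choice, not from the text) evaluates $\sum_i p_i \lg(1/p_i)$ and confirms the bit-assignment interpretation: a uniform distribution over 8 values needs $\lg 8 = 3$ bits per value, while a skewed distribution needs fewer bits on average.

```python
import math

def entropy_bits(p):
    """Average number of bits needed to specify a value drawn from
    the discrete distribution p (probabilities summing to 1).
    Zero-probability entries contribute nothing, by convention."""
    return sum(pi * math.log2(1.0 / pi) for pi in p if pi > 0)

# Uniform distribution over 8 equally likely values: each value needs
# lg(8) = 3 bits, so the entropy is exactly 3 bits.
print(entropy_bits([1/8] * 8))  # → 3.0

# A skewed distribution needs fewer bits on average: the common value
# 0.5 gets a 1-bit code, the rare values get 2- and 3-bit codes.
print(entropy_bits([0.5, 0.25, 0.125, 0.125]))  # → 1.75
```

Note that the dyadic probabilities in the second example (powers of $1/2$) make the entropy exactly achievable by a prefix code, with each value assigned $\lg[1/p_i]$ whole bits.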




``Spectral Audio Signal Processing'', by Julius O. Smith III, W3K Publishing, 2011, ISBN 978-0-9745607-3-1.
Copyright © 2022-02-28 by Julius O. Smith III
Center for Computer Research in Music and Acoustics (CCRMA),   Stanford University