
Computation of Linear Prediction Coefficients

In the autocorrelation method of linear prediction, the linear prediction coefficients $\{a_i\}_{i=1}^M$ are computed from the Bartlett-window-biased autocorrelation function (Chapter 6):

$$r_{y_m}(l) \triangleq \sum_{n=-\infty}^{\infty} y_m(n)\,y_m(n+l) = \mathrm{DFT}^{-1}\left\{\left|Y_m\right|^2\right\} \qquad\qquad (11.11)$$

where $y_m$ denotes the $m$th data frame from the signal $y$. To obtain the $M$th-order linear predictor coefficients $\{a_1,\ldots,a_M\}$, we solve the following $M\times M$ system of linear normal equations (also called the Yule-Walker or Wiener-Hopf equations):

$$\sum_{i=1}^{M} a_i\, r_{y_m}(|i-j|) = -r_{y_m}(j), \qquad j=1,2,\ldots,M \qquad\qquad (11.12)$$

In matlab syntax, the solution is given by ``a = R\p'', where $\texttt{p(j)} = -r_{y_m}(j)$ and $\texttt{R(i,j)} = r_{y_m}(|i-j|)$. Since the autocorrelation matrix $R$ is symmetric and Toeplitz by construction, an $O(M^2)$ solution exists using the Durbin recursion.
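As a concrete illustration, the following matlab sketch carries out the computation above on a single frame; the frame length, order, and variable names are arbitrary choices (not from the text), and the function levinson (Signal Processing Toolbox, or the Octave signal package) is used only to cross-check the direct solution:

  M  = 10;                        % prediction order (arbitrary)
  N  = 256;                       % frame length (arbitrary)
  ym = randn(N,1);                % one data frame y_m (placeholder signal)

  % Bartlett-window-biased sample autocorrelation, Eq. (11.11),
  % computed in the time domain for lags 0..M:
  r = zeros(M+1,1);
  for l = 0:M
    r(l+1) = sum(ym(1:N-l) .* ym(1+l:N));
  end

  % Normal equations (11.12): R a = p, with R(i,j) = r(|i-j|), p(j) = -r(j):
  R = toeplitz(r(1:M));           % M x M symmetric Toeplitz matrix
  p = -r(2:M+1);
  a = R \ p;                      % predictor coefficients a_1,...,a_M

  % Cross-check via the O(M^2) Durbin recursion:
  A = levinson(r, M);             % returns [1, a_1, ..., a_M]
  max(abs(a - A(2:end).'))        % should be near machine precision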

If the rank of the $M\times M$ autocorrelation matrix $R[i,j]=r_{y_m}(|i-j|)$ is $M$, then the solution to (11.12) is unique, and this solution is always minimum phase [162] (i.e., all roots of $A(z)$ are inside the unit circle in the $z$ plane [263], so that $1/A(z)$ is always a stable all-pole filter). In practice, the rank of $R$ is $M$ (with probability 1) whenever $y(n)$ includes a noise component. In the noiseless case, if $y(n)$ is a sum of sinusoids, each (real) sinusoid at a distinct frequency $0<\omega_i T<\pi$ adds 2 to the rank. A dc component, or a component at half the sampling rate, adds 1 to the rank of $R$.
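The rank statement can be checked numerically. The short sketch below (illustrative values only, not from the text) builds $R$ from the ideal autocorrelation $r(l)=\frac{1}{2}\cos(\omega T\, l)$ of a unit-amplitude real sinusoid, rather than from a finite-frame estimate, so that the low-rank structure is exact:

  M  = 6;  wT = 2*pi*0.1;         % order and frequency (arbitrary, 0 < wT < pi)
  l  = (0:M-1).';
  r  = 0.5*cos(wT*l);             % ideal autocorrelation of cos(wT*n)
  rank(toeplitz(r))               % -> 2 (one real sinusoid)
  rank(toeplitz(r + 0.25))        % -> 3 (after adding a dc component of amplitude 0.5)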

The choice of time window for forming a short-time sample autocorrelation, and its weighting, also affect the rank of $R$. Equation (11.11) applied to a finite-duration frame yields what is called the autocorrelation method of linear prediction [162]. Dividing out the Bartlett-window bias in such a sample autocorrelation yields a result closer to that of the covariance method of LP. A matlab example is given in §11.3.3 below.
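As a rough sketch of dividing out the Bartlett-window bias (reusing the assumed ym, N, and M from the sketch above), each lag can be normalized by the number of terms actually summed:

  rb = zeros(M+1,1);
  for l = 0:M
    rb(l+1) = sum(ym(1:N-l) .* ym(1+l:N));   % autocorrelation-method sum (rectangular window)
  end
  ru = rb ./ (N - (0:M).');                  % divide out the Bartlett-window bias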

The classic covariance method computes an unbiased sample covariance matrix by limiting the summation in (11.11) to a range over which $y_m(n+l)$ stays within the frame--a so-called ``unwindowed'' method. The autocorrelation method sums over the whole frame, replacing $y_m(n+l)$ by zero whenever $n+l$ points outside the frame--a so-called ``windowed'' method (windowed by the rectangular window).
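For comparison, one common formulation of the covariance-method normal equations is $\sum_{i=1}^M a_i\,\phi(i,j) = -\phi(0,j)$, $j=1,\ldots,M$, where $\phi(i,j)$ sums only over time indices for which every needed sample lies inside the frame. A minimal sketch (again reusing the assumed ym, N, and M from above) is:

  % phi(i,j) = sum_{n=M}^{N-1} ym(n-i) ym(n-j),  i,j = 0..M
  % (0-based indices written out for matlab's 1-based vectors):
  phi = zeros(M+1, M+1);
  for i = 0:M
    for j = 0:M
      phi(i+1, j+1) = sum(ym(M+1-i:N-i) .* ym(M+1-j:N-j));
    end
  end
  a_cov = phi(2:end, 2:end) \ (-phi(2:end, 1));  % covariance-method predictor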


