

Newton's Method of Nonlinear Minimization

Newton's method [163],[167, p. 143] finds the minimum of a nonlinear (scalar) function of several variables by locally approximating the function by a quadratic surface, and then stepping to the bottom of that ``bowl'', which generally requires a matrix inversion. Newton's method therefore requires the function to be ``close to quadratic'', and its effectiveness is directly tied to the accuracy of that assumption. For smooth functions, Newton's method gives very rapid quadratic convergence in the last stages of iteration. Quadratic convergence implies, for example, that the number of significant digits in the minimizer approximately doubles each iteration.
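The digit-doubling behavior of quadratic convergence is easy to observe numerically. Below is a minimal one-dimensional sketch (the example cost function $ J(x)=e^x-2x$ , with minimizer $ x^\ast =\ln 2$ , is chosen here for illustration and is not from the text):

```python
import math

# Minimize J(x) = exp(x) - 2x, whose unique minimum is at x* = ln 2.
# Newton's update for minimization: x <- x - J'(x)/J''(x),
# here J'(x) = exp(x) - 2 and J''(x) = exp(x).
x_true = math.log(2.0)
x = 0.0                                  # initial guess
for n in range(6):
    x = x - (math.exp(x) - 2.0) / math.exp(x)
    print(f"iteration {n + 1}: error = {abs(x - x_true):.3e}")
# Once the iterate is near the minimum, the error roughly squares on
# each iteration -- the number of correct digits approximately doubles
# until machine precision is reached.
```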

Newton's method may be derived as follows: Suppose we wish to minimize the real, positive function $ J(\underline{x})$ with respect to $ \underline{x}$ . The vector Taylor expansion [548] of $ J(\underline{x})$ about $ \underline{\hat{x}}$ gives

$\displaystyle J(\underline{\hat{x}}^\ast ) = J(\underline{\hat{x}}) + J^\prime(\underline{\hat{x}}) \left(\underline{\hat{x}}^\ast -\underline{\hat{x}}\right) + \frac{1}{2} \left(\underline{\hat{x}}^\ast -\underline{\hat{x}}\right)^T J''\left(\lambda\underline{\hat{x}}^\ast +\overline{\lambda}\underline{\hat{x}}\right) \left(\underline{\hat{x}}^\ast -\underline{\hat{x}}\right),
$

for some $ 0\leq\lambda\leq 1$ , where $ \overline{\lambda}\isdef 1-\lambda$ . It is now necessary to assume that $ J''\left(\lambda\underline{\hat{x}}^\ast +\overline{\lambda}\underline{\hat{x}}\right)\approx J''(\underline{\hat{x}})$ . Differentiating with respect to $ \underline{\hat{x}}^\ast $ , and setting the result to zero since $ J(\underline{\hat{x}}^\ast )$ is presumed to be minimized, this becomes

$\displaystyle 0 = 0 + J^\prime(\underline{\hat{x}}) + J''(\underline{\hat{x}}) \left(\underline{\hat{x}}^\ast -\underline{\hat{x}}\right).
$

Solving for $ \underline{\hat{x}}^\ast $ yields

$\displaystyle \underline{\hat{x}}^\ast = \underline{\hat{x}}- [J''(\underline{\hat{x}})]^{-1} J^\prime(\underline{\hat{x}}).$ (8.13)

Applying Eq.$ \,$ (8.13) iteratively, we obtain Newton's method:

$\displaystyle \underline{\hat{x}}_{n+1} = \underline{\hat{x}}_n - [J''(\underline{\hat{x}}_n)]^{-1} J^\prime(\underline{\hat{x}}_n), \quad n=0,1,2,\ldots\,,$ (8.14)

where $ \underline{\hat{x}}_0$ is given as an initial condition.
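The iteration of Eq. (8.14) can be sketched in a few lines of code. The following is a minimal implementation, not from the text; the function and variable names, and the Rosenbrock-style example cost $ J(x_1,x_2)=(x_1-1)^2+10(x_2-x_1^2)^2$ with its hand-derived gradient and Hessian, are my own choices for illustration:

```python
import numpy as np

def newton_minimize(grad, hess, x0, n_iter=30, tol=1e-12):
    """Newton iteration x_{n+1} = x_n - [J''(x_n)]^{-1} J'(x_n)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        # Solve J''(x) step = J'(x) rather than forming the inverse explicitly.
        step = np.linalg.solve(hess(x), grad(x))
        x = x - step
        if np.linalg.norm(step) < tol:
            break
    return x

# Example cost: J(x) = (x1 - 1)^2 + 10*(x2 - x1^2)^2, minimized at (1, 1).
def grad(x):                      # J'(x), the gradient
    return np.array([2*(x[0] - 1) - 40*x[0]*(x[1] - x[0]**2),
                     20*(x[1] - x[0]**2)])

def hess(x):                      # J''(x), the Hessian matrix
    return np.array([[2 - 40*(x[1] - x[0]**2) + 80*x[0]**2, -40*x[0]],
                     [-40*x[0], 20.0]])

x_star = newton_minimize(grad, hess, x0=[0.5, 0.5])
```

In practice, solving the linear system $ J''(\underline{\hat{x}}_n)\,\underline{s} = J^\prime(\underline{\hat{x}}_n)$ for the step $ \underline{s}$ is both cheaper and numerically better conditioned than explicitly inverting the Hessian.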

When $ J(\underline{\hat{x}})$ is any quadratic form in $ \underline{\hat{x}}$ , then $ J''\left(\lambda\underline{\hat{x}}^\ast +\overline{\lambda}\underline{\hat{x}}\right)= J''(\underline{\hat{x}})$ , and Newton's method produces $ \underline{\hat{x}}^\ast $ in one iteration; that is, $ \underline{\hat{x}}_1=\underline{\hat{x}}^\ast $ for every $ \underline{\hat{x}}_0$ .
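One-step convergence on a quadratic is easy to verify numerically. In this sketch (the particular matrix $ A$ , vector $ b$ , and starting point are arbitrary choices, not from the text), a single Newton step from any starting point lands exactly on the minimizer:

```python
import numpy as np

# Quadratic cost J(x) = (1/2) x^T A x - b^T x, with A symmetric positive
# definite.  Then J'(x) = A x - b and J''(x) = A (constant), so one Newton
# step from any x0 gives x1 = x0 - A^{-1}(A x0 - b) = A^{-1} b, the minimizer.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])

x0 = np.array([10.0, -7.0])                 # arbitrary starting point
x1 = x0 - np.linalg.solve(A, A @ x0 - b)    # one Newton step

x_star = np.linalg.solve(A, b)              # direct minimizer, for comparison
```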




``Physical Audio Signal Processing'', by Julius O. Smith III, W3K Publishing, 2010, ISBN 978-0-9745607-2-4.
Copyright © 2014-06-11 by Julius O. Smith III
Center for Computer Research in Music and Acoustics (CCRMA),   Stanford University