L-Infinity Norm of Derivative Objective

We can add a smoothness objective by adding the $L_\infty$-norm of the derivative to the objective function:

$\displaystyle \mathrm{minimize}\quad \delta +\eta \left\Vert \Delta h\right\Vert _{\infty }.$ (4.79)

The $L_\infty$-norm penalizes only the maximum derivative. A large $\eta$ puts more weight on smoothness than on side-lobe level.
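As a quick numerical illustration (the sequence `h` below is an arbitrary example, not from the text), the Chebyshev norm of the first difference reduces to a max over absolute differences:

```python
import numpy as np

# The L-infinity (Chebyshev) norm of the derivative "sees" only the
# largest first difference; h here is an arbitrary example sequence.
h = np.array([0.0, 0.3, 0.5, 1.0, 0.6])
dh = np.diff(h)                 # Delta h_i = h_{i+1} - h_i
linf = np.max(np.abs(dh))       # ||Delta h||_inf
print(linf)                     # -> 0.5 (the jump from 0.5 to 1.0)
```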

This can be formulated as a linear program (LP) by adding one optimization parameter $\sigma$ that bounds all of the derivatives:

$\displaystyle -\sigma \leq \Delta h_{i}\leq \sigma \qquad i=1,\ldots ,L-1.$ (4.80)

In matrix form,

$\displaystyle \left[\begin{array}{r} -\mathbf{D}\\ \mathbf{D}\end{array}\right]h-\sigma \mathbf{1} \le \mathbf{0}.$

The objective function becomes

$\displaystyle \mathrm{minimize}\quad \delta +\eta \sigma .$ (4.81)
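The augmented LP of Eqs. (4.79)-(4.81) can be sketched with `scipy.optimize.linprog`. As a stand-in for the book's frequency-domain side-lobe constraints, this toy version lets $\delta$ bound a Chebyshev fit error to a hypothetical target shape `t` (an assumption for illustration only); the $\sigma$ rows implement Eq. (4.80) exactly:

```python
import numpy as np
from scipy.optimize import linprog

# Minimize delta + eta*sigma (Eq. 4.81), where sigma bounds every first
# difference of h (Eq. 4.80). The fit-error constraints below are a toy
# stand-in for the side-lobe constraints of the actual window-design LP.
L = 16
eta = 1.0
t = np.hanning(L)              # hypothetical target shape (illustration only)

n = L + 2                      # variables: [h_0..h_{L-1}, delta, sigma]
c = np.zeros(n)
c[L] = 1.0                     # coefficient of delta
c[L + 1] = eta                 # coefficient of sigma

I = np.eye(L)
D = np.diff(np.eye(L), axis=0)         # (L-1) x L first-difference matrix

# |h - t| <= delta  (stand-in for the side-lobe-level constraints)
A_fit = np.block([[ I, -np.ones((L, 1)), np.zeros((L, 1))],
                  [-I, -np.ones((L, 1)), np.zeros((L, 1))]])
b_fit = np.concatenate([t, -t])

# -sigma <= (D h)_i <= sigma  (Eq. 4.80, i.e. [D; -D] h - sigma*1 <= 0)
A_sm = np.block([[ D, np.zeros((L - 1, 1)), -np.ones((L - 1, 1))],
                 [-D, np.zeros((L - 1, 1)), -np.ones((L - 1, 1))]])
b_sm = np.zeros(2 * (L - 1))

res = linprog(c,
              A_ub=np.vstack([A_fit, A_sm]),
              b_ub=np.concatenate([b_fit, b_sm]),
              bounds=[(None, None)] * L + [(0, None), (0, None)])
h = res.x[:L]
sigma = res.x[L + 1]
```

Raising `eta` tilts the trade-off toward smoothness: the optimizer accepts a larger fit error $\delta$ in exchange for a smaller maximum first difference $\sigma$, which is exactly the behavior illustrated in the two figures below.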

The result of adding the Chebyshev norm of diff(h) to the objective function with $\eta = 1$ is shown in Fig.3.39. The result of increasing $\eta$ to 20 is shown in Fig.3.40.

Figure 3.39: Chebyshev norm of diff(h) added to the objective function to be minimized ($\eta = 1$).

Figure 3.40: Twenty times the Chebyshev norm of diff(h) added to the objective function to be minimized ($\eta = 20$).



"Spectral Audio Signal Processing", by Julius O. Smith III, W3K Publishing, 2011, ISBN 978-0-9745607-3-1.
Copyright © 2022-02-28 by Julius O. Smith III
Center for Computer Research in Music and Acoustics (CCRMA),   Stanford University