L-Infinity Norm of Derivative Objective

We can add a smoothness objective by including the $ L_{\infty }$ norm of the first-order difference of $ h$ in the objective function, weighted by $ \eta$:

$\displaystyle \mathrm{minimize}\quad \delta +\eta \left\Vert \Delta h\right\Vert _{\infty }.
$

Let $ \sigma\mathrel{\stackrel{\mathrm{\Delta}}{=}}\left\Vert \Delta h\right\Vert _{\infty }$ and set up the inequality constraints

$\displaystyle -\sigma \leq \Delta h_{i}\leq \sigma \qquad i=1,\ldots ,L-1.
$

In matrix form, writing $ \Delta h=\mathbf{D}h$ with $ \mathbf{D}$ the $ (L-1)\times L$ first-order difference matrix,

$\displaystyle \left[\begin{array}{c}
-\mathbf{D}\\
\mathbf{D}\end{array}\right]h-\sigma \mathbf{1}\;\le\;0.
$
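The stacked constraints above can be sketched numerically as follows (a minimal numpy illustration; the window length $ L$ and the test window $ h$ are assumptions for demonstration, not part of the original LP):

```python
import numpy as np

L = 8                              # hypothetical window length
# First-order difference operator: (D @ h)[i] = h[i+1] - h[i]
D = np.diff(np.eye(L), axis=0)     # shape (L-1, L)

h = np.hanning(L)                  # any window, just to exercise the constraints
sigma = np.max(np.abs(D @ h))      # tightest feasible sigma: ||diff(h)||_inf

# Stacked form [ -D ; D ] h - sigma * 1 <= 0 encodes -sigma <= (Dh)_i <= sigma
A = np.vstack([-D, D])
assert np.all(A @ h - sigma <= 1e-12)
```

Any smaller $ \sigma$ would violate at least one row of the stacked inequality, which is why minimizing $ \sigma$ recovers the $ L_\infty$ norm.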

The objective function is then

$\displaystyle \mathrm{minimize}\quad \delta +\eta \sigma,
$

which is linear in the unknowns. Since $ \sigma$ is being minimized, $ \sigma =\left\Vert \Delta h\right\Vert _{\infty }$ at the optimum, as desired.
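Putting the pieces together, here is a hedged end-to-end sketch using `scipy.optimize.linprog`. Only the $ \delta +\eta \sigma$ objective and the $ \pm \sigma$ difference constraints come from this section; the window length, stopband grid, linear-phase (symmetric) assumption, and unity-DC-gain constraint are illustrative assumptions standing in for the original LP's frequency-domain constraints:

```python
import numpy as np
from scipy.optimize import linprog

L = 21                                     # assumed window length (odd, linear phase)
n = np.arange(L) - (L - 1) / 2
ws = np.linspace(0.3 * np.pi, np.pi, 200)  # assumed stopband frequency grid
C = np.cos(np.outer(ws, n))                # real zero-phase response: A(w) = C @ h

eta = 1.0                                  # smoothness weight
# Variable vector x = [h (L entries), delta, sigma]
c = np.concatenate([np.zeros(L), [1.0, eta]])   # minimize delta + eta * sigma

D = np.diff(np.eye(L), axis=0)             # (L-1) x L first-difference matrix
# Inequalities A_ub @ x <= 0:
#    C h - delta <= 0,  -C h - delta <= 0   (|A(w)| <= delta in the stopband)
#    D h - sigma <= 0,  -D h - sigma <= 0   (|Delta h_i| <= sigma)
A_ub = np.block([
    [ C, -np.ones((len(ws), 1)), np.zeros((len(ws), 1))],
    [-C, -np.ones((len(ws), 1)), np.zeros((len(ws), 1))],
    [ D,  np.zeros((L - 1, 1)), -np.ones((L - 1, 1))],
    [-D,  np.zeros((L - 1, 1)), -np.ones((L - 1, 1))],
])
b_ub = np.zeros(A_ub.shape[0])
# Equality: unity gain at DC, A(0) = sum(h) = 1 (an assumed normalization)
A_eq = np.concatenate([np.ones(L), [0.0, 0.0]])[None, :]
b_eq = [1.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(None, None)] * L + [(0, None), (0, None)])
h = res.x[:L]
delta, sigma = res.x[L], res.x[L + 1]
```

Increasing `eta` trades stopband rejection ($ \delta$) for a smoother window (smaller $ \sigma$), which is the trade-off illustrated in the figures below.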

Chebyshev norm of diff(h) added to the objective function to be minimized ($ \eta=1$ ):

[Figure: eps/print_linf_chebwin_1.eps]

Twenty times the Chebyshev norm of diff(h) added to the objective function to be minimized ($ \eta=20$ ):

[Figure: eps/print_linf_chebwin_2.eps]


``Optimal Window Design by Linear Programming'', by Tatsuki Kashitani (Music 421 Presentation).
Copyright © 2020-06-27 by Tatsuki Kashitani
Center for Computer Research in Music and Acoustics (CCRMA),   Stanford University