Abstract
A fully adaptive normalized nonlinear gradient descent (FANNGD) algorithm for the online adaptation of nonlinear neural filters is proposed. An adaptive step size that minimizes the instantaneous output error of the filter is derived using a linearization based on a Taylor series expansion of the output error. For rigor, the remainder of the truncated Taylor series expansion within the expression for the adaptive learning rate is itself made adaptive and is updated by gradient descent. The FANNGD algorithm is shown to converge faster than previously introduced algorithms of this kind.
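The abstract outlines the structure of the update: a normalized nonlinear gradient descent step whose normalization term (the Taylor-remainder term, denoted C below) is itself adapted by gradient descent. The following is a minimal sketch of such an update for a single-neuron filter, assuming a tanh nonlinearity; the variable names (mu, rho, C) and the exact form of the remainder update are illustrative assumptions modeled on related normalized gradient descent algorithms, not expressions taken from the paper.

```python
import numpy as np

def fanngd(x, d, order=4, mu=0.5, rho=1e-3, C0=1.0):
    """Sketch of a FANNGD-style online update (assumed form, not the
    paper's exact derivation). x: input signal, d: desired signal."""
    N = len(x)
    w = np.zeros(order)         # filter weights
    C = C0                      # adaptive Taylor-remainder term
    y = np.zeros(N)
    e = np.zeros(N)
    prev = None                 # quantities from step k-1 for the C update
    for k in range(order, N):
        u = x[k - order:k][::-1]        # current input vector
        net = w @ u
        y[k] = np.tanh(net)             # assumed nonlinearity
        e[k] = d[k] - y[k]
        g = 1.0 - y[k] ** 2             # tanh'(net)
        denom = C + (g ** 2) * (u @ u)  # normalization with remainder C
        eta = mu / denom                # adaptive step size
        w = w + eta * e[k] * g * u      # normalized nonlinear GD step
        # Gradient-descent update of C using first-order information
        # from the previous step (illustrative form of the adaptive
        # remainder; the paper derives the exact expression).
        if prev is not None:
            u1, g1, e1, denom1 = prev
            C = C - rho * mu * e[k] * e1 * g * g1 * (u @ u1) / denom1 ** 2
        prev = (u, g, e[k], denom)
    return y, e, w
```

Keeping C in the denominator and adapting it online is what makes the normalization "fully adaptive": a fixed regularization constant trades off stability against convergence speed, whereas a gradient-adapted term can shrink as the filter converges.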
| Original language | English |
| --- | --- |
| Pages (from-to) | 295–297 |
| Number of pages | 3 |
| Journal | IEEE Signal Processing Letters |
| Volume | 8 |
| Issue number | 11 |
| DOIs | |
| Publication status | Published - 2001 |