A normalized gradient descent algorithm for nonlinear adaptive filters using a gradient adaptive step size

D. P. Mandic, A. I. Hanna, M. Razaz

Research output: Contribution to journal › Article › peer-review

47 Citations (Scopus)


A fully adaptive normalized nonlinear gradient descent (FANNGD) algorithm for online adaptation of nonlinear neural filters is proposed. An adaptive step size that minimizes the instantaneous output error of the filter is derived using a linearization performed by a Taylor series expansion of the output error. For rigor, the remainder of the truncated Taylor series expansion within the expression for the adaptive learning rate is made adaptive and is updated by gradient descent. The FANNGD algorithm is shown to converge faster than previously introduced algorithms of this kind.
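The abstract describes a normalized gradient descent update for a nonlinear neural filter in which the regularization term in the learning-rate denominator (arising from the Taylor-series remainder) is itself adapted by gradient descent. The following is a minimal illustrative sketch of that idea for a single-neuron filter with an assumed tanh nonlinearity; the variable names, the form of the adaptive remainder update (`rho`, `eps`), and the synthetic setup are assumptions for illustration and are not taken verbatim from the paper.

```python
import numpy as np

def phi(v):
    # Assumed filter nonlinearity (tanh); the paper's filter is a
    # general nonlinear neural filter.
    return np.tanh(v)

def phi_prime(v):
    # Derivative of the assumed nonlinearity.
    return 1.0 - np.tanh(v) ** 2

def fanngd(x, d, order=4, rho=0.05, eps=1.0):
    """Sketch of a fully adaptive normalized nonlinear gradient
    descent filter: the step size is normalized by the input power,
    and the remainder term `eps` in the denominator is adapted online
    (illustrative update rule, not the paper's exact expression)."""
    w = np.zeros(order)
    errors = []
    for k in range(order, len(x)):
        u = x[k - order:k][::-1]        # tap-input (regressor) vector
        net = w @ u
        y = phi(net)
        e = d[k] - y                    # instantaneous output error
        g = phi_prime(net)
        denom = eps + (g ** 2) * (u @ u)
        eta = 1.0 / denom               # normalized adaptive step size
        w = w + eta * g * e * u         # gradient-descent weight update
        # Hypothetical gradient-descent adaptation of the remainder
        # term: shrinking eps as the error falls speeds convergence.
        eps = max(1e-6, eps - rho * (e ** 2) / denom)
        errors.append(e)
    return w, np.array(errors)
```

In this sketch the normalization by the input power keeps the effective step size bounded, while adapting `eps` lets the algorithm trade stability early in adaptation for speed later, which is the qualitative behaviour the abstract attributes to FANNGD.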
Original language: English
Pages (from-to): 295-297
Number of pages: 3
Journal: IEEE Signal Processing Letters
Issue number: 11
Publication status: Published - 2001