A normalized gradient descent algorithm for nonlinear adaptive filters using a gradient adaptive step size

D. P. Mandic, A. I. Hanna, M. Razaz

Research output: Contribution to journal › Article › peer-review

48 Citations (Scopus)

Abstract

A fully adaptive normalized nonlinear gradient descent (FANNGD) algorithm for online adaptation of nonlinear neural filters is proposed. An adaptive step size that minimizes the instantaneous output error of the filter is derived using a linearization based on a Taylor series expansion of the output error. For rigor, the remainder of the truncated Taylor series expansion within the expression for the adaptive learning rate is made adaptive and is updated by gradient descent. The FANNGD algorithm is shown to converge faster than previously introduced algorithms of this kind.
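The abstract describes the update only at a high level, so a concrete sketch may help. The Python snippet below illustrates one plausible reading of a FANNGD-style filter: a single-neuron nonlinear FIR filter y(k) = tanh(x(k)^T w(k)), a normalized step size eta(k) = mu / (C(k) + phi'(k)^2 ||x(k)||^2), and a Taylor-remainder term C(k) adapted by gradient descent on the squared instantaneous output error. The tanh nonlinearity, the exact form of the C-gradient, and all names and parameters (fanngd_filter, mu, rho, c_init) are illustrative assumptions, not the paper's exact derivation.

```python
import numpy as np

def fanngd_filter(x, d, order=4, mu=0.3, rho=1e-3, c_init=1.0):
    """Sketch of a FANNGD-style nonlinear adaptive filter (assumptions above).

    Output: y(k) = tanh(x(k)^T w(k)).
    Step size: eta(k) = mu / (C(k) + phi'(k)^2 * ||x(k)||^2), where C(k)
    stands in for the truncated Taylor-series remainder and is itself
    adapted by gradient descent on the squared output error.
    """
    N = len(d)
    w = np.zeros(order)
    C = c_init
    y = np.zeros(N)
    e = np.zeros(N)
    prev = None  # quantities from step k-1 needed for the C-gradient
    for k in range(order, N):
        u = x[k - order:k][::-1]            # tap-input vector x(k)
        net = float(np.dot(w, u))
        y[k] = np.tanh(net)
        e[k] = d[k] - y[k]
        dphi = 1.0 - y[k] ** 2              # tanh'(net)
        denom = C + dphi ** 2 * float(np.dot(u, u))
        eta = mu / denom                    # normalized adaptive step size
        if prev is not None:
            # Assumed gradient of e(k)^2 w.r.t. C, via the chain rule
            # through the previous weight update
            # w(k) = w(k-1) + eta(k-1) * phi'(k-1) * e(k-1) * x(k-1),
            # with d eta(k-1)/dC = -mu / denom(k-1)^2; up to a factor of 2:
            # e(k) phi'(k) e(k-1) phi'(k-1) mu x(k)^T x(k-1) / denom(k-1)^2
            e1, dphi1, u1, denom1 = prev
            grad_C = (mu * e[k] * dphi * e1 * dphi1
                      * float(np.dot(u, u1)) / denom1 ** 2)
            C = max(C - rho * grad_C, 1e-6)  # keep the denominator positive
        prev = (e[k], dphi, u, denom)
        w = w + eta * dphi * e[k] * u       # nonlinear gradient-descent update
    return y, e

if __name__ == "__main__":
    # Toy nonlinear prediction task as a rough sanity check
    rng = np.random.default_rng(0)
    x = rng.standard_normal(2000)
    d = np.tanh(0.5 * np.roll(x, 1) - 0.3 * np.roll(x, 2))
    y, e = fanngd_filter(x, d)
    print("MSE over last 500 samples:", np.mean(e[-500:] ** 2))
```

The floor on C and the single-term gradient (tracking only one step of the dependence of w(k) on C) are simplifications for readability; the paper derives the exact adaptive remainder update, which may differ in form and sign conventions.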
Original language: English
Pages (from-to): 295-297
Number of pages: 3
Journal: IEEE Signal Processing Letters
Volume: 8
Issue number: 11
DOIs:
Publication status: Published - 2001
