Efficient least angle regression for identification of linear-in-the-parameters models

Wanqing Zhao, Thomas H. Beach, Yacine Rezgui

Research output: Contribution to journal › Article › peer-review


Abstract

Least angle regression, a promising model selection method, differentiates itself from conventional stepwise and stagewise methods in that it is neither too greedy nor too slow. It is closely related to L1-norm optimization, which achieves low prediction variance by sacrificing some model bias in order to enhance generalization capability. In this paper, we propose an efficient least angle regression algorithm for model selection for a large class of linear-in-the-parameters models, with the aim of accelerating the model selection process. The entire algorithm works in a fully recursive manner: the correlations between model terms and residuals, the evolving directions and other pertinent variables are derived explicitly and updated successively at every subset selection step. The model coefficients are computed only when the algorithm finishes, so that direct matrix inversions are avoided. A detailed computational complexity analysis indicates that the proposed algorithm offers significant computational savings compared with the original approach, in which the well-known efficient Cholesky decomposition is used to solve least angle regression. Three artificial and real-world examples are employed to demonstrate the effectiveness, efficiency and numerical stability of the proposed algorithm.
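For readers wanting a concrete starting point, the sketch below illustrates the general setting on a linear-in-the-parameters model: a dictionary of candidate nonlinear terms is built from the input data and least angle regression selects a sparse subset of them. It uses the standard LARS implementation in scikit-learn rather than the recursive formulation proposed in the paper; the polynomial term dictionary, the synthetic data and all parameter values are illustrative assumptions.

```python
# Illustrative sketch (not the authors' recursive algorithm): build a
# linear-in-the-parameters model from polynomial candidate terms and let
# standard least angle regression (LARS) pick a sparse subset of them.
import numpy as np
from sklearn.linear_model import Lars
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)

# Synthetic data: two inputs, output generated by a few nonlinear terms.
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = (1.5 * X[:, 0] - 2.0 * X[:, 1] ** 2 + 0.5 * X[:, 0] * X[:, 1]
     + 0.05 * rng.standard_normal(200))

# Candidate term dictionary: all monomials up to degree 3 (assumed basis).
dictionary = PolynomialFeatures(degree=3, include_bias=False)
Phi = dictionary.fit_transform(X)

# LARS with a cap on the number of selected terms (subset selection steps).
model = Lars(n_nonzero_coefs=5)
model.fit(Phi, y)

# Report which candidate terms were selected and their coefficients.
selected = np.flatnonzero(model.coef_)
names = dictionary.get_feature_names_out(["x1", "x2"])
for idx in selected:
    print(f"{names[idx]:>10s}  coef = {model.coef_[idx]: .4f}")
```

The selected terms should closely match those used to generate the data, which is the kind of subset selection the paper's algorithm performs recursively and with lower computational cost.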
Original language: English
Article number: 20160775
Journal: Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences
Volume: 473
Issue number: 2198
Early online date: 1 Feb 2017
DOIs
Publication status: Published - 28 Feb 2017
