Abstract
Multinomial logistic regression provides the standard penalised maximum likelihood
solution to multi-class pattern recognition problems. More recently, sparse
multinomial logistic regression models have found application in text processing
and microarray classification, where explicit identification of the most
informative features is of value. In this paper, we propose a
sparse multinomial logistic regression method, in which the sparsity arises from
the use of a Laplace prior, but where the usual regularisation parameter is integrated
out analytically. Evaluation over a range of benchmark datasets shows that this
approach achieves generalisation performance similar to that obtained using
cross-validation, but at greatly reduced computational expense.
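To make the key idea concrete, the following is a minimal sketch of how a regularisation parameter can be marginalised away analytically under a Laplace prior. The choice of an improper Jeffreys hyperprior $p(\lambda) \propto 1/\lambda$ is an illustrative assumption; the exact hyperprior and multinomial likelihood used in the paper may differ.

```latex
% Laplace prior over the d model weights, with scale (regularisation) parameter \lambda:
%   p(w \mid \lambda) = (\lambda/2)^d \exp(-\lambda \|w\|_1)
% Assuming an improper Jeffreys hyperprior p(\lambda) \propto 1/\lambda (illustrative choice),
% \lambda can be integrated out in closed form via the Gamma integral:
\begin{align*}
p(w) &= \int_0^\infty p(w \mid \lambda)\, p(\lambda)\, d\lambda
      \;\propto\; \int_0^\infty \lambda^{d-1} \exp\!\left(-\lambda \|w\|_1\right) d\lambda
      \;=\; \frac{\Gamma(d)}{\|w\|_1^{d}}, \\[4pt]
-\log p(w) &= d \log \|w\|_1 + \text{const}.
\end{align*}
% The resulting penalty term contains no free regularisation parameter,
% removing the need to tune one, e.g. by cross-validation.
```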
Field | Value
---|---
Original language | English
Title of host publication | Advances in Neural Information Processing Systems
Editors | Bernhard Schölkopf, John Platt, Thomas Hofmann
Publisher | MIT Press
Pages | 209-216
Number of pages | 8
Volume | 19
ISBN (Print) | 9780262195683
Publication status | Published - 2007