Automatically identifying relevant variables for linear regression with the Lasso method: A methodological primer for its application with R and a performance contrast simulation with alternative selection strategies

Sebastian Scherr, Jing Zhou

Research output: Contribution to journal › Article › peer-review



The abundance of available digital big data has created new challenges in identifying relevant variables for regression models. One statistical problem that has gained relevance in the era of big data is high-dimensional statistical inference, where the number of variables greatly exceeds the number of observations. Typically, prediction errors in linear regression skyrocket as the number of included variables approaches the number of observations, and ordinary least squares (OLS) regression no longer works in a high-dimensional scenario. A feasible solution is regularized estimation, including the Least Absolute Shrinkage and Selection Operator (Lasso), which we introduce to communication scholars here. We cover the statistical background of this technique, which combines estimation and variable selection in a single step and thereby helps identify relevant variables for regression models in high-dimensional scenarios. We contrast the Lasso with two alternative strategies for selecting variables for regression models, namely a theory-based "subset selection" of variables and a nonselective "all in" strategy. The simulation shows that the Lasso produces lower and relatively more stable prediction errors than the two alternative variable selection strategies; we therefore recommend its use, especially in the high-dimensional settings typical of big data analysis.
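To make the technique concrete: the Lasso adds an L1 penalty to the least-squares objective, (1/2n)·||y − Xβ||² + λ·||β||₁, which shrinks coefficients and sets some exactly to zero, so estimation and variable selection happen simultaneously. The paper's primer uses R; as a hedged illustration, the sketch below implements the standard cyclic coordinate-descent algorithm for this objective in plain Python on synthetic data (all names, the penalty value λ = 0.3, and the data-generating setup are illustrative assumptions, not taken from the article).

```python
import random

def soft_threshold(z, t):
    # S(z, t) = sign(z) * max(|z| - t, 0): the shrinkage operator that
    # produces exact zeros and hence performs variable selection
    if z > t:
        return z - t
    if z < -t:
        return z + t
    return 0.0

def lasso_cd(X, y, lam, n_iter=200):
    # Cyclic coordinate descent for (1/(2n))||y - X b||^2 + lam * ||b||_1
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    resid = list(y)  # residuals with all coefficients at zero
    for _ in range(n_iter):
        for j in range(p):
            # correlation of column j with the partial residual
            # (residual with coordinate j's contribution added back)
            rho = sum(X[i][j] * (resid[i] + X[i][j] * beta[j])
                      for i in range(n)) / n
            denom = sum(X[i][j] ** 2 for i in range(n)) / n
            new_bj = soft_threshold(rho, lam) / denom
            if new_bj != beta[j]:
                # keep residuals consistent with the updated coefficient
                for i in range(n):
                    resid[i] += X[i][j] * (beta[j] - new_bj)
                beta[j] = new_bj
    return beta

# Illustrative high-signal example: 10 candidate variables, but only
# the first two actually enter the true model
random.seed(1)
n, p = 100, 10
true_beta = [3.0, -2.0] + [0.0] * (p - 2)
X = [[random.gauss(0, 1) for _ in range(p)] for _ in range(n)]
y = [sum(X[i][j] * true_beta[j] for j in range(p)) + random.gauss(0, 0.5)
     for i in range(n)]

beta_hat = lasso_cd(X, y, lam=0.3)
selected = [j for j, b in enumerate(beta_hat) if abs(b) > 1e-8]
print("selected variables:", selected)
```

Note the characteristic behavior: irrelevant coefficients are thresholded to exactly zero, while the retained coefficients are shrunk somewhat toward zero (the price paid for the selection property). In practice, R users would reach for an established implementation such as `glmnet` and choose λ by cross-validation rather than fixing it by hand.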
Original language: English
Pages (from-to): 204-211
Number of pages: 8
Journal: Communication Methods and Measures
Issue number: 3
Early online date: 28 Oct 2019
Publication status: Published - 2 Jul 2020
