Taylor and Francis Group, Journal of Computational and Graphical Statistics, 20(4), pp. 972–987
We show that the homotopy algorithm of Osborne, Presnell, and Turlach (2000), which has proved an effective path-following method for implementing Tibshirani’s “lasso” for variable selection in least squares estimation problems, can be extended to polyhedral objectives such as the quantile regression lasso. The new algorithm introduces the novel feature that it requires two homotopy sequences to be performed consecutively, with continuation steps taken with respect to both the constraint bound and the Lagrange multiplier. Performance is illustrated by application to several standard datasets, and the results are compared with calculations made using the original lasso homotopy program. This permits an assessment of computational complexity both for the new method and for the closely related linear programming post-optimality procedures, as these generate essentially identical solution trajectories. The comparison strongly favors the least squares selection method. However, the new method still provides an effective computational procedure, and it has distinct implementation advantages over the linear programming approaches to the polyhedral objective problem. The source of the computational difficulty is explained, and the problem that must be resolved to improve performance is identified. An online supplement to the article contains proofs and R code implementing the algorithm.
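As background to the abstract above, the following minimal sketch (not the authors’ homotopy algorithm, and all names here are illustrative) shows what makes the quantile regression objective polyhedral: the Koenker–Bassett “check” (pinball) loss is piecewise linear, so for a constant fit the objective is a piecewise-linear function of the fit and its minimizer can be taken at a data point, recovering a sample quantile.

```python
def check_loss(u, tau):
    """Check function rho_tau(u) = u * (tau - 1{u < 0}); piecewise linear in u."""
    return u * (tau - (1.0 if u < 0 else 0.0))

def quantile_objective(c, y, tau):
    """Polyhedral quantile regression objective for a constant fit c:
    a sum of piecewise-linear terms, hence itself piecewise linear in c."""
    return sum(check_loss(yi - c, tau) for yi in y)

def argmin_over_data(y, tau):
    """A minimizer lies at a vertex of the polyhedral objective, i.e. at a
    data point, so a search over the observations suffices for this sketch."""
    return min(y, key=lambda c: quantile_objective(c, y, tau))

y = [1.0, 2.0, 3.0, 4.0, 100.0]
print(argmin_over_data(y, 0.5))   # -> 3.0, the sample median
print(argmin_over_data(y, 0.25))  # -> 2.0, the lower sample quartile
```

Because the objective is piecewise linear rather than quadratic, the lasso solution path consists of linear segments between vertices, which is why the paper’s algorithm tracks it by continuation in both the constraint bound and the Lagrange multiplier rather than by the single least squares homotopy.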