The Gaussian Hare and the Laplacian Tortoise:
Computability of squared-error vs. absolute-error estimators
Stephen Portnoy and Roger Koenker
Since Gauss (1821), it has been generally accepted that l_2
methods of combining observations by minimizing sums of squared errors have
significant computational advantages over earlier l_1 methods based on
minimizing sums of absolute errors, as advocated by Boscovich, Laplace, and others.
However, l_1 methods are known to have significant robustness
advantages over l_2 methods in many applications, and related quantile
regression methods provide a useful, complementary approach to classical
least-squares estimation of statistical models.
Combining recent advances in interior point methods for solving linear
programs with a new statistical preprocessing approach for l_1-type
problems, we obtain a 10- to 100-fold improvement in computational speed over
current (simplex-based) l_1 algorithms in large problems,
demonstrating that l_1 methods can be made competitive with l_2
methods in terms of computational speed throughout the entire range
of problem sizes. Formal complexity results are given that suggest
that l_1 regression can be made faster than least-squares regression
for n sufficiently large and p modest.
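The l_1 problem referred to above can be written as a linear program: splitting each residual into nonnegative parts u_i and v_i with y_i - x_i'b = u_i - v_i, one minimizes the sum of u_i + v_i. A minimal sketch of this formulation, using SciPy's general-purpose LP solver rather than the specialized interior-point-plus-preprocessing algorithm the paper develops (function name and example data are illustrative):

```python
import numpy as np
from scipy.optimize import linprog

def l1_regression(X, y):
    """Fit min_b sum |y_i - x_i'b| via its standard LP formulation.

    Variables are [b (p, free), u (n, >= 0), v (n, >= 0)] with the
    equality constraint X b + u - v = y and objective sum(u + v).
    """
    n, p = X.shape
    c = np.concatenate([np.zeros(p), np.ones(2 * n)])
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds)
    return res.x[:p]

# Illustrative data: a clean linear model plus one gross outlier,
# to show the robustness contrast mentioned in the abstract.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), np.linspace(0.0, 1.0, 50)])
y = X @ np.array([1.0, 2.0]) + 0.01 * rng.standard_normal(50)
y[0] += 100.0  # contaminate a single observation

b_l1 = l1_regression(X, y)                    # stays near (1, 2)
b_l2 = np.linalg.lstsq(X, y, rcond=None)[0]   # pulled toward the outlier
```

The doubling of the variable count (u and v alongside b) is what makes naive l_1 computation appear expensive; the interior point and preprocessing ideas of the paper attack exactly this structure.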
Department of Statistics
University of Illinois Urbana-Champaign