A Brief Survey of Modern Optimization for Statisticians

Int Stat Rev. 2014 Apr 1;82(1):46-70. doi: 10.1111/insr.12022.

Abstract

Modern computational statistics is turning more and more to high-dimensional optimization to handle the deluge of big data. Once a model is formulated, its parameters can be estimated by optimization. Because model parsimony is important, models routinely include nondifferentiable penalty terms such as the lasso. This sober reality complicates minimization and maximization. Our broad survey stresses a few important principles in algorithm design. Rather than view these principles in isolation, it is more productive to mix and match them. A few well-chosen examples illustrate this point. Algorithm derivation is also emphasized, and theory is downplayed, particularly the abstractions of the convex calculus. Thus, our survey should be useful and accessible to a broad audience.
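As a concrete illustration of the nondifferentiable lasso penalty mentioned above, the sketch below (not taken from the paper) implements the soft-thresholding operator, the closed-form proximal map of the scalar l1 penalty. This operator is the basic building block that makes lasso-type terms tractable in algorithms such as coordinate descent and proximal gradient methods; the function name `soft_threshold` is our own choice.

```python
def soft_threshold(x, lam):
    """Solve argmin_z 0.5*(z - x)**2 + lam*|z| in closed form.

    The absolute-value penalty is nondifferentiable at zero, yet this
    one-dimensional minimization has an explicit solution: shrink x
    toward zero by lam, and set it exactly to zero if |x| <= lam.
    """
    if x > lam:
        return x - lam
    if x < -lam:
        return x + lam
    return 0.0  # the kink at zero absorbs all small inputs
```

For example, `soft_threshold(3.0, 1.0)` returns 2.0, while `soft_threshold(0.5, 1.0)` returns exactly 0.0; this exact zeroing is what gives lasso-penalized estimates their parsimony.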

Keywords: Block relaxation; MM algorithm; Newton's method; acceleration; augmented Lagrangian; penalization.