MPC 2013, Issue 2



Mathematical Programming Computation, Volume 5, Issue 2, June 2013

Trajectory-following methods for large-scale degenerate convex quadratic programming

Nicholas I. M. Gould, Dominique Orban, Daniel P. Robinson

We consider a class of infeasible, path-following methods for convex quadratic programming. Our methods are designed to be effective for solving both nondegenerate and degenerate problems, where degeneracy is understood to mean the failure of strict complementarity at a solution. Global convergence and a polynomial bound on the number of iterations required are given. An implementation, CQP, is available as part of GALAHAD. We illustrate the advantages of our approach on the CUTEr and Maros–Mészáros test sets.
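To give a concrete flavor of the class of methods studied here, the following is a minimal sketch (not the authors' CQP code) of an infeasible primal-dual path-following iteration for the bound-constrained convex QP: minimize 0.5*x'Hx + g'x subject to x >= 0. The instance at the end is a hypothetical degenerate example in which strict complementarity fails at the solution; all names and parameter values are illustrative.

import numpy as np

def path_following_qp(H, g, sigma=0.2, tol=1e-8, max_iter=200):
    # Perturbed KKT conditions: H x + g - z = 0, x_i z_i = sigma * mu, x, z > 0.
    n = g.size
    x = np.ones(n)          # strictly positive, possibly infeasible start
    z = np.ones(n)
    for _ in range(max_iter):
        mu = x @ z / n                      # complementarity measure
        r_d = H @ x + g - z                 # dual feasibility residual
        if max(np.linalg.norm(r_d), mu) < tol:
            break
        r_c = x * z - sigma * mu            # perturbed complementarity residual
        # Newton step on the perturbed KKT system:
        #   [H  -I] [dx]   [-r_d]
        #   [Z   X] [dz] = [-r_c]
        K = np.block([[H, -np.eye(n)],
                      [np.diag(z), np.diag(x)]])
        d = np.linalg.solve(K, -np.concatenate([r_d, r_c]))
        dx, dz = d[:n], d[n:]
        # Fraction-to-the-boundary rule keeps (x, z) strictly positive.
        alpha = 1.0
        for v, dv in ((x, dx), (z, dz)):
            neg = dv < 0
            if neg.any():
                alpha = min(alpha, 0.995 * float(np.min(-v[neg] / dv[neg])))
        x = x + alpha * dx
        z = z + alpha * dz
    return x, z

# Hypothetical degenerate instance: the solution is x* = (0, 0) with
# multiplier z* = (1, 0), so x2* = z2* = 0 and strict complementarity fails.
H = np.eye(2)
g = np.array([1.0, 0.0])
print(path_following_qp(H, g))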




Mathematical Programming Computation, Volume 5, Issue 2, June 2013

Efficient block-coordinate descent algorithms for the Group Lasso

Zhiwei Qin, Katya Scheinberg, Donald Goldfarb

We present two algorithms to solve the Group Lasso problem (Yuan and Lin, J R Stat Soc Ser B (Stat Methodol) 68(1):49–67, 2006). First, we propose a general version of the Block Coordinate Descent (BCD) algorithm for the Group Lasso that employs an efficient approach for optimizing each subproblem exactly. We show that it exhibits excellent performance when the groups are of moderate size. For groups of large size, we propose an extension of ISTA/FISTA (Beck and Teboulle, SIAM J Imaging Sci 2(1):183–202, 2009) based on variable step-lengths that can be viewed as a simplified version of BCD. By combining the two approaches we obtain an implementation that is very competitive and often outperforms other state-of-the-art approaches for this problem. We show how these methods fit into the globally convergent general block coordinate gradient descent framework of Tseng and Yun (Math Program 117(1):387–423, 2009). We also show that the proposed approach is more efficient in practice than the one implemented in Tseng and Yun (Math Program 117(1):387–423, 2009). In addition, we apply our algorithms to the Multiple Measurement Vector (MMV) recovery problem, which can be viewed as a special case of the Group Lasso problem, and compare their performance to other methods in this particular instance.
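To make the block update concrete, here is a minimal sketch of BCD for the Group Lasso objective 0.5*||y - sum_g X_g b_g||^2 + lam * sum_g ||b_g||_2. It assumes each group's design matrix has orthonormal columns, in which case the exact subproblem solution reduces to block soft-thresholding; the algorithms in the paper solve the subproblem exactly without this assumption. The data and the value of lam below are illustrative.

import numpy as np

def group_lasso_bcd(X_blocks, y, lam, n_passes=100):
    # Assumes each X_g has orthonormal columns (X_g' X_g = I).
    betas = [np.zeros(X.shape[1]) for X in X_blocks]
    residual = y - sum(X @ b for X, b in zip(X_blocks, betas))
    for _ in range(n_passes):
        for g, X in enumerate(X_blocks):
            residual = residual + X @ betas[g]   # put block g's part back
            s = X.T @ residual
            norm_s = np.linalg.norm(s)
            if norm_s <= lam:
                betas[g] = np.zeros_like(betas[g])   # whole group set to zero
            else:
                betas[g] = (1.0 - lam / norm_s) * s  # block soft-thresholding
            residual = residual - X @ betas[g]
    return betas

# Tiny example with two orthonormal groups; only the first is active.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((20, 6)))
X_blocks = [Q[:, :3], Q[:, 3:]]
y = X_blocks[0] @ np.array([2.0, -1.0, 0.5]) + 0.01 * rng.standard_normal(20)
b1, b2 = group_lasso_bcd(X_blocks, y, lam=0.5)
print(np.round(b1, 2), np.round(b2, 2))   # b2 should be exactly zero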




Mathematical Programming Computation, Volume 5, Issue 2, June 2013

Local cuts for mixed-integer programming

Vašek Chvátal, William Cook, Daniel Espinoza

A general framework for cutting-plane generation was proposed by Applegate et al. in the context of the traveling salesman problem. The process considers the image of a problem space under a linear mapping, chosen so that a relaxation of the mapped problem can be solved efficiently. Optimization in the mapped space can be used to find a separating hyperplane, if one exists, and via substitution this gives a cutting plane in the original space. We extend this procedure to general mixed-integer programming problems, obtaining a range of possibilities for new sources of cutting planes. Some of these possibilities are explored computationally, both in floating-point arithmetic and in rational arithmetic.
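As a toy illustration of the separation step (under the simplifying assumption that the mapped problem is small enough to enumerate its feasible points, which stands in for the optimization oracle), the sketch below solves an LP that searches for a hyperplane separating the image p = Tx* of a fractional point from the convex hull of the mapped feasible points; substituting Tx into the resulting inequality a.(Tx) <= b yields a cut in the original space. The helper names and the box normalization of the coefficients are illustrative choices, not the authors' implementation.

import numpy as np
from scipy.optimize import linprog

def separate(p, V):
    # Look for (a, b) with a.p > b and a.v <= b for every row v of V;
    # return None when p already lies in conv(V).
    m, d = V.shape
    # Variables (a, b): maximize a.p - b, i.e. minimize -a.p + b.
    c = np.concatenate([-p, [1.0]])
    A_ub = np.hstack([V, -np.ones((m, 1))])      # a.v_j - b <= 0 for all j
    bounds = [(-1.0, 1.0)] * d + [(None, None)]  # box-normalize a
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(m), bounds=bounds)
    a, b = res.x[:d], res.x[d]
    return (a, b) if a @ p - b > 1e-9 else None

# Mapped feasible points (rows) in a 2-dimensional image space, and the
# image p of a fractional point lying outside their convex hull.
V = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
p = np.array([0.7, 0.7])
print(separate(p, V))   # e.g. a = (1, 1), b = 1: the cut x1 + x2 <= 1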




Mathematical Programming Computation, Volume 5, Issue 2, June 2013

Parallel stochastic gradient algorithms for large-scale matrix completion

Benjamin Recht, Christopher Ré

This paper develops Jellyfish, an algorithm for solving data-processing problems with matrix-valued decision variables regularized to have low rank. Particular examples of problems solvable by Jellyfish include matrix completion problems and least-squares problems regularized by the nuclear norm or γ2-norm. Jellyfish implements a projected incremental gradient method with a biased, random ordering of the increments. This biased ordering allows for a parallel implementation that admits a speed-up nearly proportional to the number of processors. On large-scale matrix completion tasks, Jellyfish is orders of magnitude more efficient than existing codes. For example, on the Netflix Prize data set, prior art computes rating predictions in approximately 4 hours, while Jellyfish solves the same problem in under 3 minutes on a 12-core workstation.
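For intuition, the following is a serial sketch of the incremental-gradient update at the heart of such a method, fitting a factored model L R' to the observed entries. The parallel speed-up described in the paper comes from permuting rows and columns and processing blocks of entries that touch disjoint rows of L and R concurrently; this sketch does not attempt that, and its step size, regularization, and toy data are illustrative assumptions.

import numpy as np

def sgd_matrix_completion(entries, n_rows, n_cols, rank=2,
                          step=0.05, reg=0.01, epochs=100, seed=0):
    # Each increment updates only the row factors touched by one
    # observed entry (i, j, v) of the matrix being completed.
    rng = np.random.default_rng(seed)
    L = 0.1 * rng.standard_normal((n_rows, rank))
    R = 0.1 * rng.standard_normal((n_cols, rank))
    entries = list(entries)
    for _ in range(epochs):
        rng.shuffle(entries)                 # random ordering of increments
        for i, j, v in entries:
            err = v - L[i] @ R[j]
            Li = L[i].copy()                 # use the pre-update value below
            L[i] += step * (err * R[j] - reg * L[i])
            R[j] += step * (err * Li - reg * R[j])
    return L, R

# Tiny example: recover a rank-2 matrix from 60% of its entries.
rng = np.random.default_rng(1)
truth = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 20))
mask = rng.random((30, 20)) < 0.6
obs = [(i, j, truth[i, j]) for i in range(30) for j in range(20) if mask[i, j]]
L, R = sgd_matrix_completion(obs, 30, 20)
print(np.linalg.norm(L @ R.T - truth) / np.linalg.norm(truth))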




