SCS
SCS (Splitting Conic Solver): Conic Optimization via Operator Splitting and Homogeneous Self-Dual Embedding. We introduce a first-order method for solving very large cone programs to modest accuracy. The method applies an operator splitting technique, the alternating direction method of multipliers (ADMM), to the homogeneous self-dual embedding, an equivalent feasibility problem of finding a nonzero point in the intersection of a subspace and a cone. This approach has several favorable properties. Compared to interior-point methods, first-order methods scale to very large problems, at the cost of lower accuracy. Compared to other first-order methods for cone programs, our approach finds both primal and dual solutions when they exist, and certificates of infeasibility or unboundedness otherwise; it does not rely on any explicit algorithm parameters; and its per-iteration cost is the same as applying the splitting method to the primal or dual problem alone. We discuss efficient implementation of the method in detail, including direct and indirect methods for computing the projection onto the subspace, scaling the original problem data, and stopping criteria. We describe an open-source implementation called SCS, which handles the usual (symmetric) nonnegative, second-order, and semidefinite cones as well as the (non-self-dual) exponential and power cones and their duals. We report numerical results that show speedups over interior-point cone solvers for large SOCPs, and scaling to very large general cone programs.
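To illustrate the core idea of the abstract, here is a minimal sketch (not the SCS implementation itself) of an ADMM iteration for a feasibility problem of the same shape: finding a point in the intersection of a subspace (here, an affine set {x : Ax = b}) and a cone (here, the nonnegative orthant). Each iteration alternates a projection onto the subspace with a projection onto the cone, just as SCS alternates a linear-system solve with a cone projection; the toy data `A`, `b` are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy instance: find x with A x = b and x >= 0.
# x_feas guarantees that a strictly feasible point exists.
m, n = 5, 20
A = rng.standard_normal((m, n))
x_feas = rng.uniform(0.5, 1.5, n)
b = A @ x_feas

AAt = A @ A.T  # factor once; SCS similarly reuses its subspace projection

def proj_affine(y):
    """Project y onto the affine set {x : A x = b}."""
    return y - A.T @ np.linalg.solve(AAt, A @ y - b)

def proj_cone(y):
    """Project y onto the nonnegative orthant."""
    return np.maximum(y, 0.0)

# ADMM for the feasibility problem: minimize 0 s.t. x in both sets.
x = np.zeros(n)
z = np.zeros(n)
u = np.zeros(n)  # scaled dual variable
for _ in range(2000):
    x = proj_affine(z - u)   # subspace step (linear solve)
    z = proj_cone(x + u)     # cone step (cheap projection)
    u = u + x - z            # dual update

# z lies in the cone by construction; check it also satisfies A z = b.
print("residual:", np.linalg.norm(A @ z - b), "min entry:", z.min())
```

The per-iteration cost is one linear solve (with a matrix that can be factored once up front, or handled by an iterative method for very large problems) plus a closed-form cone projection, which is the scaling behavior the abstract attributes to the method.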
References in zbMATH (referenced in 47 articles)
Showing results 1 to 20 of 47, sorted by year.
- Askari, Armin; Rebjock, Quentin; d’Aspremont, Alexandre; El Ghaoui, Laurent: FANOK: knockoffs in linear time (2021)
- Barratt, Shane; Angeris, Guillermo; Boyd, Stephen: Automatic repair of convex optimization problems (2021)
- Chen, Run; Liu, Andrew L.: A distributed algorithm for high-dimension convex quadratically constrained quadratic programs (2021)
- Ding, Lijun; Yurtsever, Alp; Cevher, Volkan; Tropp, Joel A.; Udell, Madeleine: An optimal-storage approach to semidefinite programming using approximate complementarity (2021)
- Garstka, Michael; Cannon, Mark; Goulart, Paul: COSMO: a conic operator splitting method for convex conic problems (2021)
- Glaser, Lisa; Stern, Abel B.: Reconstructing manifolds from truncations of spectral triples (2021)
- Gnacik, Michał; Guzik, Marcin; Kania, Tomasz: Approximate modularity: Kalton’s constant is not smaller than 3 (2021)
- Kaluba, Marek; Kielak, Dawid; Nowak, Piotr: On property (T) for (\operatorname{Aut}(F_n)) and (\mathrm{SL}_n(\mathbb{Z})) (2021)
- Lin, Tianyi; Ma, Shiqian; Ye, Yinyu; Zhang, Shuzhong: An ADMM-based interior-point method for large-scale linear programming (2021)
- Moehle, Nicholas; Kochenderfer, Mykel J.; Boyd, Stephen; Ang, Andrew: Tax-aware portfolio construction via convex optimization (2021)
- O’Donoghue, Brendan: Operator splitting for a homogeneous embedding of the linear complementarity problem (2021)
- Schwendinger, Florian; Grün, Bettina; Hornik, Kurt: A comparison of optimization solvers for log binomial regression including conic programming (2021)
- Coey, Chris; Lubin, Miles; Vielma, Juan Pablo: Outer approximation with conic certificates for mixed-integer convex problems (2020)
- Eltved, Anders; Dahl, Joachim; Andersen, Martin S.: On the robustness and scalability of semidefinite relaxation for optimal power flow problems (2020)
- Fu, Anqi; Zhang, Junzi; Boyd, Stephen: Anderson accelerated Douglas-Rachford splitting (2020)
- Hütter, Jan-Christian; Mao, Cheng; Rigollet, Philippe; Robeva, Elina: Optimal rates for estimation of two-dimensional totally positive distributions (2020)
- Hütter, Jan-Christian; Mao, Cheng; Rigollet, Philippe; Robeva, Elina: Estimation of Monge matrices (2020)
- Liberti, Leo: Distance geometry and data science (2020)
- Li, Yongfeng; Liu, Haoyang; Wen, Zaiwen; Yuan, Ya-xiang: Low-rank matrix iteration using polynomial-filtered subspace extraction (2020)
- Milz, Johannes; Ulbrich, Michael: An approximation scheme for distributionally robust nonlinear optimization (2020)