On the Minimization Over Sparse Symmetric Sets: Projections, Optimality Conditions, and Algorithms

From MaRDI portal
Publication:2800370

DOI: 10.1287/moor.2015.0722
zbMath: 1334.90129
OpenAlex: W1875533481
MaRDI QID: Q2800370

Amir Beck, Nadav Hallak

Publication date: 15 April 2016

Published in: Mathematics of Operations Research

Full work available at URL: https://semanticscholar.org/paper/84975e6dad800be2d22a599de8ba6d6a174f2fea



Related Items

Critical point theory for sparse recovery
Optimality conditions for sparse nonlinear programming
The sparse principal component analysis problem: optimality conditions and algorithms
A survey on compressive sensing: classical results and recent advancements
Forward-Backward Envelope for the Sum of Two Nonconvex Functions: Further Properties and Nonmonotone Linesearch Algorithms
Restricted Robinson constraint qualification and optimality for cardinality-constrained cone programming
Doubly majorized algorithm for sparsity-inducing optimization problems with regularizer-compatible constraints
Solution sets of three sparse optimization problems for multivariate regression
A branch and bound method solving the max–min linear discriminant analysis problem
A unifying framework for sparsity-constrained optimization
Normal Cones Intersection Rule and Optimality Analysis for Low-Rank Matrix Optimization with Affine Manifolds
A Path-Based Approach to Constrained Sparse Optimization
New insights on the optimality conditions of the \(\ell_2-\ell_0\) minimization problem
Proximal Mapping for Symmetric Penalty and Sparsity
The Analysis of Alternating Minimization Method for Double Sparsity Constrained Optimization Problem
A gradient projection algorithm with a new stepsize for nonnegative sparsity-constrained optimization
Lagrangian duality and saddle points for sparse linear programming
Orbital geometry and group majorisation in optimisation
An extended Newton-type algorithm for \(\ell_2\)-regularized sparse logistic regression and its efficiency for classifying large-scale datasets
Nonconvex Lagrangian-Based Optimization: Monitoring Schemes and Global Convergence
Weighted thresholding homotopy method for sparsity constrained optimization
Optimization problems involving group sparsity terms
A Lagrange-Newton algorithm for sparse nonlinear programming
Quadratic Convergence of Smoothing Newton's Method for 0/1 Loss Optimization
On nondegenerate M-stationary points for sparsity constrained nonlinear optimization
The first-order necessary conditions for sparsity constrained optimization