Sparsity Constrained Nonlinear Optimization: Optimality Conditions and Algorithms

From MaRDI portal

Publication:2866194


DOI: 10.1137/120869778
zbMath: 1295.90051
arXiv: 1203.4580
MaRDI QID: Q2866194

Yonina C. Eldar, Amir Beck

Publication date: 13 December 2013

Published in: SIAM Journal on Optimization

Full work available at URL: https://arxiv.org/abs/1203.4580


90C26: Nonconvex programming, global optimization

90C46: Optimality conditions and duality in mathematical programming
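
The paper studies minimization of a continuously differentiable function \(f\) subject to the sparsity constraint \(\|x\|_0 \le s\), and analyzes, among other methods, an iterative hard-thresholding (projected gradient) scheme together with stationarity notions for its fixed points. The snippet below is a minimal illustrative sketch of such a scheme, assuming a synthetic least-squares objective and a constant step size \(1/L\); the function names and data are hypothetical and this is not the authors' implementation.

```python
import numpy as np

def hard_threshold(x, s):
    """Project onto {x : ||x||_0 <= s}: keep the s largest-magnitude entries."""
    z = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-s:]   # indices of the s largest |x_i|
    z[idx] = x[idx]
    return z

def iht(grad_f, x0, s, step, n_iter=200):
    """Iterative hard thresholding: x_{k+1} = P_s(x_k - step * grad_f(x_k))."""
    x = hard_threshold(x0, s)
    for _ in range(n_iter):
        x = hard_threshold(x - step * grad_f(x), s)
    return x

# Toy usage with f(x) = 0.5 * ||A x - b||^2 on synthetic, noiseless data.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[3, 17, 42]] = [1.5, -2.0, 0.8]
b = A @ x_true
grad = lambda x: A.T @ (A @ x - b)
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, with L the Lipschitz constant of grad f
x_hat = iht(grad, np.zeros(100), s=3, step=step)
print(np.nonzero(x_hat)[0])              # typically recovers the true support in this easy setting
```
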


Related Items

First Order Methods Beyond Convexity and Lipschitz Gradient Continuity with Applications to Quadratic Inverse Problems
Gradient-based method with active set strategy for $\ell _1$ optimization
Proximal Mapping for Symmetric Penalty and Sparsity
Least Sparsity of $p$-Norm Based Optimization Problems with $p>1$
Fast Best Subset Selection: Coordinate Descent and Local Combinatorial Optimization Algorithms
Quadratic Convergence of Smoothing Newton's Method for 0/1 Loss Optimization
First-Order Algorithms for a Class of Fractional Optimization Problems
Sparsity constrained optimization problems via disjunctive programming
A penalty decomposition approach for multi-objective cardinality-constrained optimization problems
MIP-BOOST: Efficient and Effective L0 Feature Selection for Linear Regression
The smoothing objective penalty function method for two-cardinality sparse constrained optimization problems
The Analysis of Alternating Minimization Method for Double Sparsity Constrained Optimization Problem
Optimal $k$-Thresholding Algorithms for Sparse Optimization Problems
Nonconvex Lagrangian-Based Optimization: Monitoring Schemes and Global Convergence
Quasi-linear Compressed Sensing
An Inexact Projected Gradient Method for Sparsity-Constrained Quadratic Measurements Regression
Convex Optimization and Parsimony of $L_p$-balls Representation
Mathematical Programs with Cardinality Constraints: Reformulation by Complementarity-Type Conditions and a Regularization Method
Critical point theory for sparse recovery
A survey on compressive sensing: classical results and recent advancements
An augmented Lagrangian method for optimization problems with structured geometric constraints
Improved RIP-based bounds for guaranteed performance of two compressed sensing algorithms
Grouped variable selection with discrete optimization: computational and statistical perspectives
Doubly majorized algorithm for sparsity-inducing optimization problems with regularizer-compatible constraints
Sparse optimization via vector \(k\)-norm and DC programming with an application to feature selection for support vector machines
Solution sets of three sparse optimization problems for multivariate regression
A Bregman stochastic method for nonconvex nonsmooth problem beyond global Lipschitz gradient continuity
A unifying framework for sparsity-constrained optimization
Normal Cones Intersection Rule and Optimality Analysis for Low-Rank Matrix Optimization with Affine Manifolds
Inexact penalty decomposition methods for optimization problems with geometric constraints
Nonmonotone spectral gradient method for sparse recovery
On solutions of sparsity constrained optimization
The first-order necessary conditions for sparsity constrained optimization
The sparse principal component analysis problem: optimality conditions and algorithms
The non-convex sparse problem with nonnegative constraint for signal reconstruction
Constraint qualifications and optimality conditions for optimization problems with cardinality constraints
Total variation reconstruction from quadratic measurements
Nonlinear frames and sparse reconstructions in Banach spaces
A quadratic penalty method for hypergraph matching
A new conjugate gradient hard thresholding pursuit algorithm for sparse signal recovery
Convergence of a Scholtes-type regularization method for cardinality-constrained optimization problems with an application in sparse robust portfolio optimization
Second-order optimality conditions and improved convergence results for regularization methods for cardinality-constrained optimization problems
Restricted Robinson constraint qualification and optimality for cardinality-constrained cone programming
Efficient projected gradient methods for cardinality constrained optimization
Approximately normalized iterative hard thresholding for nonlinear compressive sensing
A gradient projection algorithm with a new stepsize for nonnegative sparsity-constrained optimization
Lagrangian duality and saddle points for sparse linear programming
Tractable ADMM schemes for computing KKT points and local minimizers for \(\ell_0\)-minimization problems
Convergent inexact penalty decomposition methods for cardinality-constrained problems
An extended Newton-type algorithm for \(\ell_2\)-regularized sparse logistic regression and its efficiency for classifying large-scale datasets
A continuous relaxation of the constrained \(\ell_2-\ell_0\) problem
An effective procedure for feature subset selection in logistic regression based on information criteria
Sequential optimality conditions for cardinality-constrained optimization problems with applications
On DC based methods for phase retrieval
Provably optimal sparse solutions to overdetermined linear systems with non-negativity constraints in a least-squares sense by implicit enumeration
A Lagrange-Newton algorithm for sparse nonlinear programming
Sparse regression at scale: branch-and-bound rooted in first-order optimization
Gradient projection Newton algorithm for sparse collaborative learning using synthetic and real datasets of applications
Adaptive iterative hard thresholding for least absolute deviation problems with sparsity constraints
On nondegenerate M-stationary points for sparsity constrained nonlinear optimization
Quaternion matrix optimization: motivation and analysis
An interior stochastic gradient method for a class of non-Lipschitz optimization problems
Gradient projection Newton pursuit for sparsity constrained optimization
Inexact version of Bregman proximal gradient algorithm
New insights on the optimality conditions of the \(\ell_2-\ell_0\) minimization problem
A proximal gradient method for control problems with non-smooth and non-convex control cost
On the weak stationarity conditions for mathematical programs with cardinality constraints: a unified approach
Matrix optimization over low-rank spectral sets: stationary points and local and global minimizers
Nonsmooth sparsity constrained optimization problems: optimality conditions
Optimality conditions for rank-constrained matrix optimization
Solving equations of random convex functions via anchored regression
Optimization problems involving group sparsity terms
Greedy approximation in convex optimization
Finding sparse solutions of systems of polynomial equations via group-sparsity optimization
Optimality conditions for sparse nonlinear programming
A preconditioned conjugate gradient method with active set strategy for \(\ell_1\)-regularized least squares
Phase retrieval: stability and recovery guarantees
Newton method for \(\ell_0\)-regularized optimization
A greedy Newton-type method for multiple sparse constraint problem
Morozov's discrepancy principle for \(\alpha\ell_1-\beta\ell_2\) sparsity regularization
Duality and Convex Programming
On the Minimization Over Sparse Symmetric Sets: Projections, Optimality Conditions, and Algorithms
Iterative Hard-Thresholding Applied to Optimal Control Problems with $L^0(\Omega)$ Control Cost