Sparsity Constrained Nonlinear Optimization: Optimality Conditions and Algorithms
Publication: Q2866194 (MaRDI QID)
DOI: 10.1137/120869778
zbMath: 1295.90051
arXiv: 1203.4580
OpenAlex: W2083346837
Publication date: 13 December 2013
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1203.4580
Mathematics Subject Classification (MSC): Nonconvex programming, global optimization (90C26); Optimality conditions and duality in mathematical programming (90C46)
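For orientation, the paper concerns nonlinear optimization under an explicit sparsity (cardinality) constraint. A minimal sketch of this problem class and of a hard-thresholding iteration of the kind analyzed in this setting (the step size \(1/L\), with \(L\) a Lipschitz constant of \(\nabla f\), is the usual convention and is stated here as an assumption rather than quoted from the paper):
\[
\min_{x \in \mathbb{R}^n} f(x) \quad \text{s.t.} \quad \|x\|_0 \le s,
\qquad
x^{k+1} \in P_{C_s}\!\Big(x^k - \tfrac{1}{L}\nabla f(x^k)\Big),
\qquad
C_s := \{x \in \mathbb{R}^n : \|x\|_0 \le s\},
\]
where \(\|x\|_0\) counts the nonzero entries of \(x\), \(s\) is the prescribed sparsity level, and \(P_{C_s}\) is the (set-valued) projection onto \(C_s\), which keeps the \(s\) largest-magnitude entries and zeroes the rest.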
Related Items
Critical point theory for sparse recovery
Optimality conditions for sparse nonlinear programming
Sparsity constrained optimization problems via disjunctive programming
A new conjugate gradient hard thresholding pursuit algorithm for sparse signal recovery
Quaternion matrix optimization: motivation and analysis
The sparse principal component analysis problem: optimality conditions and algorithms
Convergence of a Scholtes-type regularization method for cardinality-constrained optimization problems with an application in sparse robust portfolio optimization
A penalty decomposition approach for multi-objective cardinality-constrained optimization problems
A survey on compressive sensing: classical results and recent advancements
The non-convex sparse problem with nonnegative constraint for signal reconstruction
MIP-BOOST: Efficient and Effective L0 Feature Selection for Linear Regression
Constraint qualifications and optimality conditions for optimization problems with cardinality constraints
An interior stochastic gradient method for a class of non-Lipschitz optimization problems
The smoothing objective penalty function method for two-cardinality sparse constrained optimization problems
Gradient projection Newton pursuit for sparsity constrained optimization
Second-order optimality conditions and improved convergence results for regularization methods for cardinality-constrained optimization problems
First Order Methods Beyond Convexity and Lipschitz Gradient Continuity with Applications to Quadratic Inverse Problems
Newton method for \(\ell_0\)-regularized optimization
Restricted Robinson constraint qualification and optimality for cardinality-constrained cone programming
An augmented Lagrangian method for optimization problems with structured geometric constraints
Improved RIP-based bounds for guaranteed performance of two compressed sensing algorithms
Grouped variable selection with discrete optimization: computational and statistical perspectives
Doubly majorized algorithm for sparsity-inducing optimization problems with regularizer-compatible constraints
Sparse optimization via vector \(k\)-norm and DC programming with an application to feature selection for support vector machines
Solution sets of three sparse optimization problems for multivariate regression
A Bregman stochastic method for nonconvex nonsmooth problem beyond global Lipschitz gradient continuity
A unifying framework for sparsity-constrained optimization
A preconditioned conjugate gradient method with active set strategy for \(\ell_1\)-regularized least squares
Normal Cones Intersection Rule and Optimality Analysis for Low-Rank Matrix Optimization with Affine Manifolds
A wonderful triangle in compressed sensing
Inexact penalty decomposition methods for optimization problems with geometric constraints
Inexact version of Bregman proximal gradient algorithm
A Path-Based Approach to Constrained Sparse Optimization
Gradient-based method with active set strategy for $\ell _1$ optimization
New insights on the optimality conditions of the \(\ell_2-\ell_0\) minimization problem
Proximal Mapping for Symmetric Penalty and Sparsity
A greedy Newton-type method for multiple sparse constraint problem
Morozov's discrepancy principle for \(\alpha\ell_1-\beta\ell_2\) sparsity regularization
An Inexact Projected Gradient Method for Sparsity-Constrained Quadratic Measurements Regression
Phase retrieval: stability and recovery guarantees
Efficient projected gradient methods for cardinality constrained optimization
A proximal gradient method for control problems with non-smooth and non-convex control cost
On the weak stationarity conditions for mathematical programs with cardinality constraints: a unified approach
The Analysis of Alternating Minimization Method for Double Sparsity Constrained Optimization Problem
Nonlinear frames and sparse reconstructions in Banach spaces
A quadratic penalty method for hypergraph matching
Total variation reconstruction from quadratic measurements
A gradient projection algorithm with a new stepsize for nonnegative sparsity-constrained optimization
Lagrangian duality and saddle points for sparse linear programming
Convex Optimization and Parsimony of $L_p$-balls Representation
Mathematical Programs with Cardinality Constraints: Reformulation by Complementarity-Type Conditions and a Regularization Method
Duality and Convex Programming
Tractable ADMM schemes for computing KKT points and local minimizers for \(\ell_0\)-minimization problems
Least Sparsity of $p$-Norm Based Optimization Problems with $p>1$
Convergent inexact penalty decomposition methods for cardinality-constrained problems
An extended Newton-type algorithm for \(\ell_2\)-regularized sparse logistic regression and its efficiency for classifying large-scale datasets
Approximately normalized iterative hard thresholding for nonlinear compressive sensing
A continuous relaxation of the constrained \(\ell_2-\ell_0\) problem
On the Minimization Over Sparse Symmetric Sets: Projections, Optimality Conditions, and Algorithms
Optimal $k$-Thresholding Algorithms for Sparse Optimization Problems
An effective procedure for feature subset selection in logistic regression based on information criteria
Sequential optimality conditions for cardinality-constrained optimization problems with applications
On DC based methods for phase retrieval
Matrix optimization over low-rank spectral sets: stationary points and local and global minimizers
Nonconvex Lagrangian-Based Optimization: Monitoring Schemes and Global Convergence
Nonsmooth sparsity constrained optimization problems: optimality conditions
Optimality conditions for rank-constrained matrix optimization
Provably optimal sparse solutions to overdetermined linear systems with non-negativity constraints in a least-squares sense by implicit enumeration
Solving equations of random convex functions via anchored regression
Fast Best Subset Selection: Coordinate Descent and Local Combinatorial Optimization Algorithms
Optimization problems involving group sparsity terms
Iterative Hard-Thresholding Applied to Optimal Control Problems with $L^0(\Omega)$ Control Cost
Quasi-linear Compressed Sensing
A Lagrange-Newton algorithm for sparse nonlinear programming
Sparse regression at scale: branch-and-bound rooted in first-order optimization
Greedy approximation in convex optimization
Gradient projection Newton algorithm for sparse collaborative learning using synthetic and real datasets of applications
Quadratic Convergence of Smoothing Newton's Method for 0/1 Loss Optimization
Adaptive iterative hard thresholding for least absolute deviation problems with sparsity constraints
Finding sparse solutions of systems of polynomial equations via group-sparsity optimization
First-Order Algorithms for a Class of Fractional Optimization Problems
On nondegenerate M-stationary points for sparsity constrained nonlinear optimization
Nonmonotone spectral gradient method for sparse recovery
On solutions of sparsity constrained optimization
The first-order necessary conditions for sparsity constrained optimization