Greedy sparsity-constrained optimization
From MaRDI portal
MSC classification
- Learning and adaptive systems in artificial intelligence (68T05)
- Generalized linear models (logistic models) (62J12)
- Large-scale problems in mathematical programming (90C06)
- Approximation methods and heuristics in mathematical programming (90C59)
- Signal theory (characterization, reconstruction, filtering, etc.) (94A12)
- Parametric inference under constraints (62F30)
- Image processing (compression, reconstruction, etc.) in information and communication theory (94A08)
Abstract: Sparsity-constrained optimization has wide applicability in machine learning, statistics, and signal processing problems such as feature selection and compressive sensing. A vast body of work has studied sparsity-constrained optimization from theoretical, algorithmic, and application aspects in the context of sparse estimation in linear models, where the fidelity of the estimate is measured by the squared error. In contrast, relatively little effort has been made to study sparsity-constrained optimization in cases where nonlinear models are involved or the cost function is not quadratic. In this paper we propose a greedy algorithm, Gradient Support Pursuit (GraSP), to approximate sparse minima of cost functions of arbitrary form. Should a cost function have a Stable Restricted Hessian (SRH) or a Stable Restricted Linearization (SRL), both of which are introduced in this paper, our algorithm is guaranteed to produce a sparse vector within a bounded distance from the true sparse optimum. Our approach generalizes known results for quadratic cost functions that arise in sparse linear regression and compressive sensing. We also evaluate the performance of GraSP through numerical simulations on synthetic data, where the algorithm is employed for sparse logistic regression with and without \(\ell_2\)-regularization.
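As a rough illustration of the greedy scheme the abstract describes, the sketch below follows the standard GraSP iteration (select the 2s largest-magnitude gradient coordinates, merge with the current support, minimize the cost restricted to the merged support, then prune to s entries). The function names (`grad_f`, `argmin_on_support`) and the quadratic demo cost are illustrative assumptions, not part of the record; with a squared-error cost the iteration reduces to a CoSaMP-style method.

```python
import numpy as np

def grasp(grad_f, argmin_on_support, n, s, iters=50):
    """Sketch of Gradient Support Pursuit (GraSP).

    grad_f(x): gradient of the cost at x.
    argmin_on_support(T): a minimizer of the cost restricted to
        coordinates in T (zeros elsewhere).
    """
    x = np.zeros(n)
    for _ in range(iters):
        z = grad_f(x)
        # Identify the 2s largest-magnitude gradient coordinates.
        omega = np.argsort(np.abs(z))[-2 * s:]
        # Merge with the current support.
        T = np.union1d(omega, np.flatnonzero(x)).astype(int)
        # Minimize the cost over the merged support.
        b = argmin_on_support(T)
        # Prune to the s largest-magnitude entries.
        x = np.zeros(n)
        keep = np.argsort(np.abs(b))[-s:]
        x[keep] = b[keep]
    return x

# Demo with a quadratic cost f(x) = ||Ax - y||^2 / 2:
# recover a 3-sparse vector from 40 Gaussian measurements.
rng = np.random.default_rng(0)
m, n, s = 40, 100, 3
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[[5, 17, 60]] = [1.0, -2.0, 0.5]
y = A @ x_true

grad_f = lambda x: A.T @ (A @ x - y)

def argmin_on_support(T):
    b = np.zeros(n)
    b[T], *_ = np.linalg.lstsq(A[:, T], y, rcond=None)
    return b

x_hat = grasp(grad_f, argmin_on_support, n, s)
```

For a non-quadratic cost such as regularized logistic loss, `argmin_on_support` would instead run a convex solver over the restricted coordinates; the outer greedy loop is unchanged.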
Recommendations
- Algorithms for sparsity-constrained optimization
- Sparsity constrained nonlinear optimization: optimality conditions and algorithms
- A gradient projection algorithm with a new stepsize for nonnegative sparsity-constrained optimization
- scientific article; zbMATH DE number 6982922
- Sparse Optimization with Least-Squares Constraints
Cited in (46)
- Error bounds for rank constrained optimization problems and applications
- Newton method for \(\ell_0\)-regularized optimization
- Greedy approximation in convex optimization
- On solutions of sparsity constrained optimization
- The first-order necessary conditions for sparsity constrained optimization
- Gradient projection Newton algorithm for sparse collaborative learning using synthetic and real datasets of applications
- Restricted strong convexity implies weak submodularity
- Gradient projection Newton pursuit for sparsity constrained optimization
- Efficient projected gradient methods for cardinality constrained optimization
- Iterative hard thresholding methods for \(l_0\) regularized convex cone programming
- Generalized conditional gradient for sparse estimation
- A unifying framework for sparsity-constrained optimization
- A gradient projection algorithm with a new stepsize for nonnegative sparsity-constrained optimization
- Nonsmooth sparsity constrained optimization problems: optimality conditions
- Cardinality minimization, constraints, and regularization: a survey
- Double fused Lasso regularized regression with both matrix and vector valued predictors
- Sparse Optimization with Least-Squares Constraints
- Solving equations of random convex functions via anchored regression
- A tight bound of hard thresholding
- Optimality conditions for rank-constrained matrix optimization
- Statistical computational learning
- scientific article; zbMATH DE number 7415078
- A local MM subspace method for solving constrained variational problems in image recovery
- scientific article; zbMATH DE number 6982922
- scientific article; zbMATH DE number 7370529
- scientific article; zbMATH DE number 7370639
- An extended Newton-type algorithm for \(\ell_2\)-regularized sparse logistic regression and its efficiency for classifying large-scale datasets
- Subspace quadratic regularization method for group sparse multinomial logistic regression
- Trading accuracy for sparsity in optimization problems with sparsity constraints
- A data-driven line search rule for support recovery in high-dimensional data analysis
- A bilinear formulation for vector sparsity optimization
- Sparsity constrained nonlinear optimization: optimality conditions and algorithms
- On the minimization over sparse symmetric sets: projections, optimality conditions, and algorithms
- \texttt{skscope}: fast sparsity-constrained optimization in Python
- Algorithms for sparsity-constrained optimization
- Relaxation quadratic approximation greedy pursuit method based on sparse learning
- Conditional gradient algorithms for rank-one matrix approximations with a sparsity constraint
- The greedy simplex algorithm for double sparsity constrained optimization problems
- A survey on compressive sensing: classical results and recent advancements
- Stochastic privacy-preserving methods for nonconvex sparse learning
- Nonlinear Variable Selection via Deep Neural Networks
- Weighted thresholding homotopy method for sparsity constrained optimization
- Sparse Wasserstein barycenters and application to reduced order modeling
- scientific article; zbMATH DE number 7306858
- A greedy Newton-type method for multiple sparse constraint problem
- Generalized greedy alternatives
This page was built for publication: Greedy sparsity-constrained optimization
MaRDI item Q5405269