scientific article; zbMATH DE number 7370529
From MaRDI portal
Publication:4998877
Sheng-Long Zhou, Nai-Hua Xiu, Hou-Duo Qi
Publication date: 9 July 2021
Full work available at URL: https://arxiv.org/abs/1901.02763
Title: zbMATH Open Web Interface contents unavailable due to conflicting licenses.
Keywords: global convergence; Newton's method; stationary point; sparse optimization; hard thresholding; quadratic convergence rate
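For context, the hard-thresholding operator named in the keywords is the projection onto the set of \(k\)-sparse vectors: keep the \(k\) largest-magnitude entries and zero the rest. A minimal NumPy sketch (the function name is illustrative, not from the article):

```python
import numpy as np

def hard_threshold(x, k):
    """Project x onto the set of k-sparse vectors by keeping the
    k largest-magnitude entries and zeroing all the others."""
    out = np.zeros_like(x, dtype=float)
    keep = np.argpartition(np.abs(x), -k)[-k:]  # indices of k largest |x_i|
    out[keep] = x[keep]
    return out

print(hard_threshold(np.array([3.0, -1.0, 0.5, -4.0]), 2))
# → [ 3.  0.  0. -4.]
```

Iterative hard thresholding and pursuit-type methods cited below alternate a gradient (or Newton) step with this projection.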
Related Items
- Quaternion matrix optimization: motivation and analysis
- Gradient projection Newton pursuit for sparsity constrained optimization
- Newton method for \(\ell_0\)-regularized optimization
- Improved RIP-based bounds for guaranteed performance of two compressed sensing algorithms
- Newton-type optimal thresholding algorithms for sparse optimization problems
- A greedy Newton-type method for multiple sparse constraint problem
- Partial gradient optimal thresholding algorithms for a class of sparse optimization problems
- A Lagrange-Newton algorithm for sparse nonlinear programming
- Unnamed Item
- Newton Hard-Thresholding Pursuit for Sparse Linear Complementarity Problem via A New Merit Function
- Gradient projection Newton algorithm for sparse collaborative learning using synthetic and real datasets of applications
- Quadratic Convergence of Smoothing Newton's Method for 0/1 Loss Optimization
Uses Software
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Iterative hard thresholding for compressed sensing
- CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
- A nonsmooth inexact Newton method for the solution of large-scale nonlinear complementarity problems
- A semismooth equation approach to the solution of nonlinear complementarity problems
- A feasible semismooth asymptotically Newton method for mixed complementarity problems
- Minimization of \(SC^1\) functions and the Maratos effect
- On the Minimization Over Sparse Symmetric Sets: Projections, Optimality Conditions, and Algorithms
- Sparsity Constrained Nonlinear Optimization: Optimality Conditions and Algorithms
- Learning Model-Based Sparsity via Projected Gradient Descent
- Compressed Sensing With Nonlinear Observations and Related Nonlinear Optimization Problems
- A Newton-CG Augmented Lagrangian Method for Semidefinite Programming
- Trading Accuracy for Sparsity in Optimization Problems with Sparsity Constraints
- Hard Thresholding Pursuit: An Algorithm for Compressive Sensing
- Computing a Trust Region Step
- A Quadratically Convergent Newton Method for Computing the Nearest Correlation Matrix
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Decoding by Linear Programming
- Signal Recovery From Random Measurements Via Orthogonal Matching Pursuit
- Sparse and Redundant Representations
- Numerical Optimization
- Solving Karush--Kuhn--Tucker Systems via the Trust Region and the Conjugate Gradient Methods
- A Tight Bound of Hard Thresholding
- Gradient Pursuits
- Sparse Optimization Theory and Methods
- A null-space-based weighted \(l_1\) minimization approach to compressed sensing
- Subspace Pursuit for Compressive Sensing Signal Reconstruction
- Minimization of \(\ell_{1-2}\) for Compressed Sensing
- Greedy Sparsity-Constrained Optimization
- Sparse Approximation via Penalty Decomposition Methods
- Compressed sensing
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers