Inexact proximal DC Newton-type method for nonconvex composite functions
Publication: 6155059
DOI: 10.1007/s10589-023-00525-9
arXiv: 2111.07618
MaRDI QID: Q6155059
Yasushi Narushima, Hiroshi Yabe, Shummin Nakayama
Publication date: 16 February 2024
Published in: Computational Optimization and Applications
Full work available at URL: https://arxiv.org/abs/2111.07618
Keywords: nonsmooth optimization; semi-smooth Newton method; memoryless quasi-Newton method; inexact proximal Newton-type method; proximal DC algorithm
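A minimal sketch of the problem class suggested by these keywords (the generic formulation below is an assumption for orientation, not the paper's exact statement): proximal DC algorithms address composite objectives of the form
\[
\min_{x \in \mathbb{R}^n} \; f(x) = g(x) + h_1(x) - h_2(x),
\]
where \(g\) is smooth, \(h_1\) is convex and possibly nonsmooth, and \(h_2\) is convex. At an iterate \(x^k\), the concave part \(-h_2\) is linearized with some \(\xi^k \in \partial h_2(x^k)\), and a proximal Newton-type step solves, possibly only inexactly,
\[
x^{k+1} \approx \arg\min_{x} \; \nabla g(x^k)^\top (x - x^k) + \tfrac{1}{2}(x - x^k)^\top B_k (x - x^k) + h_1(x) - \langle \xi^k, x - x^k \rangle,
\]
with \(B_k\) a (for instance, memoryless quasi-Newton) approximation of \(\nabla^2 g(x^k)\).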
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- An inexact successive quadratic approximation method for L-1 regularized optimization
- Practical inexact proximal quasi-Newton method with global complexity analysis
- Inexact proximal Newton methods for self-concordant functions
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- Convex analysis approach to d.c. programming: Theory, algorithms and applications
- A regularized semi-smooth Newton method with projection steps for composite convex programs
- A proximal difference-of-convex algorithm with extrapolation
- DC formulations and algorithms for sparse optimization problems
- Templates for convex cone problems with applications to sparse signal recovery
- A new look at smoothing Newton methods for nonlinear complementarity problems and box constrained variational inequalities
- Inexact proximal memoryless quasi-Newton methods based on the Broyden family for minimizing composite functions
- An inexact successive quadratic approximation method for a class of difference-of-convex optimization problems
- On the superiority of PGMs to PDCAs in nonsmooth nonconvex sparse regression
- Memoryless quasi-Newton methods based on spectral-scaling Broyden family for unconstrained optimization
- A nonsmooth version of Newton's method
- Inexact successive quadratic approximation for regularized optimization
- Optimization theory and methods. Nonlinear programming
- Proximal Splitting Methods in Signal Processing
- Proximal Newton-Type Methods for Minimizing Composite Functions
- Updating Quasi-Newton Matrices with Limited Storage
- A generalized proximal point algorithm for certain non-convex minimization problems
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- First-Order Methods in Optimization
- Convergence Analysis of Some Algorithms for Solving Nonsmooth Equations
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- Sparse Recovery via Partial Regularization: Models, Theory, and Algorithms
- On Quasi-Newton Forward-Backward Splitting: Proximal Calculus and Convergence
- Minimization of $\ell_{1-2}$ for Compressed Sensing
- A modified BFGS method and its global convergence in nonconvex minimization