A Semismooth Newton Method with Multidimensional Filter Globalization for $l_1$-Optimization
Publication: 4979870
DOI: 10.1137/120892167
zbMath: 1295.49022
OpenAlex: W2011655966
MaRDI QID: Q4979870
Publication date: 19 June 2014
Published in: SIAM Journal on Optimization
Full work available at URL: https://doi.org/10.1137/120892167
Keywords: global convergence; fixed-point method; semismooth Newton method; multidimensional filter; \(l_1\)-optimization
Numerical mathematical programming methods (65K05) Large-scale problems in mathematical programming (90C06) Applications of mathematical programming (90C90) Newton-type methods (49M15) Large-scale systems (93A15)
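The keywords above name the paper's core technique: a semismooth Newton method applied to a fixed-point reformulation of \(l_1\)-optimization, globalized by a multidimensional filter. The following is only a minimal local sketch of the fixed-point/semismooth Newton idea for the model problem \(\min_x \tfrac12\|Ax-b\|^2 + \lambda\|x\|_1\); it omits the paper's filter globalization entirely, and the function and variable names (A, b, lam, semismooth_newton_l1) are illustrative assumptions, not the authors' notation.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal map of tau * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def semismooth_newton_l1(A, b, lam, max_iter=50, tol=1e-10):
    # Local semismooth Newton iteration on the proximal fixed-point residual
    #   F(x) = x - soft_threshold(x - t * A^T (A x - b), t * lam),
    # whose zeros are the minimizers of 0.5 * ||A x - b||^2 + lam * ||x||_1.
    # (No globalization here; the paper adds a multidimensional filter for that.)
    m, n = A.shape
    H = A.T @ A
    t = 1.0 / np.linalg.norm(H, 2)          # step size t <= 1 / ||A^T A||
    x = np.zeros(n)
    for _ in range(max_iter):
        u = x - t * (H @ x - A.T @ b)        # forward (gradient) step
        F = x - soft_threshold(u, t * lam)   # fixed-point residual
        if np.linalg.norm(F) <= tol:
            break
        # One element of the generalized (Clarke) Jacobian of F:
        # soft_threshold'(u_i) = 1 if |u_i| > t*lam, else 0.
        d = (np.abs(u) > t * lam).astype(float)
        J = np.eye(n) - d[:, None] * (np.eye(n) - t * H)
        x = x - np.linalg.solve(J, F)        # assumes J is nonsingular
    return x

# Tiny usage example with random data (illustrative only).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 8))
b = rng.standard_normal(20)
x_hat = semismooth_newton_l1(A, b, lam=0.1)
```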
Related Items (35)
A Trust-region Method for Nonsmooth Nonconvex Optimization
Further properties of the forward-backward envelope with applications to difference-of-convex programming
A line search filter-SQP method with Lagrangian function for nonlinear inequality constrained optimization
An inexact successive quadratic approximation method for L-1 regularized optimization
Second order semi-smooth proximal Newton methods in Hilbert spaces
A family of second-order methods for convex \(\ell _1\)-regularized optimization
An algorithm for quadratic ℓ1-regularized optimization with a flexible active-set strategy
Composite Difference-Max Programs for Modern Statistical Estimation Problems
An active set Newton-CG method for \(\ell_1\) optimization
A regularized semi-smooth Newton method with projection steps for composite convex programs
An Iterative Reduction FISTA Algorithm for Large-Scale LASSO
A new approach for solving nonlinear algebraic systems with complementarity conditions. Application to compositional multiphase equilibrium problems
A unified primal-dual algorithm framework for inequality constrained problems
An inexact quasi-Newton algorithm for large-scale \(\ell_1\) optimization with box constraints
Concave Likelihood-Based Regression with Finite-Support Response Variables
Exact recovery of sparse multiple measurement vectors by \(l_{2,p}\)-minimization
Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization
LASSO Reloaded: A Variational Analysis Perspective with Applications to Compressed Sensing
Analysis of Highly Accurate Finite Element Based Algorithms for Computing Distances to Level Sets
Numerical analysis of sparse initial data identification for parabolic problems
A Highly Efficient Semismooth Newton Augmented Lagrangian Method for Solving Lasso Problems
Finite element error analysis for measure-valued optimal control problems governed by a 1D wave equation with variable coefficients
Numerical reduced variable optimization methods via implicit functional dependence with applications
An active-set proximal-Newton algorithm for \(\ell_1\) regularized optimization problems with box constraints
Globalized inexact proximal Newton-type methods for nonconvex composite functions
An Efficient Proximal Block Coordinate Homotopy Method for Large-Scale Sparse Least Squares Problems
Search Direction Correction with Normalized Gradient Makes First-Order Methods Faster
Generalized Conjugate Gradient Methods for ℓ1 Regularized Convex Quadratic Programming with Finite Convergence
A Stochastic Semismooth Newton Method for Nonsmooth Nonconvex Optimization
On the local convergence of a stochastic semismooth Newton method for nonsmooth nonconvex optimization
Inverse point source location with the Helmholtz equation on a bounded domain
Linear convergence of accelerated conditional gradient algorithms in spaces of measures
A sparse control approach to optimal sensor placement in PDE-constrained parameter estimation problems
An Inexact Semismooth Newton Method on Riemannian Manifolds with Application to Duality-Based Total Variation Denoising
An active-set proximal quasi-Newton algorithm for ℓ1-regularized minimization over a sphere constraint