A dual semismooth Newton based augmented Lagrangian method for large-scale linearly constrained sparse group square-root Lasso problems
From MaRDI portal
Publication: 6111345
DOI: 10.1007/s10915-023-02271-w
zbMath: 1522.90108
arXiv: 2111.13878
MaRDI QID: Q6111345
Publication date: 6 July 2023
Published in: Journal of Scientific Computing
Full work available at URL: https://arxiv.org/abs/2111.13878
Numerical mathematical programming methods (65K05); Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Quadratic programming (90C20); Decomposition methods (49M27)
Cites Work
- Variable selection in regression with compositional covariates
- Conic optimization via operator splitting and homogeneous self-dual embedding
- Regression analysis for microbiome compositional data
- On the linear convergence of a proximal gradient method for a class of nonsmooth convex minimization problems
- Generalized Hessian matrix and second-order optimality conditions for problems with \(C^{1,1}\) data
- Complementarity functions and numerical experiments on some smoothing Newton methods for second-order-cone complementarity problems
- An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems
- Variational analysis of the Ky Fan \(k\)-norm
- A nonsmooth version of Newton's method
- Semismoothness of solutions to generalized equations and the Moreau-Yosida regularization
- On the monotonicity of the gradient of a convex function
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Scaled sparse linear regression
- Asymptotic Convergence Analysis of the Proximal Point Algorithm
- Algorithms for Fitting the Constrained Lasso
- Implicit Functions and Solution Mappings
- Linear Inversion of Band-Limited Reflection Seismograms
- Some continuity properties of polyhedral multifunctions
- Monotone Operators and the Proximal Point Algorithm
- Augmented Lagrangians and Applications of the Proximal Point Algorithm in Convex Programming
- Variational Analysis
- A Highly Efficient Semismooth Newton Augmented Lagrangian Method for Solving Lasso Problems
- Sharp Oracle Inequalities for Square Root Regularization
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- Robust Regression and Lasso
- The Group Square-Root Lasso: Theoretical Properties and Fast Algorithms
- Model Selection and Estimation in Regression with Grouped Variables
- Proximité et dualité dans un espace hilbertien [Proximity and duality in a Hilbert space]