Pages that link to "Item:Q2288192"
From MaRDI portal
The following pages link to An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems (Q2288192):
Displaying 28 items.
- A dual based semismooth Newton-type algorithm for solving large-scale sparse Tikhonov regularization problems (Q2033078)
- An efficient Hessian based algorithm for singly linearly and box constrained least squares regression (Q2049105)
- A semismooth Newton-based augmented Lagrangian algorithm for density matrix least squares problems (Q2095559)
- An investigation on semismooth Newton based augmented Lagrangian method for image restoration (Q2162237)
- An efficient augmented Lagrangian method with semismooth Newton solver for total generalized variation (Q2697370)
- (Q4998873)
- An Efficient Linearly Convergent Regularized Proximal Point Algorithm for Fused Multiple Graphical Lasso Problems (Q4999369)
- Efficient Sparse Hessian-Based Semismooth Newton Algorithms for Dantzig Selector (Q5021412)
- A Subspace Acceleration Method for Minimization Involving a Group Sparsity-Inducing Regularizer (Q5072590)
- Difference-of-Convex Algorithms for a Class of Sparse Group $\ell_0$ Regularized Optimization Problems (Q5093646)
- A Proximal Point Dual Newton Algorithm for Solving Group Graphical Lasso Problems (Q5116554)
- An Asymptotically Superlinearly Convergent Semismooth Newton Augmented Lagrangian Method for Linear Programming (Q5124003)
- The Linear and Asymptotically Superlinear Convergence Rates of the Augmented Lagrangian Method with a Practical Relative Error Criterion (Q5149515)
- Iteratively Reweighted Group Lasso Based on Log-Composite Regularization (Q5161764)
- Solving the OSCAR and SLOPE Models Using a Semismooth Newton-Based Augmented Lagrangian Method (Q5214191)
- (Q5214258)
- Efficient Sparse Semismooth Newton Methods for the Clustered Lasso Problem (Q5231697)
- A dual-based stochastic inexact algorithm for a class of stochastic nonsmooth convex composite problems (Q6051310)
- Group linear algorithm with sparse principal decomposition: a variable selection and clustering method for generalized linear models (Q6099122)
- Linearly-convergent FISTA variant for composite optimization with duality (Q6101606)
- A dual semismooth Newton based augmented Lagrangian method for large-scale linearly constrained sparse group square-root Lasso problems (Q6111345)
- Newton-type methods with the proximal gradient step for sparse estimation (Q6130698)
- A semismooth Newton stochastic proximal point algorithm with variance reduction (Q6490309)
- A Corrected Inexact Proximal Augmented Lagrangian Method with a Relative Error Criterion for a Class of Group-Quadratic Regularized Optimal Transport Problems (Q6500198)
- An efficient sieving-based secant method for sparse optimization problems with least-squares constraints (Q6561379)
- Continuous exact relaxation and alternating proximal gradient algorithm for partial sparse and partial group sparse optimization problems (Q6569683)
- A generalized formulation for group selection via ADMM (Q6571367)
- Smoothing composite proximal gradient algorithm for sparse group Lasso problems with nonsmooth loss functions (Q6584749)