A family of second-order methods for convex \(\ell _1\)-regularized optimization (Q312690)

Property / full work available at URL: https://doi.org/10.1007/s10107-015-0965-3
Property / OpenAlex ID: W2287259016
Property / cites work: Fast Image Recovery Using Variable Splitting and Constrained Optimization
Property / cites work: Newton's method for linear complementarity problems
Property / cites work: A Fast Algorithm for Sparse Reconstruction Based on Shrinkage, Subspace Optimization, and Continuation
Property / cites work: A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
Property / cites work: Nonconvergence of the plain Newton-min algorithm for linear complementarity problems with a \(P\)-matrix
Property / cites work: Primal-Dual Strategy for Constrained Optimal Control Problems
Property / cites work: Projected Newton Methods for Optimization Problems with Simple Constraints
Property / cites work: CUTE
Property / cites work: Sample size selection in optimization methods for machine learning
Property / cites work: An algorithm for quadratic \(\ell_1\)-regularized optimization with a flexible active-set strategy
Property / cites work: An inexact successive quadratic approximation method for L-1 regularized optimization
Property / cites work: A semismooth Newton method for Tikhonov functionals with sparsity constraints
Property / cites work: An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
Property / cites work: Benchmarking optimization software with performance profiles
Property / cites work: De-noising by soft-thresholding
Property / cites work: Q2880998
Property / cites work: Finite-Dimensional Variational Inequalities and Complementarity Problems
Property / cites work: CUTEr and SifDec
Property / cites work: Fixed-Point Continuation for \(\ell_1\)-Minimization: Methodology and Convergence
Property / cites work: Global convergence of damped semismooth Newton methods for \(\ell_1\) Tikhonov regularization
Property / cites work: A SQP-Semismooth Newton-type Algorithm applied to Control of the instationary Navier--Stokes System Subject to Control Constraints
Property / cites work: The Primal-Dual Active Set Strategy as a Semismooth Newton Method
Property / cites work: An Infeasible Primal-Dual Algorithm for Total Bounded Variation--Based Inf-Convolution-Type Image Restoration
Property / cites work: A Feasible Active Set Method for Strictly Convex Quadratic Problems with Simple Bounds
Property / cites work: A block principal pivoting algorithm for large-scale strictly monotone linear complementarity problems
Property / cites work: Error bounds and convergence analysis of feasible descent methods: A general approach
Property / cites work: Semismooth and Semiconvex Functions in Constrained Optimization
Property / cites work: A Semismooth Newton Method with Multidimensional Filter Globalization for \(l_1\)-Optimization
Property / cites work: Introductory lectures on convex optimization. A basic course
Property / cites work: Numerical Optimization
Property / cites work: Convergence Analysis of Some Algorithms for Solving Nonsmooth Equations
Property / cites work: A nonsmooth version of Newton's method
Property / cites work: A short proof of finiteness of Murty's principal pivoting algorithm
Property / cites work: Q3376534
Property / cites work: Elliptic optimal control problems with \(L^1\)-control cost and applications for the placement of control devices
Property / cites work: Q5198964
Property / cites work: Sparse Reconstruction by Separable Approximation
Property / cites work: Q2896156


Language: English
Label: A family of second-order methods for convex \(\ell _1\)-regularized optimization
Description: scientific article

    Statements

    A family of second-order methods for convex \(\ell _1\)-regularized optimization (English)
    16 September 2016
    The paper deals with the minimization of a convex objective function augmented by an \(\ell_1\) regularization term, and to this end proposes a new block active-set (BAS) method that allows multiple changes in the estimate of the active manifold at every iteration. The basic BAS method is described first, and the authors then propose a modified block active-set algorithm that rectifies failures caused by the matrix defining the quadratic model of the function to be minimized. Modified versions of the orthant-based method are also proposed, and a numerical example illustrates the efficiency of the method compared with other approaches (FISTA, for example).
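    As a rough illustration of the kind of step such second-order methods take, the following Python/NumPy fragment performs one orthant-restricted Newton step for the quadratic case \(\tfrac12 x^T A x - b^T x + \lambda\|x\|_1\). This is a minimal sketch, not the authors' BAS or orthant-based algorithms: it assumes \(A\) is symmetric positive definite, the function names (orthant_newton_step, min_norm_subgradient) are illustrative, and the globalizing line search used in practice is omitted.

    import numpy as np

    def min_norm_subgradient(g, x, lam):
        # Minimum-norm subgradient of f(x) + lam*||x||_1 at x, given g = grad f(x).
        v = np.zeros_like(x)
        pos, neg, zero = x > 0, x < 0, x == 0
        v[pos] = g[pos] + lam
        v[neg] = g[neg] - lam
        gz = g[zero]
        v[zero] = np.where(gz + lam < 0, gz + lam,
                           np.where(gz - lam > 0, gz - lam, 0.0))
        return v

    def orthant_newton_step(A, b, lam, x):
        # One illustrative second-order step for 0.5*x'Ax - b'x + lam*||x||_1,
        # restricted to the orthant suggested by the minimum-norm subgradient.
        g = A @ x - b                                  # gradient of the smooth part
        v = min_norm_subgradient(g, x, lam)
        s = np.where(x != 0, np.sign(x), -np.sign(v))  # working orthant (0 = keep fixed)
        free = s != 0
        d = np.zeros_like(x)
        if free.any():
            # Newton step on the free variables of the smooth model on this orthant.
            d[free] = np.linalg.solve(A[np.ix_(free, free)], -(g[free] + lam * s[free]))
        x_new = x + d                                  # a line search would be applied here
        x_new[np.sign(x_new) * s < 0] = 0.0            # project back onto the working orthant
        return x_new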
    convex optimization
    \(\ell _1\)-regularization
    block active set method
    semismooth Newton method
    quadratic problems
    second-order methods
    orthant based method
