Active Sets, Nonsmoothness, and Sensitivity


DOI: 10.1137/S1052623401387623
zbMath: 1055.90072
OpenAlex: W2130025383
MaRDI QID: Q4785904

Adrian S. Lewis

Publication date: 5 January 2003

Published in: SIAM Journal on Optimization

Full work available at URL: https://doi.org/10.1137/s1052623401387623



Related Items

A Trust-region Method for Nonsmooth Nonconvex Optimization
Survey Descent: A Multipoint Generalization of Gradient Descent for Nonsmooth Optimization
Eventual linear convergence of the Douglas-Rachford iteration for basis pursuit
A proximal method for identifying active manifolds
A proximal method for composite minimization
Proximal methods avoid active strict saddles of weakly convex functions
Sensitivity Analysis for Mirror-Stratifiable Convex Functions
A Sard theorem for tame set-valued mappings
Partial Smoothness and Constant Rank
Local convergence properties of Douglas-Rachford and alternating direction method of multipliers
Local linear convergence of a primal-dual algorithm for the augmented convex models
Composite proximal bundle method
A space decomposition scheme for maximum eigenvalue functions and its applications
Nonsmooth optimization via quasi-Newton methods
The degrees of freedom of partly smooth regularizers
Quadratic growth conditions and uniqueness of optimal solution to Lasso
On partial smoothness, tilt stability and the \({\mathcal {VU}}\)-decomposition
Forward-Backward Envelope for the Sum of Two Nonconvex Functions: Further Properties and Nonmonotone Linesearch Algorithms
Newton acceleration on manifolds identified by proximal gradient methods
Harnessing Structure in Composite Nonsmooth Minimization
The \(\mathcal{U}\)-Lagrangian, fast track, and partial smoothness of a prox-regular function
LASSO Reloaded: A Variational Analysis Perspective with Applications to Compressed Sensing
Role of subgradients in variational analysis of polyhedral functions
Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
Convergence of the forward-backward algorithm: beyond the worst-case with the help of geometry
Accelerating inexact successive quadratic approximation for regularized optimization through manifold identification
Local linear convergence of proximal coordinate descent algorithm
Infeasibility Detection with Primal-Dual Hybrid Gradient for Large-Scale Linear Programming
Model selection with low complexity priors
A Nonsmooth Trust-Region Method for Locally Lipschitz Functions with Application to Optimization Problems Constrained by Variational Inequalities
Activity Identification and Local Linear Convergence of Forward--Backward-type Methods
Critical values of set-valued maps with stratifiable graphs. Extensions of Sard and Smale-Sard theorems
Optimality, identifiability, and sensitivity
On Nesterov's nonsmooth Chebyshev-Rosenbrock functions
Relaxed Gauss--Newton Methods with Applications to Electrical Impedance Tomography
Activity Identification and Local Linear Convergence of Douglas–Rachford/ADMM under Partial Smoothness
Partial Smoothness of the Numerical Radius at Matrices Whose Fields of Values are Disks
A fast gradient and function sampling method for finite-max functions
\(\mathcal{VU}\)-smoothness and proximal point results for some nonconvex functions
An approximate decomposition algorithm for convex minimization
Local linear convergence analysis of Primal–Dual splitting methods
The chain rule for VU-decompositions of nonsmooth functions
On the interplay between acceleration and identification for the proximal gradient algorithm
Generic Minimizing Behavior in Semialgebraic Optimization
Calculus of the exponent of Kurdyka-Ɓojasiewicz inequality and its applications to linear convergence of first-order methods
Functions and sets of smooth substructure: relationships and examples
Low Complexity Regularization of Linear Inverse Problems
An inequality constrained nonlinear Kalman-Bucy smoother by interior point likelihood maximization
Majorization-Minimization Procedures and Convergence of SQP Methods for Semi-Algebraic and Tame Programs
New active set identification for general constrained optimization and minimax problems
Strong Metric (Sub)regularity of Karush–Kuhn–Tucker Mappings for Piecewise Linear-Quadratic Convex-Composite Optimization and the Quadratic Convergence of Newton's Method
First- and second-order optimality conditions for piecewise smooth objective functions
Characterizing and Testing Subdifferential Regularity in Piecewise Smooth Optimization
A Decomposition Algorithm for the Sums of the Largest Eigenvalues
Convergence rates of forward-Douglas-Rachford splitting method
Active-Set Newton Methods and Partial Smoothness
Geometrical interpretation of the predictor-corrector type algorithms in structured optimization problems
Newton methods for nonsmooth convex minimization: connections among \(\mathcal U\)-Lagrangian, Riemannian Newton and SQP methods
Nonsmoothness and a variable metric method
A BFGS-SQP method for nonsmooth, nonconvex, constrained optimization and its evaluation using relative minimization profiles
Proximal Gradient Methods with Adaptive Subspace Sampling