Active-set Newton methods and partial smoothness
DOI: 10.1287/MOOR.2020.1075
zbMATH Open: 1471.90143
OpenAlex: W3113160997
MaRDI QID: Q5000651
Authors: Calvin J. S. Wylie, A. S. Lewis
Publication date: 15 July 2021
Published in: Mathematics of Operations Research
Abstract: Diverse optimization algorithms correctly identify, in finite time, intrinsic constraints that must be active at optimality. Analogous behavior extends beyond optimization to systems involving partly smooth operators, and in particular to variational inequalities over partly smooth sets. As in classical nonlinear programming, such active-set structure underlies the design of accelerated local algorithms of Newton type. We formalize this idea in broad generality as a simple linearization scheme for two intersecting manifolds.
Full work available at URL: https://arxiv.org/abs/1902.00724
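The abstract's "simple linearization scheme for two intersecting manifolds" can be illustrated with a minimal sketch (not the authors' algorithm): represent each smooth manifold locally as a zero set, linearize both at the current iterate, and take a Newton step to the intersection of the two affine approximations. The functions and starting point below are hypothetical, chosen only for illustration.

```python
import numpy as np

# Two smooth manifolds in R^2, each given locally as a zero set:
#   M1 = {x : x_0^2 + x_1^2 - 1 = 0}   (unit circle)
#   M2 = {x : x_0 - x_1 = 0}           (the line y = x)
# A Newton step linearizes both constraints at x and moves to the
# intersection of the resulting affine approximations.

def F(x):
    return np.array([x[0]**2 + x[1]**2 - 1.0,  # circle residual
                     x[0] - x[1]])              # line residual

def J(x):
    # Jacobian of F: each row is the gradient of one defining function.
    return np.array([[2.0 * x[0], 2.0 * x[1]],
                     [1.0, -1.0]])

x = np.array([1.0, 0.2])  # hypothetical starting guess near the intersection
for _ in range(20):
    # Solve the linearized system J(x) * step = -F(x).
    step = np.linalg.lstsq(J(x), -F(x), rcond=None)[0]
    x = x + step
    if np.linalg.norm(F(x)) < 1e-12:
        break

# x converges to (1/sqrt(2), 1/sqrt(2)), a point on both manifolds
```

As in the paper's setting, the local quadratic convergence of this iteration depends on the two manifolds intersecting transversally at the limit point, so that the stacked Jacobian has full rank there.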
MSC classifications:
- Numerical optimization and variational techniques (65K10)
- Sensitivity, stability, parametric optimization (90C31)
- Numerical methods based on necessary conditions (49M05)
Cites Work
- A \(\mathcal{VU}\)-algorithm for convex minimization
- A proximal method for composite minimization
- Active Sets, Nonsmoothness, and Sensitivity
- Activity identification and local linear convergence of Douglas-Rachford/ADMM under partial smoothness
- Activity identification and local linear convergence of forward-backward-type methods
- Computing proximal points of nonconvex functions
- Finite convergence of algorithms for nonlinear programs and variational inequalities
- Finite termination of the proximal point algorithm
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- Identifiable Surfaces in Constrained Optimization
- Implicit Functions and Solution Mappings
- Local analysis of Newton-type methods for variational inequalities and nonlinear programming
- Local linear convergence analysis of primal-dual splitting methods
- Manifold identification in dual averaging for regularized stochastic online learning
- Model Consistency of Partly Smooth Regularizers
- Newton methods for nonsmooth convex minimization: connections among \(\mathcal U\)-Lagrangian, Riemannian Newton and SQP methods
- Newton-Type Methods for Optimization and Variational Problems
- Newton-type methods: a broader view
- Nonsmooth equations in optimization. Regularity, calculus, methods and applications
- On finite convergence and constraint identification of subgradient projection methods
- On the Accurate Identification of Active Constraints
- On the Identification of Active Constraints
- On the Identification of Active Constraints II: The Nonconvex Case
- On the convergence of projected gradient processes to singular critical points
- Optimality, identifiability, and sensitivity
- Partial Smoothness, Tilt Stability, and Generalized Hessians
- Practical Aspects of the Moreau--Yosida Regularization: Theoretical Preliminaries
- Primal-Dual Gradient Structured Functions: Second-Order Results; Links to Epi-Derivatives and Partly Smooth Functions
- Projected gradient methods for linearly constrained problems
- Sensitivity analysis for mirror-stratifiable convex functions
- The degrees of freedom of partly smooth regularizers
- The 𝒰-Lagrangian of a convex function
- Variational Analysis
- 𝒱𝒰-smoothness and proximal point results for some nonconvex functions
Cited In (3)