Active‐Set Newton Methods and Partial Smoothness
DOI: 10.1287/MOOR.2020.1075
zbMATH Open: 1471.90143
arXiv: 1902.00724
OpenAlex: W3113160997
MaRDI QID: Q5000651
Calvin J. S. Wylie, A. S. Lewis
Publication date: 15 July 2021
Published in: Mathematics of Operations Research
Abstract: Diverse optimization algorithms correctly identify, in finite time, intrinsic constraints that must be active at optimality. Analogous behavior extends beyond optimization to systems involving partly smooth operators, and in particular to variational inequalities over partly smooth sets. As in classical nonlinear programming, such active-set structure underlies the design of accelerated local algorithms of Newton type. We formalize this idea in broad generality as a simple linearization scheme for two intersecting manifolds.
Full work available at URL: https://arxiv.org/abs/1902.00724
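The abstract's central idea — that first-order methods identify the active constraints in finitely many steps, after which a Newton-type step on the identified smooth manifold can accelerate convergence — can be illustrated on a toy problem. The sketch below is not the paper's algorithm; it is a minimal hedged example using a projected gradient method for minimizing 0.5‖x − c‖² over the nonnegative orthant, where the active set {i : cᵢ < 0} is identified exactly after finitely many iterations, and the "Newton step" restricted to the identified manifold solves the reduced problem in closed form.

```python
import numpy as np

def projected_gradient_identify(c, steps=50, lr=0.5):
    """Projected gradient for min 0.5*||x - c||^2 subject to x >= 0.

    The set of coordinates pinned at zero (the active set) stabilizes
    after finitely many iterations -- the identification behavior the
    abstract describes for a much broader class of methods.
    """
    x = np.ones_like(c)  # strictly feasible starting point
    for _ in range(steps):
        x = np.maximum(x - lr * (x - c), 0.0)  # gradient step, then project
    return x, x == 0.0  # final iterate and the identified active set

c = np.array([1.0, -2.0, 0.5, -0.3])
x, active = projected_gradient_identify(c)

# Once the active set is identified, the problem restricted to the manifold
# {x : x_i = 0 for i active} is smooth and unconstrained in the remaining
# coordinates; a single Newton step there lands on the exact minimizer.
x_newton = np.where(active, 0.0, c)
```

Here `x_newton` coincides with the true solution `np.maximum(c, 0.0)`: the projection iterations do the combinatorial work of finding which constraints bind, and the manifold step does the fast local work, which is the division of labor the paper formalizes via linearization of two intersecting manifolds.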
MSC classifications:
- Numerical optimization and variational techniques (65K10)
- Sensitivity, stability, parametric optimization (90C31)
- Numerical methods based on necessary conditions (49M05)
Cites Work
- Variational Analysis
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- Nonsmooth equations in optimization. Regularity, calculus, methods and applications
- Implicit Functions and Solution Mappings
- Projected gradient methods for linearly constrained problems
- Local analysis of Newton-type methods for variational inequalities and nonlinear programming
- A proximal method for composite minimization
- On the Accurate Identification of Active Constraints
- Finite termination of the proximal point algorithm
- On the Identification of Active Constraints
- Computing proximal points of nonconvex functions
- Newton-Type Methods for Optimization and Variational Problems
- Practical Aspects of the Moreau--Yosida Regularization: Theoretical Preliminaries
- A \(\mathcal{VU}\)-algorithm for convex minimization
- Newton methods for nonsmooth convex minimization: connections among \(\mathcal U\)-Lagrangian, Riemannian Newton and SQP methods
- Identifiable Surfaces in Constrained Optimization
- On the Identification of Active Constraints II: The Nonconvex Case
- The 𝒰-Lagrangian of a convex function
- Active Sets, Nonsmoothness, and Sensitivity
- Activity Identification and Local Linear Convergence of Douglas–Rachford/ADMM under Partial Smoothness
- Primal-Dual Gradient Structured Functions: Second-Order Results; Links to Epi-Derivatives and Partly Smooth Functions
- 𝒱𝒰-smoothness and proximal point results for some nonconvex functions
- Partial Smoothness, Tilt Stability, and Generalized Hessians
- On finite convergence and constraint identification of subgradient projection methods
- On the convergence of projected gradient processes to singular critical points
- Optimality, identifiability, and sensitivity
- Finite convergence of algorithms for nonlinear programs and variational inequalities
- Activity Identification and Local Linear Convergence of Forward--Backward-type Methods
- Sensitivity Analysis for Mirror-Stratifiable Convex Functions
- Model Consistency of Partly Smooth Regularizers
- Newton-type methods: a broader view
- The degrees of freedom of partly smooth regularizers
- Local linear convergence analysis of Primal–Dual splitting methods
Cited In (2)