Identifiable Surfaces in Constrained Optimization
From MaRDI portal
Publication: 3138083 (scientific article; zbMATH DE number 2155014)
Recommendations
- Exposing Constraints
- Gradient projection method for optimization problems with a constraint in the form of the intersection of a smooth surface and a convex closed set
- On the Identification of Active Constraints
- On the Identification Property of a Projected Gradient Method
Cited in (42)
- Active-set identification with complexity guarantees of an almost cyclic 2-coordinate descent method with Armijo line search
- Distributed Learning with Sparse Communications by Identification
- Geometrical interpretation of the predictor-corrector type algorithms in structured optimization problems
- A proximal method for composite minimization
- Nonsmoothness in machine learning: specific structure, proximal identification, and applications
- A proximal method for identifying active manifolds
- An approximate decomposition algorithm for convex minimization
- The degrees of freedom of partly smooth regularizers
- Local linear convergence analysis of primal-dual splitting methods
- Low complexity regularization of linear inverse problems
- Activity identification and local linear convergence of forward-backward-type methods
- Error bounds in mathematical programming
- Newton acceleration on manifolds identified by proximal gradient methods
- Stochastic algorithms with geometric step decay converge linearly on sharp functions
- Proximal methods avoid active strict saddles of weakly convex functions
- Active-set complexity of proximal gradient: how long does it take to find the sparsity pattern?
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- A Trust-region Method for Nonsmooth Nonconvex Optimization
- Generic minimizing behavior in semialgebraic optimization
- Model selection with low complexity priors
- Exposing Constraints
- Optimality, identifiability, and sensitivity
- First-order Methods for the Impatient: Support Identification in Finite Time with Convergent Frank--Wolfe Variants
- Local linear convergence of proximal coordinate descent algorithm
- New active set identification for general constrained optimization and minimax problems
- Asymptotic properties of dual averaging algorithm for constrained distributed stochastic optimization
- Asymptotic normality and optimality in nonsmooth stochastic approximation
- Proximal gradient methods with adaptive subspace sampling
- Functions and sets of smooth substructure: relationships and examples
- Convergence of the forward-backward algorithm: beyond the worst-case with the help of geometry
- Infeasibility Detection with Primal-Dual Hybrid Gradient for Large-Scale Linear Programming
- An inequality constrained nonlinear Kalman-Bucy smoother by interior point likelihood maximization
- Local convergence properties of Douglas-Rachford and alternating direction method of multipliers
- Linear convergence analysis of the use of gradient projection methods on total variation problems
- 𝒱𝒰-smoothness and proximal point results for some nonconvex functions
- Derivative-free optimization methods for finite minimax problems
- Active-set Newton methods and partial smoothness
- Active set complexity of the away-step Frank-Wolfe algorithm
- Asymptotic optimality in stochastic optimization
- On the Identification of Active Constraints
- On partial smoothness, tilt stability and the \({\mathcal {VU}}\)-decomposition
- Partial smoothness and constant rank