Abstract: Around a solution of an optimization problem, an "identifiable" subset of the feasible region is one containing all nearby solutions after small perturbations to the problem. A quest for only the most essential ingredients of sensitivity analysis leads us to consider identifiable sets that are "minimal". This new notion lays a broad and intuitive variational-analytic foundation for optimality conditions, sensitivity, and active set methods.
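The identifiability notion in the abstract can be sketched formally. The following is a paraphrase under assumed notation, not the paper's verbatim definition: for a function \(f\) and a limiting subgradient \(\bar v \in \partial f(\bar x)\), a set \(M\) is identifiable at \(\bar x\) for \(\bar v\) when every sequence approaching \((\bar x,\bar v)\) in the graph of the subdifferential eventually enters \(M\).

```latex
% Sketch (paraphrased): M \subset \mathbb{R}^n is identifiable at \bar x
% for \bar v \in \partial f(\bar x) if
\[
  \left.
  \begin{aligned}
    &(x_k, v_k) \to (\bar x, \bar v),\\
    &v_k \in \partial f(x_k)
  \end{aligned}
  \right\}
  \;\Longrightarrow\;
  x_k \in M \quad \text{for all large } k.
\]
```

A minimal identifiable set is then one contained in every other identifiable set at \((\bar x,\bar v)\); the abstract's "most essential ingredients" phrasing refers to this minimality.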
Recommendations
- scientific article; zbMATH DE number 3924655
- Observability and optimality
- Generalized sensitivities and optimal experimental design
- Optimal estimation of parameters
- scientific article; zbMATH DE number 27341
- scientific article; zbMATH DE number 4098154
- Optimal uncertainty quantification
- scientific article; zbMATH DE number 7578285
- scientific article; zbMATH DE number 4111822
Cites work
- scientific article; zbMATH DE number 1113627
- scientific article; zbMATH DE number 1502618
- A \(\mathcal{VU}\)-algorithm for convex minimization
- Active Sets, Nonsmoothness, and Sensitivity
- Amenable functions in optimization
- An Efficient Primal-Dual Interior-Point Method for Minimizing a Sum of Euclidean Norms
- Benchmark of some nonsmooth optimization solvers for computing nonconvex proximal points
- Computing proximal points of nonconvex functions
- Finite convergence of algorithms for nonlinear programs and variational inequalities
- Finite termination of the proximal point algorithm
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- Geometric categories and o-minimal structures
- Identifiable Surfaces in Constrained Optimization
- Implicit Functions and Solution Mappings
- Local differentiability of distance functions
- Manifold identification in dual averaging for regularized stochastic online learning
- On finite convergence and constraint identification of subgradient projection methods
- On the Identification of Active Constraints
- On the Identification of Active Constraints II: The Nonconvex Case
- On the convergence of projected gradient processes to singular critical points
- Optimality, identifiability, and sensitivity
- Orthogonal invariance and identifiability
- Partial Smoothness, Tilt Stability, and Generalized Hessians
- Primal-Dual Gradient Structured Functions: Second-Order Results; Links to Epi-Derivatives and Partly Smooth Functions
- Projected gradient methods for linearly constrained problems
- Prox-regular functions in variational analysis
- Some continuity properties of polyhedral multifunctions
- Techniques of variational analysis
- Variational Analysis
- Variational Analysis and Generalized Differentiation I
- 𝒱𝒰-smoothness and proximal point results for some nonconvex functions
Cited in (13)
- On the interplay between acceleration and identification for the proximal gradient algorithm
- Proximal methods avoid active strict saddles of weakly convex functions
- Sensitivity analysis for mirror-stratifiable convex functions
- Partial smoothness and constant rank
- Newton acceleration on manifolds identified by proximal gradient methods
- Proximal gradient methods with adaptive subspace sampling
- Active-set Newton methods and partial smoothness
- Generic minimizing behavior in semialgebraic optimization
- Partial smoothness of the numerical radius at matrices whose fields of values are disks
- Orthogonal invariance and identifiability
- Infeasibility Detection with Primal-Dual Hybrid Gradient for Large-Scale Linear Programming
- Asymptotic normality and optimality in nonsmooth stochastic approximation
- Optimality, identifiability, and sensitivity
This page was built for publication: Optimality, identifiability, and sensitivity
MaRDI item Q463741