Convergence analysis of perturbed feasible descent methods
DOI: 10.1023/A:1022602123316
zbMATH Open: 0899.90149
OpenAlex: W1524647815
MaRDI QID: Q1379956
FDO: Q1379956
Author: M. V. Solodov
Publication date: 1997
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1023/a:1022602123316
Recommendations
- Error stability properties of generalized gradient-type algorithms
- Error bounds and convergence analysis of feasible descent methods: A general approach
- Convergence analysis of perturbed gradient methods and hybrid projection methods
- Convergence property of gradient-type methods with non-monotone line search in the presence of perturbations
- Convergence of algorithms for perturbed optimization problems
Cites Work
- Convex Analysis
- Some continuity properties of polyhedral multifunctions
- Monotone Operators and the Proximal Point Algorithm
- Convex programming in Hilbert space
- Incremental gradient algorithms with stepsizes bounded away from zero
- Title not available
- An Incremental Gradient(-Projection) Method with Momentum Term and Adaptive Stepsize Rule
- On linear convergence of iterative methods for the variational inequality problem
- Application Of Khobotov’s Algorithm To Variational Inequalities And Network Equilibrium Problems
- Two-Metric Projection Methods for Constrained Optimization
- Inexact Newton Methods
- Modified Projection-Type Methods for Monotone Variational Inequalities
- Title not available
- Error bounds and convergence analysis of feasible descent methods: A general approach
- Incremental Least Squares Methods and the Extended Kalman Filter
- Error stability properties of generalized gradient-type algorithms
- Error Bound and Convergence Analysis of Matrix Splitting Algorithms for the Affine Variational Inequality Problem
- Computational methods in optimization. A unified approach.
- A Stability Analysis for Perturbed Nonlinear Iterative Methods
- New inexact parallel variable distribution algorithms
- Convergence of Iterates of an Inexact Matrix Splitting Algorithm for the Symmetric Monotone Linear Complementarity Problem
- Remarks on Convergence of the Matrix Splitting Algorithm for the Symmetric Linear Complementarity Problem
- Mathematical Programming in Neural Networks
- New Error Bounds for the Linear Complementarity Problem
- On a global projection-type error bound for the linear complementarity problem
- Convergence properties of the gradient method under conditions of variable-level interference
Cited In (14)
- Global convergence of the Dai-Yuan conjugate gradient method with perturbations
- On approximations with finite precision in bundle methods for nonsmooth optimization
- Descent methods with linesearch in the presence of perturbations
- Title not available
- A Unified Analysis of Descent Sequences in Weakly Convex Optimization, Including Convergence Rates for Bundle Methods
- Smooth sparse coding via marginal regression for learning sparse representations
- Error bounds and convergence analysis of feasible descent methods: A general approach
- Perturbation techniques for convergence analysis of proximal gradient method and other first-order algorithms via variational analysis
- Error stability properties of generalized gradient-type algorithms
- Bounded perturbation resilience of projected scaled gradient methods
- Convergence property of gradient-type methods with non-monotone line search in the presence of perturbations
- Convergence analysis of the ChebFilterCG algorithm
- On the Convergence to Stationary Points of Deterministic and Randomized Feasible Descent Directions Methods
- Convergence of algorithms for perturbed optimization problems