Rates of Convergence for Conditional Gradient Algorithms Near Singular and Nonsingular Extremals
From MaRDI portal
Publication: 4188351
DOI: 10.1137/0317015
zbMath: 0403.49028
MaRDI QID: Q4188351
No author found.
Publication date: 1979
Published in: SIAM Journal on Control and Optimization
Full work available at URL: https://doi.org/10.1137/0317015
65K10: Numerical optimization and variational techniques
49M05: Numerical methods based on necessary conditions
93B05: Controllability
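For orientation, the conditional gradient (Frank–Wolfe) method named in the title can be sketched as follows. This is a generic illustration, not the paper's own analysis: the test problem, the simplex feasible set, and the classical step size 2/(k+2) are all assumptions chosen for the example.

```python
import numpy as np

def conditional_gradient(grad_f, vertices, x0, iters=2000):
    """Conditional gradient (Frank-Wolfe) iteration over the convex hull
    of `vertices`, using the classical step size 2/(k+2).

    Each step calls a linear minimization oracle: it picks the vertex s
    minimizing <grad_f(x), s>, then moves x a fraction of the way toward s.
    """
    x = x0.astype(float)
    for k in range(iters):
        g = grad_f(x)
        # Linear minimization oracle over the polytope: a linear function
        # attains its minimum at a vertex, so scan the vertex list.
        s = vertices[np.argmin(vertices @ g)]
        x += 2.0 / (k + 2) * (s - x)
    return x

# Illustrative problem (assumed, not from the paper): minimize
# f(x) = ||x - c||^2 over the probability simplex in R^3, whose
# minimizer is c itself since c lies inside the simplex.
c = np.array([0.2, 0.5, 0.3])
V = np.eye(3)  # vertices of the probability simplex
x_star = conditional_gradient(lambda x: 2 * (x - c), V, np.ones(3) / 3)
# x_star approaches c at the classical O(1/k) rate in function value.
```

The slow O(1/k) rate seen here for an interior (here, nonsingular in the interior sense) minimizer is exactly the kind of behavior whose dependence on singular versus nonsingular extremals the paper quantifies.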
Related Items
- Minimization of the Tikhonov functional in Banach spaces smooth and convex of power type by steepest descent in the dual
- Rates of convergence for adaptive Newton methods
- Finite convergence of algorithms for nonlinear programs and variational inequalities
- A generalized conditional gradient method and its connection to an iterative shrinkage method
- The effect of perturbations on the convergence rates of optimization algorithms
- Extremal types for certain \(L^p\) minimization problems and associated large scale nonlinear programs
- A class of superlinearly convergent projection algorithms with relaxed stepsizes
- Convergent stepsizes for constrained optimization algorithms
- Sublinear convergence of the chord method at singular points
- Newton-Goldstein convergence rates for convex constrained minimization problems with singular solutions
- Convergence of algorithms for perturbed optimization problems
- A Relaxed Picard Iteration Process for Set-Valued Operators of the Monotone Type