Proximal quasi-Newton methods for nondifferentiable convex optimization

From MaRDI portal

Publication:1300271

DOI: 10.1007/S101070050059
zbMath: 0946.90111
OpenAlex: W2075146859
MaRDI QID: Q1300271

Masao Fukushima, Xiaojun Chen

Publication date: 25 October 2000

Published in: Mathematical Programming. Series A. Series B

Full work available at URL: https://doi.org/10.1007/s101070050059

Related Items (36)

Forward-backward quasi-Newton methods for nonsmooth optimization problems
Combination of steepest descent and BFGS methods for nonconvex nonsmooth optimization
Survey of Bundle Methods for Nonsmooth Optimization
An ODE-like nonmonotone method for nonsmooth convex optimization
On the resolution of the variational inequalities of the first and the second kind as equations obtained by explicit Moreau-Yosida regularizations
On superlinear convergence of quasi-Newton methods for nonsmooth equations
The developments of proximal point algorithms
A bundle modification strategy for convex minimization
On approximations with finite precision in bundle methods for nonsmooth optimization
A quasi-Newton bundle method based on approximate subgradients
A conjugate gradient sampling method for nonsmooth optimization
A method for convex minimization based on translated first-order approximations
An efficient conjugate gradient method with strong convergence properties for non-smooth optimization
A \(J\)-symmetric quasi-Newton method for minimax problems
Conjugate gradient type methods for the nondifferentiable convex minimization
A globally convergent proximal Newton-type method in nonsmooth convex optimization
A modified conjugate gradient method for general convex functions
An Asymptotically Superlinearly Convergent Semismooth Newton Augmented Lagrangian Method for Linear Programming
A modified scaled memoryless BFGS preconditioned conjugate gradient algorithm for nonsmooth convex optimization
Regularizations for stochastic linear variational inequalities
Comparison of two proximal point algorithms for monotone variational inequalities
Analysis of the maximum magnification by the scaled memoryless DFP updating formula with application to compressive sensing
Globally tight bounds for almost differentiable functions over polytopes with application to tolerance analysis.
An adaptive competitive penalty method for nonsmooth constrained optimization
An approximate proximal-extragradient type method for monotone variational inequalities
Preconditioning of a Generalized Forward-Backward Splitting and Application to Optimization on Graphs
Essentials of numerical nonsmooth optimization
On the convergence of a multigrid method for Moreau-regularized variational inequalities of the second kind
A New Nonmonotone Linesearch SQP Algorithm for Unconstrained Minimax Problem
An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration
Globally convergent BFGS method for nonsmooth convex optimization
Variable metric forward–backward splitting with applications to monotone inclusions in duality
The indefinite proximal point algorithms for maximal monotone operators
A \(\mathcal{VU}\)-algorithm for convex minimization
Essentials of numerical nonsmooth optimization
A memory gradient method for non-smooth convex optimization

This page was built for publication: Proximal quasi-Newton methods for nondifferentiable convex optimization