Tensor Methods for Unconstrained Optimization Using Second Derivatives
Publication: 4012411
DOI: 10.1137/0801020 · zbMath: 0758.65047 · OpenAlex: W2012604196 · MaRDI QID: Q4012411
T. T. Chow, Robert B. Schnabel
Publication date: 27 September 1992
Published in: SIAM Journal on Optimization
Full work available at URL: https://doi.org/10.1137/0801020
Keywords: Newton's method; test problems; trust region technique; singular problems; tensor method; higher order model; tensor model; unconstrained minimization algorithm
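For context on the "tensor model" and "higher order model" keywords: Schnabel-type tensor methods augment the quadratic Newton model with low-rank third- and fourth-order terms. A minimal sketch of the general form of such a model is given below; the notation \(g_c, H_c, T_c, V_c\) is assumed here for illustration and is not taken from this record:

\[
m_T(x_c + d) = f(x_c) + g_c^\top d + \tfrac{1}{2}\, d^\top H_c\, d + \tfrac{1}{6}\, T_c\, d^3 + \tfrac{1}{24}\, V_c\, d^4,
\]

where \(g_c = \nabla f(x_c)\), \(H_c = \nabla^2 f(x_c)\), and \(T_c\), \(V_c\) are low-rank third- and fourth-order tensors chosen so that the model interpolates function and gradient information from previous iterates; \(T_c\, d^3\) denotes the tensor applied to three copies of \(d\).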
Related Items
A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization
Tensor methods for finding approximate stationary points of convex functions
Tensor methods for full-information maximum likelihood estimation: Unconstrained estimation
Quasi-Newton method by Hermite interpolation
Dealing with singularities in nonlinear unconstrained optimization
A numerically stable reduced-gradient type algorithm for solving large-scale linearly constrained minimization problems
A type of modified BFGS algorithm with any rank defects and the local \(Q\)-superlinear convergence properties
Tensor methods of full-information maximum likelihood estimation: Estimation with parameter constraints
Solution of finite-dimensional variational inequalities using smooth optimization with simple bounds
A compact limited memory method for large scale unconstrained optimization
Smoothness parameter of power of Euclidean norm
An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization
On High-order Model Regularization for Constrained Optimization
A derivative-free modified tensor method with curvilinear linesearch for unconstrained nonlinear programming
Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
A new conjugate gradient algorithm with cubic Barzilai–Borwein stepsize for unconstrained optimization
Implementable tensor methods in unconstrained convex optimization
Higher order curvature information and its application in a modified diagonal Secant method
Nonlinear coordinate transformations for unconstrained optimization. II: Theoretical background
Numerical multilinear algebra and its applications
A curvilinear search algorithm for unconstrained optimization by automatic differentiation
A modified Brown algorithm for solving singular nonlinear systems with rank defects
Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
An adaptive conic trust-region method for unconstrained optimization
Separable cubic modeling and a trust-region strategy for unconstrained minimization with impact in global optimization
Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives
Local convergence analysis of tensor methods for nonlinear equations