An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization
From MaRDI portal
Publication: 2420801
DOI: 10.1007/s10957-019-01475-1
zbMath: 1420.90035
arXiv: 1907.10494
OpenAlex: W2913634680
MaRDI QID: Q2420801
Publication date: 7 June 2019
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://arxiv.org/abs/1907.10494
Related Items (3)
- Stochastic perturbation of subgradient algorithm for nonconvex deep neural networks
- New investigation for the Liu-Story scaled conjugate gradient method for nonlinear optimization
- An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization
Uses Software
Cites Work
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- A Barzilai-Borwein conjugate gradient method
- Modified subspace Barzilai-Borwein gradient method for non-negative matrix factorization
- Scaling on the spectral gradient method
- Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization
- Scalar correction method for solving large scale unconstrained minimization problems
- Convergence analysis of a modified BFGS method on convex minimizations
- Notes on the Dai-Yuan-Yuan modified spectral gradient method
- New quasi-Newton equation and related methods for unconstrained optimization
- Modified two-point stepsize gradient methods for unconstrained optimization
- A new adaptive Barzilai and Borwein method for unconstrained optimization
- An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- An efficient monotone projected Barzilai-Borwein method for nonnegative matrix factorization
- Smoothing projected Barzilai-Borwein method for constrained non-Lipschitz optimization
- The global convergence of a modified BFGS method for nonconvex functions
- Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization
- R-linear convergence of the Barzilai and Borwein gradient method
- On spectral properties of steepest descent methods
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- A Modified BFGS Algorithm for Unconstrained Optimization
- Algorithm 851
- Two-Point Step Size Gradient Methods
- Tensor Methods for Unconstrained Optimization Using Second Derivatives
- Algorithm 738: a software package for unconstrained optimization using tensor methods
- Tensor Methods for Large, Sparse Unconstrained Optimization
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- Sparse Reconstruction by Separable Approximation
- An efficient gradient method with approximate optimal stepsize for the strictly convex quadratic minimization problem
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Nonmonotone Line Search Technique for Newton’s Method
- An Assessment of Nonmonotone Linesearch Techniques for Unconstrained Optimization
- An adaptive nonmonotone global Barzilai–Borwein gradient method for unconstrained optimization
- On the Barzilai and Borwein choice of steplength for the gradient method
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- CUTEr and SifDec
- The cyclic Barzilai-Borwein method for unconstrained optimization
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles