Two-Point Step Size Gradient Methods
From MaRDI portal
Publication: 3779680
DOI: 10.1093/IMANUM/8.1.141
zbMATH Open: 0638.65055
OpenAlex: W2076605490
Wikidata: Q56935973 (Scholia: Q56935973)
MaRDI QID: Q3779680 (FDO: Q3779680)
Authors: Jonathan Barzilai, Jonathan M. Borwein
Publication date: 1988
Published in: IMA Journal of Numerical Analysis
Full work available at URL: https://doi.org/10.1093/imanum/8.1.141
Mathematics Subject Classification:
- Numerical mathematical programming methods (65K05)
- Numerical optimization and variational techniques (65K10)
- Nonlinear programming (90C30)
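The two-point step size named in the title is commonly known as the Barzilai-Borwein (BB) step. The sketch below illustrates the first BB formula, alpha_k = s^T s / s^T y with s = x_k - x_{k-1} and y = g_k - g_{k-1}, on a small convex quadratic; the test problem, tolerances, and all variable names are illustrative assumptions, not code from the paper:

```python
import numpy as np

# Illustrative sketch of the Barzilai-Borwein two-point step size
# on f(x) = 0.5 * x^T A x with a symmetric positive definite A.
# The problem setup and names below are assumptions for demonstration.
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x

x_prev = np.array([1.0, 1.0])
g_prev = grad(x_prev)
x = x_prev - 0.05 * g_prev        # one classical gradient step to start

for _ in range(50):
    g = grad(x)
    if np.linalg.norm(g) < 1e-10:  # stop once the gradient vanishes
        break
    s = x - x_prev                 # iterate difference
    y = g - g_prev                 # gradient difference
    alpha = (s @ s) / (s @ y)      # BB1 step size
    x_prev, g_prev = x, g
    x = x - alpha * g

print(np.linalg.norm(x))           # should be near zero: the minimizer is the origin
```

The step size uses only two consecutive iterates and gradients, so each iteration costs the same as steepest descent while typically converging much faster on ill-conditioned quadratics.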
Cited in (first 100 items shown):
- Spatio-temporal random fields: compressible representation and distributed estimation
- Nonmonotone adaptive trust region method with line search based on new diagonal updating
- A spectral algorithm for large-scale systems of nonlinear monotone equations
- Greatest descent algorithms in unconstrained optimization
- A review of nonlinear FFT-based computational homogenization methods
- Accumulative approach in multistep diagonal gradient-type method for large-scale unconstrained optimization
- On the asymptotic behaviour of some new gradient methods
- A new gradient method with an optimal stepsize property
- Accelerated linearized Bregman method
- A modified spectral conjugate gradient method for solving unconstrained minimization problems
- A derivative-free nonmonotone line-search technique for unconstrained optimization
- Preconditioning non-monotone gradient methods for retrieval of seismic reflection signals
- Semiparametric Regression Analysis of Panel Count Data: A Practical Review
- Low-rank spectral optimization via gauge duality
- Projection onto a polyhedron that exploits sparsity
- Density-based globally convergent trust-region methods for self-consistent field electronic structure calculations
- Composite SAR imaging using sequential joint sparsity
- An Orthogonalization-Free Parallelizable Framework for All-Electron Calculations in Density Functional Theory
- A double-projection-based algorithm for large-scale nonlinear systems of monotone equations
- Asymptotic behaviour of a family of gradient algorithms in \(\mathbb R^{ d }\) and Hilbert spaces
- A convergent least-squares regularized blind deconvolution approach
- An implicit preconditioning strategy for large-scale generalized Sylvester equations
- Adaptive two-point stepsize gradient algorithm
- A Positive Barzilai–Borwein-Like Stepsize and an Extension for Symmetric Linear Systems
- Projected Barzilai-Borwein method for large-scale nonnegative image restoration
- Alternate step gradient method
- Learning from comparisons and choices
- On the steplength selection in gradient methods for unconstrained optimization
- An augmented Lagrangian method for non-Lipschitz nonconvex programming
- Differential equations and solution of linear systems
- A parallel orbital-updating based optimization method for electronic structure calculations
- A descent Dai-Liao conjugate gradient method for nonlinear equations
- Two derivative-free projection approaches for systems of large-scale nonlinear monotone equations
- A Barzilai-Borwein-based heuristic algorithm for locating multiple facilities with regional demand
- Non-smooth equations based method for \(\ell_1\)-norm problems with applications to compressed sensing
- A Barzilai-Borwein conjugate gradient method
- Non-monotone algorithm for minimization on arbitrary domains with applications to large-scale orthogonal Procrustes problem
- A new nonmonotone spectral residual method for nonsmooth nonlinear equations
- On the worst case performance of the steepest descent algorithm for quadratic functions
- Preconditioned Barzilai-Borwein method for the numerical solution of partial differential equations
- Nonmonotone algorithm for minimization on closed sets with applications to minimization on Stiefel manifolds
- Sparse Recovery via Partial Regularization: Models, Theory, and Algorithms
- Implementation of an optimal first-order method for strongly convex total variation regularization
- On the nonmonotone line search
- Expectation propagation for nonlinear inverse problems -- with an application to electrical impedance tomography
- Dynamic multi-source X-ray tomography using a spacetime level set method
- Computationally efficient approach for the minimization of volume constrained vector-valued Ginzburg-Landau energy functional
- Modified subspace Barzilai-Borwein gradient method for non-negative matrix factorization
- An affine scaling method for optimization problems with polyhedral constraints
- On the steepest descent algorithm for quadratic functions
- An asymptotically optimal gradient algorithm for quadratic optimization with low computational cost
- Gradient algorithms for quadratic optimization with fast convergence rates
- A new spectral conjugate gradient method for large-scale unconstrained optimization
- Inverse determination of a heat source from natural convection in a porous cavity
- Accelerated Bregman method for linearly constrained \(\ell _1-\ell _2\) minimization
- Scaling on the spectral gradient method
- Scaled diagonal gradient-type method with extra update for large-scale unconstrained optimization
- A modified conjugate gradient algorithm with cyclic Barzilai-Borwein steplength for unconstrained optimization
- A family of derivative-free conjugate gradient methods for large-scale nonlinear systems of equations
- A new two-step gradient-type method for large-scale unconstrained optimization
- Two effective hybrid conjugate gradient algorithms based on modified BFGS updates
- Convex regularization in statistical inverse learning problems
- A regularized Newton method for computing ground states of Bose-Einstein condensates
- Monotone projected gradient methods for large-scale box-constrained quadratic programming
- New spectral LS conjugate gradient method for nonlinear unconstrained optimization
- Modified two-point stepsize gradient methods for unconstrained optimization
- Multivariate spectral gradient method for unconstrained optimization
- Estimation of spectral bounds in gradient algorithms
- A globally convergent derivative-free method for solving large-scale nonlinear monotone equations
- A new gradient method via quasi-Cauchy relation which guarantees descent
- Computing the generalized eigenvalues of weakly symmetric tensors
- A modified Perry's conjugate gradient method-based derivative-free method for solving large-scale nonlinear monotone equations
- An optimal tri-vector iterative algorithm for solving ill-posed linear inverse problems
- Convex constrained optimization for large-scale generalized Sylvester equations
- Modified nonmonotone Armijo line search for descent method
- A practical method for solving large-scale TRS
- Impulse noise removal by a nonmonotone adaptive gradient method
- Large correlation analysis
- A class of line search-type methods for nonsmooth convex regularized minimization
- Spectral gradient methods for linearly constrained optimization
- A non-monotonic method for large-scale non-negative least squares
- Smoothing projected cyclic Barzilai–Borwein method for stochastic linear complementarity problems
- Two maxentropic approaches to determine the probability density of compound risk losses
- A proximal iteratively regularized Gauss-Newton method for nonlinear inverse problems
- Lifetime dependence modelling using a truncated multivariate gamma distribution
- On the asymptotic convergence and acceleration of gradient methods
- Convergence properties of nonmonotone spectral projected gradient methods
- On \(R\)-linear convergence analysis for a class of gradient methods
- Scaling techniques for gradient projection-type methods in astronomical image deblurring
- An efficient Gauss-Newton algorithm for symmetric low-rank product matrix approximations
- Semi-supervised learning with nuclear norm regularization
- Computing extreme eigenvalues of large scale Hankel tensors
- Barzilai–Borwein method with variable sample size for stochastic linear complementarity problems
- Convergence of descent method without line search
- Use of the minimum norm search direction in a nonmonotone version of the Gauss-Newton method
- Global optimization with orthogonality constraints via stochastic diffusion on manifold
- Maximum entropy and feasibility methods for convex and nonconvex inverse problems
- Global convergence of a spectral conjugate gradient method for unconstrained optimization
- Spectral conjugate gradient methods for vector optimization problems
- Inexact variable metric method for convex-constrained optimization problems