The Convergence of a Class of Double-rank Minimization Algorithms
From MaRDI portal
Publication:5607585
Cited in (showing first 100 items)
- On quasi-Newton methods in fast Fourier transform-based micromechanics
- Self-Scaling Variable Metric Algorithms without Line Search for Unconstrained Minimization
- Robust min-max portfolio strategies for rival forecast and risk scenarios
- On Variable-Metric Methods for Sparse Hessians
- The global convergence of a modified BFGS method for nonconvex functions
- Elimination of bounds in optimization problems by transforming variables
- Self-similar solution of a plane-strain fracture driven by a power-law fluid
- A state space model approach for HIV infection dynamics
- A review of nonlinear FFT-based computational homogenization methods
- Global optimization for data assimilation in landslide tsunami models
- A global optimization problem in portfolio selection
- Parallel two-phase methods for global optimization on GPU
- Towards explicit superlinear convergence rate for SR1
- Basin hopping with synched multi L-BFGS local searches. Parallel implementation in multi-CPU and GPUs
- Perspectives on self-scaling variable metric algorithms
- Cubic regularization in symmetric rank-1 quasi-Newton methods
- A family of variable metric methods in function space, without exact line searches
- A nonlinear model for function-value multistep methods
- Algorithms for strong coupling procedures
- Function-space quasi-Newton algorithms for optimal control problems with bounded controls and singular arcs
- Recent advances in unconstrained optimization
- Efficient and robust density estimation using Bernstein type polynomials
- Algorithms for nonlinear constraints that use Lagrangian functions
- Maximum entropy derivation of quasi-Newton methods
- Second-order stochastic optimization for machine learning in linear time
- Mesh independence of Newton-like methods for infinite dimensional problems
- The kinematics and static equilibria of a Slinky
- Simple and cumulative regret for continuous noisy optimization
- Yield design theory: An efficient static method formulation
- A faster modified Newton-Raphson iteration
- Optimal control for fast and robust generation of entangled states in anisotropic Heisenberg chains
- The global convergence of the BFGS method with a modified WWP line search for nonconvex functions
- Two modified scaled nonlinear conjugate gradient methods
- A modified secant equation quasi-Newton method for unconstrained optimization
- Numerical experience with multiple update quasi-Newton methods for unconstrained optimization
- Elastodynamics of thin plates with internal dissipative processes. II: Computational aspects
- Instability analysis of thin plates and arbitrary shells using a faceted shell element with loof nodes
- On the order of convergence of certain quasi-Newton methods
- Stochastic Steffensen method
- Minimum curvature multistep quasi-Newton methods
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- Annealing stochastic approximation Monte Carlo algorithm for neural network training
- Computational experience with methods for estimating sparse Hessians for nonlinear optimization
- Using nonlinear functions to approximate a new quasi-Newton method for unconstrained optimization problems
- New results on superlinear convergence of classical quasi-Newton methods
- Generation of classes of symmetric rank-2 secant updates and the maximality of the Davidon class
- A search grid for parameter optimization as a byproduct of model sensitivity analysis
- Minimum Norm Symmetric Quasi-Newton Updates Restricted to Subspaces
- Greedy quasi-Newton methods with explicit superlinear convergence
- A modified BFGS algorithm based on a hybrid secant equation
- A derivative-free optimization algorithm based on conditional moments
- Modeling noisy data with differential equations using observed and expected matrices
- Small collaboration: Advanced numerical methods for nonlinear hyperbolic balance laws and their applications. Abstracts from the small collaboration held August 29 -- September 4, 2021 (hybrid meeting)
- Rates of superlinear convergence for classical quasi-Newton methods
- On the inversion-free Newton's method and its applications
- Matrix factorizations in optimization of nonlinear functions subject to linear constraints
- Properties of updating methods for the multipliers in augmented Lagrangians
- On averaging and representation properties of the BFGS and related secant updates
- A modified Newton's method for minimizing factorable functions
- A variable metric algorithm for unconstrained minimization without evaluation of derivatives
- A review of the optimal power flow
- Implicit updates in multistep quasi-Newton methods
- Updating Quasi-Newton Matrices with Limited Storage
- Parallel variable metric algorithms for unconstrained optimization
- Finite-sample properties of limited-information estimators in misspecified simultaneous equation models
- Robust optimizers for nonlinear programming in approximate dynamic programming
- Scientific article; zbMATH DE number 3812857 (no title available)
- A new BFGS algorithm using the decomposition matrix of the correction matrix to obtain the search directions
- Alternating multi-step quasi-Newton methods for unconstrained optimization
- On the sufficient descent property of the Shanno's conjugate gradient method
- Accelerated conjugate direction methods for unconstrained optimization
- On the construction of minimization methods of quasi-Newton type
- Competitive secant (BFGS) methods based on modified secant relations for unconstrained optimization
- Partial derivatives for the first-passage time distribution in Wiener diffusion models
- Approximation BFGS methods for nonlinear image restoration
- New implicit updates in multi-step quasi-Newton methods for unconstrained optimisation
- Using function-values in multi-step quasi-Newton methods
- Computation of three-dimensional standing water waves
- The Poisson maximum entropy model for homogeneous Poisson processes
- Convergence Rates of Evolutionary Algorithms and Parallel Evolutionary Algorithms
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- Partitioned simulation of fluid-structure interaction. Coupling black-box solvers with quasi-Newton techniques
- A perfect example for the BFGS method
- Scalable approximations for generalized linear problems
- A family of hybrid conjugate gradient methods for unconstrained optimization
- A new version of augmented self-scaling BFGS method
- A dynamical view of nonlinear conjugate gradient methods with applications to FFT-based computational micromechanics
- Self-consistent description of radial space-charge confinement in DC column plasmas
- Factorized Variable Metric Methods for Unconstrained Optimization
- Planar quasi-Newton algorithms for unconstrained saddlepoint problems
- Green behavior propagation analysis based on statistical theory and intelligent algorithm in data-driven environment
- MERLIN-3.0. A multidimensional optimization environment
- A new modified BFGS method for unconstrained optimization problems
- Secant relations versus positive definiteness in quasi-Newton methods
- Parallel implementation of semiempirical quantum methods for the Intel platforms
- A stiffness matrix extrapolation strategy for nonlinear analysis
- An example of numerical nonconvergence of a variable-metric method
- Three-step fixed-point quasi-Newton methods for unconstrained optimisation
- The projection technique for two open problems of unconstrained optimization problems
- Efficient Semiparametric Estimation of Short‐Term and Long‐Term Hazard Ratios with Right‐Censored Data