L-BFGS
From MaRDI portal
Software: 15762
swMATH: 3229 · MaRDI QID: Q15762 · FDO: Q15762
Author name not available
Cited In (first 100 items shown)
- Distributed optimization and statistical learning via the alternating direction method of multipliers
- Mathematical optimization in intensity modulated radiation therapy
- Tree approximation for discrete time stochastic processes: a process distance approach
- Reduced Storage, Quasi-Newton Trust Region Approaches to Function Optimization
- Perspectives in Flow Control and Optimization
- pyvine: the Python package for regular vine copula modeling, sampling and testing
- SNOPT: An SQP Algorithm for Large-Scale Constrained Optimization
- Title not available
- Krylov space approximate Kalman filtering
- Tackling box-constrained optimization via a new projected quasi-Newton approach
- Structured minimal-memory inexact quasi-Newton method and secant preconditioners for augmented Lagrangian optimization
- Physics-informed neural network simulation of multiphase poroelasticity using stress-split sequential training
- Minimizing a sum of clipped convex functions
- Title not available
- A limited memory quasi-Newton trust-region method for box constrained optimization
- Quasi-Newton methods based on ordinary differential equation approach for unconstrained nonlinear optimization
- Identification of Manning's roughness coefficients in channel network using adjoint analysis
- On the use of stochastic Hessian information in optimization methods for machine learning
- Subspace methods for large scale nonlinear equations and nonlinear least squares
- Globally convergent limited memory bundle method for large-scale nonsmooth optimization
- Large-scale Kalman filtering using the limited memory BFGS method
- Trust region Newton method for logistic regression
- An iterated local search algorithm based on nonlinear programming for the irregular strip packing problem
- Jointly robust prior for Gaussian stochastic process in emulation, calibration and variable selection
- A hybrid random field model for scalable statistical learning
- A limited memory BFGS algorithm for non-convex minimization with applications in matrix largest eigenvalue problem
- Lifted generative learning of Markov logic networks
- Semi-parametric estimation of multivariate extreme expectiles
- Fitting very large sparse Gaussian graphical models
- An active set limited memory BFGS algorithm for bound constrained optimization
- A limited memory steepest descent method
- Scaled memoryless symmetric rank one method for large-scale optimization
- Representations of quasi-Newton matrices and their use in limited memory methods
- An efficient surrogate model for emulation and physics extraction of large eddy simulations
- An active-set projected trust region algorithm for box constrained optimization problems
- Optimization theory and methods. Nonlinear programming
- A constrained optimization algorithm for total energy minimization in electronic structure calculations
- A quasi-Newton approach to nonsmooth convex optimization problems in machine learning
- Active control and drag optimization for flow past a circular cylinder. I: Oscillatory cylinder rotation
- Smoothing nonlinear conjugate gradient method for image restoration using nonsmooth nonconvex minimization
- A limited memory BFGS-type method for large-scale unconstrained optimization
- Gradient trust region algorithm with limited memory BFGS update for nonsmooth convex minimization
- Computational optimization of systems governed by partial differential equations
- Global convergence of online limited memory BFGS
- A numerical study of limited memory BFGS methods
- SGD-QN: careful quasi-Newton stochastic gradient descent
- On the behavior of the gradient norm in the steepest descent method
- A survey of nonlinear conjugate gradient methods
- Newton's Method for Large Bound-Constrained Optimization Problems
- Optimal control of flow with discontinuities.
- On the resolution of monotone complementarity problems
- On efficiently computing the eigenvalues of limited-memory quasi-Newton matrices
- Conjugate gradient methods with Armijo-type line searches.
- An efficient multigrid strategy for large-scale molecular mechanics optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Computing eigenelements of real symmetric matrices via optimization
- Optimal control of the unsteady Navier-Stokes equations
- Spectral gradient projection method for solving nonlinear monotone equations
- New limited memory bundle method for large-scale nonsmooth optimization
- CUTE
- Comparison of Gaussian process modeling software
- A practical PR+ conjugate gradient method only using gradient
- The variational Kalman filter and an efficient implementation using limited memory BFGS
- A Numerical Study of the Limited Memory BFGS Method and the Truncated-Newton Method for Large Scale Optimization
- Symplectic Gaussian process regression of maps in Hamiltonian systems
- Algorithm 809: PREQN
- CUTEr and SifDec
- On optimal solution error covariances in variational data assimilation problems
- An efficient method for nonlinearly constrained networks
- An unconstrained smooth minimization reformulation of the second-order cone complementarity problem
- Title not available
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- iNEOS: An interactive environment for nonlinear optimization
- Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search
- Markov logic networks
- Beta autoregressive fractionally integrated moving average models
- MAGMA: inference and prediction using multi-task Gaussian processes with common mean
- Optical tomography: forward and inverse problems
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Truncated-Newton algorithms for large-scale unconstrained optimization
- Line search algorithms with guaranteed sufficient decrease
- Newton-Type Minimization via the Lanczos Method
- Optimizing the Delivery of Radiation Therapy to Cancer Patients
- Dynamic scaling based preconditioning for truncated Newton methods in large scale unconstrained optimization
- A new three-term conjugate gradient algorithm for unconstrained optimization
- A New Active Set Algorithm for Box Constrained Optimization
- On the limited memory BFGS method for large scale optimization
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Title not available
- Some descent three-term conjugate gradient methods and their global convergence
- Hilbert class library: A library of abstract C++ classes for optimization and inversion
- A fast algorithm for sparse reconstruction based on shrinkage, subspace optimization, and continuation
- Numerical experiments with the Lancelot package (Release A) for large-scale nonlinear optimization
- Algorithm 943: MSS: MATLAB software for L-BFGS trust-region subproblems for large-scale optimization
- Some numerical experiments with variable-storage quasi-Newton algorithms
- Scaled Gaussian stochastic process for computer model calibration and prediction
- A coordinate gradient descent method for nonsmooth separable minimization
- A theoretical framework of the scaled Gaussian stochastic process in prediction and calibration
- Acceleration of conjugate gradient algorithms for unconstrained optimization
- Automatic Preconditioning by Limited Memory Quasi-Newton Updating
This page was built for software: L-BFGS