A Riemannian dimension-reduced second-order method with application in sensor network localization
From MaRDI portal
Publication:6562381
Recommendations
- Riemannian stochastic variance-reduced cubic regularized Newton method for submanifold optimization
- Adaptive quadratically regularized Newton method for Riemannian optimization
- Faster Riemannian Newton-type optimization by subsampling and cubic regularization
- A feasible method for sensor network localization
- Sequential quadratic optimization for nonlinear optimization problems on Riemannian manifolds
Cites work
- scientific article; zbMATH DE number 5060482
- A Broyden class of quasi-Newton methods for Riemannian optimization
- A Dai-Yuan-type Riemannian conjugate gradient method with the weak Wolfe conditions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A Feasible Method for Solving an SDP Relaxation of the Quadratic Knapsack Problem
- A Unified Theorem on SDP Rank Reduction
- A distributed method for solving semidefinite programs arising from ad hoc wireless sensor network localization
- A new, globally convergent Riemannian conjugate gradient method
- A nonlinear programming algorithm for solving semidefinite programs via low-rank factorization
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Adaptive quadratically regularized Newton method for Riemannian optimization
- Adaptive regularization with cubics on manifolds
- An inertial Newton algorithm for deep learning
- Computation of ground states of the Gross-Pitaevskii functional via Riemannian optimization
- Cubic regularization of Newton method and its global performance
- Elliptic preconditioner for accelerating the self-consistent field iteration in Kohn-Sham density functional theory
- Error estimates for iterative algorithms for minimizing regularized quadratic subproblems
- Estimating the Largest Eigenvalue by the Power and Lanczos Algorithms with a Random Start
- Finding stationary points on bounded-rank matrices: a geometric hurdle and a smooth remedy
- Global rates of convergence for nonconvex optimization on manifolds
- Local minima and convergence in low-rank semidefinite programming
- Low-rank optimization on the cone of positive semidefinite matrices
- Manopt, a Matlab toolbox for optimization on manifolds
- Newton's method on Riemannian manifolds and a geometric model for the human spine
- On solving trust-region and other regularised subproblems in optimization
- On the complexity of steepest descent, Newton's and regularized Newton's methods for nonconvex unconstrained optimization problems
- On the convergence of the self-consistent field iteration in Kohn-Sham density functional theory
- On the rank of extreme matrices in semidefinite programs and the multiplicity of optimal eigenvalues
- Preconditioned low-rank Riemannian optimization for linear systems with tensor product structure
- QSDPNAL: a two-phase augmented Lagrangian method for convex quadratic semidefinite programming
- Riemannian optimization for high-dimensional tensor completion
- Robust low-rank matrix completion by Riemannian optimization
- Solving Euclidean distance matrix completion problems via semidefinite programming
- Solving Large-Scale Cubic Regularization by a Generalized Eigenvalue Problem
- Solving graph equipartition SDPs on an algebraic variety
- Solving the Trust-Region Subproblem using the Lanczos Method
- The Molecule Problem: Exploiting Structure in Global Optimization
- Theory of semidefinite programming for sensor network localization
- Trust-region methods on Riemannian manifolds