Adaptive quadratically regularized Newton method for Riemannian optimization
From MaRDI portal
Publication:3176355
Abstract: Optimization on Riemannian manifolds arises widely in eigenvalue computation, density functional theory, Bose-Einstein condensates, low-rank nearest correlation matrix estimation, image registration, and signal processing. We propose an adaptive regularized Newton method which approximates the original objective function by its second-order Taylor expansion in Euclidean space while keeping the Riemannian manifold constraints. The regularization term in the objective function of the subproblem enables us to establish a Cauchy-point-like condition, as in standard trust-region methods, for proving global convergence. The subproblem can be solved inexactly either by first-order methods or by a modified Riemannian Newton method. In the latter case, it can further take advantage of negative curvature directions. Both global convergence and superlinear local convergence are guaranteed under mild conditions. Extensive computational experiments and comparisons with other state-of-the-art methods indicate that the proposed algorithm is very promising.
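The scheme described in the abstract (a regularized second-order model in the ambient Euclidean space, a retraction back to the manifold, and a trust-region-style ratio test that adapts the regularization parameter) can be sketched on a toy problem. The sketch below minimizes the Rayleigh quotient \(x^\top A x\) over the unit sphere; all function names, parameter values, and the acceptance thresholds are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def arnt_sphere(A, x0, sigma=1.0, tol=1e-8, max_iter=100):
    """Minimal sketch of an adaptive quadratically regularized Newton
    method for min_{||x||=1} x^T A x.  Illustrative only: parameter
    choices and update rules are assumptions, not the paper's algorithm."""
    x = x0 / np.linalg.norm(x0)
    n = len(x)
    f = lambda y: y @ A @ y
    for _ in range(max_iter):
        g_e = 2 * A @ x                        # Euclidean gradient
        P = np.eye(n) - np.outer(x, x)         # projector onto the tangent space at x
        g = P @ g_e                            # Riemannian gradient
        if np.linalg.norm(g) < tol:
            break
        # Quadratically regularized Newton subproblem on the tangent space:
        #   min_d  g^T d + 0.5 d^T (2A) d + 0.5 * sigma * ||d||^2,   d ⟂ x
        H = 2 * A + sigma * np.eye(n)          # regularization folded into H
        # Adding x x^T pins the normal component, so the solve stays tangent.
        d = -np.linalg.solve(P @ H @ P + np.outer(x, x), g)
        d = P @ d                              # re-project for numerical safety
        x_new = (x + d) / np.linalg.norm(x + d)  # retraction back to the sphere
        # Trust-region-style ratio: actual vs. predicted reduction.
        pred = -(g @ d + 0.5 * d @ (H @ d))
        ared = f(x) - f(x_new)
        rho = ared / max(pred, 1e-16)
        if rho > 0.1:                          # accept the step
            x = x_new
            if rho > 0.75:
                sigma = max(sigma / 2, 1e-6)   # model is good: relax regularization
        else:
            sigma *= 4                         # model is poor: regularize harder
    return x, f(x)
```

On a diagonal matrix the minimizer is the eigenvector of the smallest eigenvalue, which makes the sketch easy to check; steps are only accepted when the model predicts and the objective delivers a decrease, so the iteration is monotone, mirroring the Cauchy-point-style globalization the abstract refers to.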
Recommendations
- A Broyden class of quasi-Newton methods for Riemannian optimization
- Riemannian stochastic variance-reduced cubic regularized Newton method for submanifold optimization
- Faster Riemannian Newton-type optimization by subsampling and cubic regularization
- Sequential quadratic optimization for nonlinear optimization problems on Riemannian manifolds
- Adaptive regularization with cubics on manifolds
Cites work
- scientific article; zbMATH DE number 681023
- scientific article; zbMATH DE number 5223994
- scientific article; zbMATH DE number 5060482
- A Broyden class of quasi-Newton methods for Riemannian optimization
- A New First-Order Algorithmic Framework for Optimization Problems with Orthogonality Constraints
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Riemannian BFGS method for nonconvex optimization problems
- A Riemannian symmetric rank-one trust-region method
- A feasible method for optimization with orthogonality constraints
- A framework of constraint preserving update schemes for optimization on Stiefel manifold
- A proximal gradient method for ensemble density functional theory
- A regularized Newton method for computing ground states of Bose-Einstein condensates
- Accelerated methods for nonconvex optimization
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive regularized self-consistent field iteration with exact Hessian for electronic structure calculation
- Adaptive subgradient methods for online learning and stochastic optimization
- Algebraic rules for quadratic regularization of Newton's method
- An Extrinsic Look at the Riemannian Hessian
- An inexact successive quadratic approximation method for L-1 regularized optimization
- Exploiting negative curvature directions in linesearch methods for unconstrained optimization
- Finding approximate local minima faster than gradient descent
- Globally convergent optimization algorithms on Riemannian manifolds: Uniform framework for unconstrained and constrained optimization
- Gradient type optimization methods for electronic structure calculations
- KSSOLV -- a MATLAB toolbox for solving the Kohn-Sham equations
- Low-rank matrix completion by Riemannian optimization
- Low-rank tensor completion by Riemannian optimization
- Manopt, a Matlab toolbox for optimization on manifolds
- Minimizing a differentiable function over a differential manifold
- Optimality conditions for the nonlinear programming problems on Riemannian manifolds
- Optimization Techniques on Riemannian Manifolds
- Optimization methods on Riemannian manifolds and their application to shape space
- Projection-like retractions on matrix manifolds
- Proximal Newton-type methods for minimizing composite functions
- The Conjugate Gradient Method and Trust Regions in Large Scale Optimization
- The Geometry of Algorithms with Orthogonality Constraints
- The Riemannian Barzilai-Borwein method with nonmonotone line search and the matrix geometric mean computation
- The use of quadratic regularization with a cubic descent condition for unconstrained optimization
- Trust-region methods on Riemannian manifolds
Cited in (37)
- Orthogonal canonical correlation analysis and applications
- An adaptive regularized proximal Newton-type method for composite optimization over the Stiefel manifold
- scientific article; zbMATH DE number 5657033
- Newton-based methods for finding the positive ground state of Gross-Pitaevskii equations
- An extended delayed weighted gradient algorithm for solving strongly convex optimization problems
- A communication-efficient and privacy-aware distributed algorithm for sparse PCA
- A brief introduction to manifold optimization
- Finding the global optimum of a class of quartic minimization problem
- Adaptive regularization with cubics on manifolds
- Global convergence of Riemannian line search methods with a Zhang-Hager-type condition
- On the geometric analysis of a quartic-quadratic optimization problem under a spherical constraint
- An exact penalty approach for optimization with nonnegative orthogonality constraints
- Energy-adaptive Riemannian optimization on the Stiefel manifold
- New vector transport operators extending a Riemannian CG algorithm to generalized Stiefel manifold with low-rank applications
- Proximal quasi-Newton method for composite optimization over the Stiefel manifold
- An entropy-regularized ADMM for binary quadratic programming
- A class of smooth exact penalty function methods for optimization problems with orthogonality constraints
- Riemannian smoothing gradient type algorithms for nonsmooth optimization problem on compact Riemannian submanifold embedded in Euclidean space
- Ground states of spin-\(F\) Bose-Einstein condensates
- Riemannian Newton methods for energy minimization problems of Kohn-Sham type
- Riemannian preconditioning
- A trust region method for solving multicriteria optimization problems on Riemannian manifolds
- Riemannian stochastic variance-reduced cubic regularized Newton method for submanifold optimization
- Adaptive trust-region method on Riemannian manifold
- An efficient method for solving a class of matrix trace function minimization problem in multivariate statistical
- A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds
- Exact penalty function for \(\ell_{2,1}\) norm minimization over the Stiefel manifold
- Effective algorithms for solving trace minimization problem in multivariate statistics
- Faster Riemannian Newton-type optimization by subsampling and cubic regularization
- Riemannian trust region methods for \(\mathrm{SC}^1\) minimization
- Structured Quasi-Newton Methods for Optimization with Orthogonality Constraints
- Adaptive strategy for the damping parameters in an iteratively regularized Gauss-Newton method
- Sequential quadratic optimization for nonlinear optimization problems on Riemannian manifolds
- A penalty-free infeasible approach for a class of nonsmooth optimization problems over the Stiefel manifold
- A Riemannian dimension-reduced second-order method with application in sensor network localization
- A Decomposition Augmented Lagrangian Method for Low-Rank Semidefinite Programming
- Riemannian Natural Gradient Methods