Generalized self-concordant functions: a recipe for Newton-type methods
From MaRDI portal
Publication:2330645
Abstract: We study the smooth structure of convex functions by generalizing self-concordance, a powerful concept introduced by Nesterov and Nemirovskii in the early 1990s, to a broader class of convex functions, which we call generalized self-concordant functions. This notion allows us to develop a unified framework for designing Newton-type methods to solve convex optimization problems. The proposed theory provides a mathematical tool to analyze both the local and global convergence of Newton-type methods without imposing unverifiable assumptions, as long as the underlying functionals fall into our generalized self-concordant function class. First, we introduce the class of generalized self-concordant functions, which covers standard self-concordant functions as a special case. Next, we establish several properties and key estimates of this function class, which can be used to design numerical methods. Then, we apply this theory to develop several Newton-type methods for solving a class of smooth convex optimization problems involving generalized self-concordant functions. We provide an explicit step size for the damped-step Newton-type scheme that guarantees global convergence without performing any globalization strategy. We also prove local quadratic convergence of this method and its full-step variant without requiring Lipschitz continuity of the objective's Hessian. We then extend our results to develop proximal Newton-type methods for a class of composite convex minimization problems involving generalized self-concordant functions, again achieving both global and local convergence without additional assumptions. Finally, we verify our theoretical results on several numerical examples and compare them with existing methods.
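The damped-step scheme summarized in the abstract can be illustrated with a minimal sketch. The code below uses the classical explicit step size t_k = 1/(1 + lambda_k), where lambda_k is the Newton decrement, which is the well-known choice for standard self-concordant functions (the paper derives analogous explicit steps for the broader generalized class). The function names and the log-barrier test problem are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def damped_newton(grad, hess, x0, tol=1e-8, max_iter=100):
    """Damped-step Newton iterations with an explicit step size and no
    line search or other globalization strategy (sketch, assuming the
    classical step t = 1/(1 + lambda) for self-concordant objectives)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        H = hess(x)
        d = np.linalg.solve(H, g)      # Newton direction H^{-1} g
        lam = np.sqrt(g @ d)           # Newton decrement lambda_k
        if lam < tol:                  # stop once the decrement is tiny
            break
        x = x - d / (1.0 + lam)       # explicit damped step, no line search
    return x

# Illustrative test problem: the log-barrier
#   f(x) = -sum_i [ log(1 - x_i) + log(1 + x_i) ],
# a standard self-concordant function on (-1, 1)^n, minimized at x = 0.
grad = lambda x: 1.0 / (1.0 - x) - 1.0 / (1.0 + x)
hess = lambda x: np.diag(1.0 / (1.0 - x) ** 2 + 1.0 / (1.0 + x) ** 2)

x_star = damped_newton(grad, hess, x0=np.array([0.5, -0.3]))
```

Because the step length shrinks automatically when the decrement is large, the iterates stay in the barrier's domain and converge globally; once the decrement is small the step approaches a full Newton step, recovering local quadratic convergence.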
Recommendations
- Self-concordant inclusions: a unified framework for path-following generalized Newton-type algorithms
- Composite convex minimization involving self-concordant-like cost functions
- A Newton Frank-Wolfe method for constrained self-concordant minimization
- Globalized inexact proximal Newton-type methods for nonconvex composite functions
- Generalized self-concordant analysis of Frank-Wolfe algorithms
Cites work
- scientific article, zbMATH DE number 47310 (title unavailable)
- scientific article, zbMATH DE number 1332320 (title unavailable)
- scientific article, zbMATH DE number 729680 (title unavailable)
- scientific article, zbMATH DE number 1547349 (title unavailable)
- scientific article, zbMATH DE number 1862745 (title unavailable)
- scientific article, zbMATH DE number 2107836 (title unavailable)
- scientific article, zbMATH DE number 5060482 (title unavailable)
- scientific article, zbMATH DE number 2243362 (title unavailable)
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A class of stochastic programs with decision dependent uncertainty
- A hybrid proximal extragradient self-concordant primal barrier method for monotone variational inequalities
- A stochastic quasi-Newton method for large-scale optimization
- Accelerating the cubic regularization of Newton's method on convex problems
- Adaptive restart for accelerated gradient schemes
- Adaptivity of averaged stochastic gradient descent to local strong convexity for logistic regression
- An inexact perturbed path-following method for Lagrangian decomposition in large-scale separable convex optimization
- Balancing sparse matrices for computing eigenvalues
- Benchmarking optimization software with performance profiles
- Composite convex minimization involving self-concordant-like cost functions
- Composite self-concordant minimization
- Convex analysis and monotone operator theory in Hilbert spaces
- Cubic regularization of Newton method and its global performance
- Disciplined convex programming
- Distance-Weighted Discrimination
- Efficient evaluation of scaled proximal operators
- Exact and inexact subsampled Newton methods for optimization
- Finding structure with randomness: probabilistic algorithms for constructing approximate matrix decompositions
- Gradient methods for minimizing composite functions
- Implementation and evaluation of SDPA 6.0 (Semidefinite Programming Algorithm 6.0)
- Introductory lectures on convex optimization. A basic course.
- Iterative Solution of Nonlinear Equations in Several Variables
- Local analysis of Newton-type methods for variational inequalities and nonlinear programming
- Methods for scaling to doubly stochastic form
- Newton Sketch: A Near Linear-Time Optimization Algorithm with Linear-Quadratic Convergence
- Newton methods for nonlinear problems. Affine invariance and adaptive algorithms.
- On the implementation and usage of SDPT3 -- a Matlab software package for semidefinite-quadratic-linear programming, version 4.0
- Path-following gradient-based decomposition algorithms for separable convex optimization
- Proximal Newton-type methods for minimizing composite functions
- Quasi-Newton methods: superlinear convergence without line searches for self-concordant functions
- Randomized block proximal damped Newton method for composite self-concordant minimization
- Regularized Newton method for unconstrained convex optimization
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- Self-concordant analysis for logistic regression
- Self-concordant inclusions: a unified framework for path-following generalized Newton-type algorithms
- Smooth minimization of non-smooth functions
- Strongly Regular Generalized Equations
- Sub-sampled Newton methods
- Templates for convex cone problems with applications to sparse signal recovery
- Time-Optimal Path Tracking for Robots: A Convex Optimization Approach
Cited in (16)
- Randomized block proximal damped Newton method for composite self-concordant minimization
- Composite convex optimization with global and local inexact oracles
- The method of randomized Bregman projections for stochastic feasibility problems
- Finite-sample analysis of \(M\)-estimators using self-concordance
- Self-concordant inclusions: a unified framework for path-following generalized Newton-type algorithms
- Scalable Frank-Wolfe on generalized self-concordant functions via simple steps
- Semi-discrete optimal transport: hardness, regularization and numerical solution
- A new homotopy proximal variable-metric framework for composite convex minimization
- Composite convex minimization involving self-concordant-like cost functions
- SCORE: approximating curvature information under self-concordant regularization
- Greedy quasi-Newton methods with explicit superlinear convergence
- scientific article, zbMATH DE number 7415081 (title unavailable)
- Optimal step length for the Newton method: case of self-concordant functions
- A Newton Frank-Wolfe method for constrained self-concordant minimization
- Generalized self-concordant analysis of Frank-Wolfe algorithms
- Differentially private inference via noisy optimization