Generalized self-concordant functions: a recipe for Newton-type methods
DOI: 10.1007/s10107-018-1282-4
zbMath: 1430.90464
arXiv: 1703.04599
OpenAlex: W2603538444
MaRDI QID: Q2330645
Publication date: 22 October 2019
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1703.04599
Keywords: global convergence; convex optimization; quadratic convergence; local convergence; Newton-type method; proximal Newton method; generalized self-concordance
MSC: Convex programming (90C25); Computational methods for problems pertaining to operations research and mathematical programming (90-08)
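For context on the keyword "generalized self-concordance", the following is a minimal sketch of the notion as developed in the arXiv preprint linked above; the parameters \(M_f \ge 0\) and \(\nu \in [2,3]\) and the local norm \(\|\cdot\|_x\) are taken from that preprint, not from this record. A three times continuously differentiable convex function \(f\) is called \((M_f,\nu)\)-generalized self-concordant if

% Sketch only: M_f, nu, and the local norm are assumptions drawn from the
% linked arXiv preprint (1703.04599), not fields of this catalog record.
\[
  \bigl|\langle \nabla^3 f(x)[v]\,u,\ u\rangle\bigr|
  \;\le\; M_f\,\|u\|_x^{2}\,\|v\|_x^{\nu-2}\,\|v\|_2^{3-\nu}
  \quad\text{for all } u, v,
  \qquad\text{where } \|u\|_x := \langle \nabla^2 f(x)\,u,\ u\rangle^{1/2}.
\]

Taking \(\nu = 3\) and \(M_f = 2\) recovers classical self-concordance, while \(\nu = 2\) covers losses such as the logistic loss (cf. "Self-concordant analysis for logistic regression" among the cited works below).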
Related Items (11)
- A Newton Frank-Wolfe method for constrained self-concordant minimization
- A New Homotopy Proximal Variable-Metric Framework for Composite Convex Minimization
- Semi-discrete optimal transport: hardness, regularization and numerical solution
- SCORE: approximating curvature information under self-concordant regularization
- The method of randomized Bregman projections for stochastic feasibility problems
- Generalized self-concordant analysis of Frank-Wolfe algorithms
- Differentially private inference via noisy optimization
- Finite-sample analysis of \(M\)-estimators using self-concordance
- Composite convex optimization with global and local inexact oracles
- Greedy Quasi-Newton Methods with Explicit Superlinear Convergence
- Unnamed Item
Uses Software
Cites Work
- Finding structure with randomness: Probabilistic algorithms for constructing approximate matrix decompositions
- Distance-Weighted Discrimination
- A Stochastic Quasi-Newton Method for Large-Scale Optimization
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Gradient methods for minimizing composite functions
- Efficient evaluation of scaled proximal operators
- Accelerating the cubic regularization of Newton's method on convex problems
- Regularized Newton method for unconstrained convex optimization
- Local analysis of Newton-type methods for variational inequalities and nonlinear programming
- Introductory lectures on convex optimization. A basic course.
- Sub-sampled Newton methods
- Templates for convex cone problems with applications to sparse signal recovery
- Methods for scaling to doubly stochastic form
- Self-concordant analysis for logistic regression
- Balancing sparse matrices for computing eigenvalues
- Path-following gradient-based decomposition algorithms for separable convex optimization
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- Self-concordant inclusions: a unified framework for path-following generalized Newton-type algorithms
- Adaptive restart for accelerated gradient schemes
- Cubic regularization of Newton method and its global performance
- A class of stochastic programs with decision dependent uncertainty
- On the Implementation and Usage of SDPT3 – A Matlab Software Package for Semidefinite-Quadratic-Linear Programming, Version 4.0
- Adaptivity of averaged stochastic gradient descent to local strong convexity for logistic regression
- Proximal Newton-Type Methods for Minimizing Composite Functions
- Newton Sketch: A Near Linear-Time Optimization Algorithm with Linear-Quadratic Convergence
- Newton Methods for Nonlinear Problems
- A Hybrid Proximal Extragradient Self-Concordant Primal Barrier Method for Monotone Variational Inequalities
- Strongly Regular Generalized Equations
- Quasi-Newton methods: superlinear convergence without line searches for self-concordant functions
- Implementation and evaluation of SDPA 6.0 (Semidefinite Programming Algorithm 6.0)
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- Iterative Solution of Nonlinear Equations in Several Variables
- Time-Optimal Path Tracking for Robots: A Convex Optimization Approach
- An Inexact Perturbed Path-Following Method for Lagrangian Decomposition in Large-Scale Separable Convex Optimization
- Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization
- Composite Convex Minimization Involving Self-concordant-Like Cost Functions
- Composite Self-Concordant Minimization
- Exact and inexact subsampled Newton methods for optimization
- Convex analysis and monotone operator theory in Hilbert spaces
- Benchmarking optimization software with performance profiles.