Randomized block proximal damped Newton method for composite self-concordant minimization
DOI: 10.1137/16M1082767
zbMATH Open: 1375.49040
arXiv: 1607.00101
OpenAlex: W2963930582
MaRDI QID: Q5355205
Author: Zhaosong Lu
Publication date: 7 September 2017
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1607.00101
Recommendations
- Generalized self-concordant functions: a recipe for Newton-type methods
- Composite self-concordant minimization
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- An accelerated randomized proximal coordinate gradient method and its application to regularized empirical risk minimization
- Newton Sketch: A Near Linear-Time Optimization Algorithm with Linear-Quadratic Convergence
Keywords: convergence; convex optimization; iteration complexity; damped Newton method; composite self-concordant minimization; proximal damped Newton method; randomized block proximal damped Newton method
MSC classification: Numerical mathematical programming methods (65K05); Learning and adaptive systems in artificial intelligence (68T05); Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Interior-point methods (90C51); Methods involving semicontinuity and convergence; relaxation (49J45); Newton-type methods (49M15)
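For context, the problem class named in the keywords is composite self-concordant minimization. A standard template (following the composite self-concordant framework cited below; the randomized block-coordinate details are specific to the paper itself) is:

```latex
\min_{x \in \mathbb{R}^n} \; F(x) := f(x) + g(x),
```

where \(f\) is a (standard) self-concordant function and \(g\) is a closed convex, possibly nonsmooth, block-separable regularizer. A proximal damped Newton step first solves the scaled proximal subproblem and then damps the update by the proximal Newton decrement \(\lambda(x)\):

```latex
\Delta x = \arg\min_{d} \; \nabla f(x)^{\top} d
  + \tfrac{1}{2}\, d^{\top} \nabla^2 f(x)\, d + g(x + d),
\qquad
\lambda(x) = \bigl(\Delta x^{\top} \nabla^2 f(x)\, \Delta x\bigr)^{1/2},
\qquad
x^{+} = x + \frac{1}{1 + \lambda(x)}\, \Delta x .
```

The damping factor \(1/(1+\lambda(x))\) is what yields global convergence without line search for self-concordant \(f\); the method of this paper applies such a step to a randomly chosen block of coordinates at each iteration.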
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Probing the Pareto frontier for basis pursuit solutions
- Title not available
- Introductory lectures on convex optimization. A basic course.
- Sparse inverse covariance estimation with the graphical lasso
- Model selection and estimation in the Gaussian graphical model
- Gradient methods for minimizing composite functions
- First-Order Methods for Sparse Covariance Selection
- A coordinate gradient descent method for nonsmooth separable minimization
- Sparse Reconstruction by Separable Approximation
- Alternating direction algorithms for \(\ell_1\)-problems in compressive sensing
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Accelerated block-coordinate relaxation for regularized optimization
- Efficiency of coordinate descent methods on huge-scale optimization problems
- Coordinate descent method for large-scale L2-loss linear support vector machines
- Randomized methods for linear constraints: convergence rates and conditioning
- Accelerated, parallel, and proximal coordinate descent
- An Asynchronous Parallel Stochastic Coordinate Descent Algorithm
- Stochastic dual coordinate ascent methods for regularized loss minimization
- On the convergence of block coordinate descent type methods
- Adaptive First-Order Methods for General Sparse Inverse Covariance Selection
- Fixed-Point Continuation Applied to Compressed Sensing: Implementation and Numerical Experiments
- Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization
- A semismooth Newton method with multidimensional filter globalization for \(l_1\)-optimization
- An algorithm for quadratic \(\ell_1\)-regularized optimization with a flexible active-set strategy
- On the complexity analysis of randomized block-coordinate descent methods
- Block coordinate descent methods for semidefinite programming
- Composite self-concordant minimization
- A proximal-gradient homotopy method for the sparse least-squares problem
- Iteration complexity analysis of block coordinate descent methods
- Proximal Newton-type methods for minimizing composite functions
- An inexact proximal path-following algorithm for constrained convex minimization
- An accelerated randomized proximal coordinate gradient method and its application to regularized empirical risk minimization
- A highly efficient semismooth Newton augmented Lagrangian method for solving lasso problems
- On efficiently solving the subproblems of a level-set method for fused lasso problems
- Communication-efficient distributed optimization of self-concordant empirical loss
- A randomized nonmonotone block proximal gradient method for a class of structured nonlinear programming
Cited In (4)