A New Homotopy Proximal Variable-Metric Framework for Composite Convex Minimization
Publication:5076711
DOI: 10.1287/moor.2021.1138 · zbMath: 1498.90159 · arXiv: 1812.05243 · OpenAlex: W3204737734 · MaRDI QID: Q5076711
Ling Liang, Quoc Tran-Dinh, Kim-Chuan Toh
Publication date: 17 May 2022
Published in: Mathematics of Operations Research
Full work available at URL: https://arxiv.org/abs/1812.05243
Keywords: homotopy method; linear convergence rate; composite convex minimization; finite iteration complexity; primal-dual-primal framework; proximal variable-metric algorithm
MSC: Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Sensitivity, stability, parametric optimization (90C31); Computational methods for problems pertaining to operations research and mathematical programming (90-08)
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- Distance-Weighted Discrimination
- Fast Algorithms for Large-Scale Generalized Distance Weighted Discrimination
- Smooth minimization of non-smooth functions
- Sparse inverse covariance estimation with the graphical lasso
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Gradient methods for minimizing composite functions
- Fast alternating linearization methods for minimizing the sum of two convex functions
- First-order methods of smooth convex optimization with inexact oracle
- SDPNAL+: a majorized semismooth Newton-CG augmented Lagrangian method for semidefinite programming with nonnegative constraints
- Efficient evaluation of scaled proximal operators
- An inexact interior point method for \(L_{1}\)-regularized sparse covariance selection
- Accelerating the cubic regularization of Newton's method on convex problems
- Introductory lectures on convex optimization. A basic course.
- Templates for convex cone problems with applications to sparse signal recovery
- A first-order primal-dual algorithm for convex problems with applications to imaging
- Finite-sample analysis of \(M\)-estimators using self-concordance
- Self-concordant inclusions: a unified framework for path-following generalized Newton-type algorithms
- Sparse Poisson regression with penalized weighted score function
- Generalized self-concordant functions: a recipe for Newton-type methods
- Coordinate descent algorithms
- Forward-backward quasi-Newton methods for nonsmooth optimization problems
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Cubic regularization of Newton method and its global performance
- Adaptive Lasso and group-Lasso for functional Poisson regression
- Smoothing and First Order Methods: A Unified Framework
- Proximal Newton-Type Methods for Minimizing Composite Functions
- Convergence Rate Analysis of the Forward-Douglas-Rachford Splitting Scheme
- Efficiency of the Accelerated Coordinate Descent Method on Structured Optimization Problems
- A Newton-CG Augmented Lagrangian Method for Semidefinite Programming
- Newton Methods for Nonlinear Problems
- The Split Bregman Method for L1-Regularized Problems
- Accelerated, Parallel, and Proximal Coordinate Descent
- Approximate D-optimal designs of experiments on the convex hull of a finite set of information matrices
- Sparse Reconstruction by Separable Approximation
- A Smooth Primal-Dual Optimization Framework for Nonsmooth Composite Convex Minimization
- A Highly Efficient Semismooth Newton Augmented Lagrangian Method for Solving Lasso Problems
- Rate of Convergence Analysis of Decomposition Methods Based on the Proximal Method of Multipliers for Convex Minimization
- An Inexact Proximal Path-Following Algorithm for Constrained Convex Minimization
- Regularization and Variable Selection Via the Elastic Net
- Fast Gradient-Based Algorithms for Constrained Total Variation Image Denoising and Deblurring Problems
- Poisson Image Reconstruction With Hessian Schatten-Norm Regularization
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- Computing Optimal Experimental Designs via Interior Point Method
- Composite Self-Concordant Minimization
- IMRO: A Proximal Quasi-Newton Method for Solving $\ell_1$-Regularized Least Squares Problems
- Convex analysis and monotone operator theory in Hilbert spaces