Block coordinate proximal gradient methods with variable Bregman functions for nonsmooth separable optimization
DOI: 10.1007/s10107-015-0969-z
zbMATH Open: 1352.49032
OpenAlex: W2279141731
MaRDI QID: Q344922
Authors: Xiaoqin Hua, Nobuo Yamashita
Publication date: 25 November 2016
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-015-0969-z
Recommendations
- A coordinate gradient descent method for nonsmooth separable minimization
- A unified convergence analysis of block successive minimization methods for nonsmooth optimization
- Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems
- A coordinate gradient descent method for nonsmooth nonseparable minimization
- A block coordinate gradient descent method for regularized convex separable optimization and covariance selection
Mathematics Subject Classification:
- 65K05 Numerical mathematical programming methods
- 90C25 Convex programming
- 90C06 Large-scale problems in mathematical programming
- 90C26 Nonconvex programming, global optimization
- 90C30 Nonlinear programming
- 49M37 Numerical methods based on nonlinear programming
Cites Work
- Pathwise coordinate optimization
- Coordinate descent algorithms for lasso penalized regression
- Convergence of a block coordinate descent method for nondifferentiable minimization
- On the Implementation of a Primal-Dual Interior Point Method
- Atomic Decomposition by Basis Pursuit
- Model Selection and Estimation in Regression with Grouped Variables
- Introductory lectures on convex optimization. A basic course.
- The Group Lasso for Logistic Regression
- Mirror descent and nonlinear projected subgradient methods for convex optimization.
- A coordinate gradient descent method for nonsmooth separable minimization
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- Accelerated block-coordinate relaxation for regularized optimization
- Convergence Analysis of a Proximal-Like Minimization Algorithm Using Bregman Functions
- Title not available
- Title not available
- A sequential quadratic programming algorithm for nonconvex, nonsmooth constrained optimization
- Title not available
- On the Solution of Large Quadratic Programming Problems with Bound Constraints
Cited In (6)
- Blocks of coordinates, stochastic programming, and markets
- Bregman methods for large-scale optimization with applications in imaging
- Testing and non-linear preconditioning of the proximal point method
- Block coordinate descent for smooth nonconvex constrained minimization
- Emergence of price-taking behavior
- The convergence properties of infeasible inexact proximal alternating linearized minimization