Block coordinate proximal gradient methods with variable Bregman functions for nonsmooth separable optimization
From MaRDI portal
Recommendations
- A coordinate gradient descent method for nonsmooth separable minimization
- A unified convergence analysis of block successive minimization methods for nonsmooth optimization
- Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems
- A coordinate gradient descent method for nonsmooth nonseparable minimization
- A block coordinate gradient descent method for regularized convex separable optimization and covariance selection
Cites work
- scientific article; zbMATH DE number 2005717 (no title available)
- A coordinate gradient descent method for nonsmooth separable minimization
- A sequential quadratic programming algorithm for nonconvex, nonsmooth constrained optimization
- Accelerated block-coordinate relaxation for regularized optimization
- An interior-point method for large-scale \(l_1\)-regularized logistic regression
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- Atomic Decomposition by Basis Pursuit
- Convergence Analysis of a Proximal-Like Minimization Algorithm Using Bregman Functions
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Coordinate descent algorithms for lasso penalized regression
- Exponentiated gradient algorithms for conditional random fields and max-margin Markov networks
- Introductory lectures on convex optimization. A basic course.
- Mirror descent and nonlinear projected subgradient methods for convex optimization.
- Model Selection and Estimation in Regression with Grouped Variables
- On the Implementation of a Primal-Dual Interior Point Method
- On the Solution of Large Quadratic Programming Problems with Bound Constraints
- Pathwise coordinate optimization
- The Group Lasso for Logistic Regression
Cited in (11 documents)
- A block coordinate gradient descent method for regularized convex separable optimization and covariance selection
- Block coordinate descent for smooth nonconvex constrained minimization
- The convergence properties of infeasible inexact proximal alternating linearized minimization
- Blocks of coordinates, stochastic programming, and markets
- Emergence of price-taking behavior
- Bregman methods for large-scale optimization with applications in imaging
- Block Bregman majorization minimization with extrapolation
- Analysis of a variable metric block coordinate method under proximal errors
- Truncated nonsmooth Newton multigrid methods for block-separable minimization problems
- Testing and non-linear preconditioning of the proximal point method
- A unified convergence analysis of block successive minimization methods for nonsmooth optimization
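As a rough illustration of the kind of method named in the title, the sketch below runs a cyclic block coordinate proximal gradient iteration on an l1-regularized least-squares problem. This is not the paper's algorithm: it fixes the Bregman function to the Euclidean one (so the Bregman proximal step reduces to ordinary soft-thresholding), uses a single conservative stepsize, and all names and the toy data are illustrative.

```python
import numpy as np

def soft_threshold(v, t):
    """Prox of t*||.||_1 (closed form for the l1 block term)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def bcpg_lasso(A, b, lam, blocks, step, n_iters=200):
    """Cyclic block coordinate proximal gradient for
    0.5*||A x - b||^2 + lam*||x||_1 (Euclidean Bregman function).

    The paper's setting allows the Bregman distance in the proximal
    step to vary per block and per iteration; here it is fixed.
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        for idx in blocks:                      # update one block at a time
            grad_i = A[:, idx].T @ (A @ x - b)  # partial gradient w.r.t. block
            # proximal gradient step on the selected block only
            x[idx] = soft_threshold(x[idx] - step * grad_i, step * lam)
    return x

# toy usage on a sparse recovery instance (illustrative data)
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[[1, 4]] = [2.0, -3.0]
b = A @ x_true
blocks = [np.arange(0, 5), np.arange(5, 10)]
step = 1.0 / np.linalg.norm(A, 2) ** 2  # conservative: 1 / ||A||_2^2
x_hat = bcpg_lasso(A, b, lam=0.1, blocks=blocks, step=step)
```

With only two blocks this degenerates toward alternating minimization; finer blocks (down to single coordinates) recover the classical coordinate-wise proximal gradient scheme discussed in several of the works cited above.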