A coordinate gradient descent method for nonsmooth separable minimization
DOI: 10.1007/s10107-007-0170-0
zbMATH Open: 1166.90016
OpenAlex: W2039050532
MaRDI QID: Q959979
Authors: Paul Tseng, Sangwoon Yun
Publication date: 16 December 2008
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-007-0170-0
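The paper presents a block-coordinate gradient descent (CGD) method for minimizing the sum of a smooth function and a separable nonsmooth (possibly nonconvex) function, with an Armijo-type line search. As a minimal illustration only, the sketch below implements the plain cyclic special case for \(\ell_1\)-regularized least squares (the lasso), using unit coordinate steps scaled by per-coordinate curvature; all names and step-size choices here are assumptions for the example, not the paper's exact scheme.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t*|x|: handles the nonsmooth separable term.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def cd_lasso(A, b, lam, n_iter=200):
    """Cyclic coordinate gradient descent for
    min_x 0.5*||A x - b||^2 + lam*||x||_1 (separable nonsmooth part)."""
    m, n = A.shape
    x = np.zeros(n)
    col_sq = (A ** 2).sum(axis=0)          # per-coordinate curvature ||A_j||^2
    r = b - A @ x                          # residual, maintained incrementally
    for _ in range(n_iter):
        for j in range(n):
            if col_sq[j] == 0.0:
                continue
            g = -A[:, j] @ r               # partial gradient of the smooth part
            z = x[j] - g / col_sq[j]       # coordinate gradient step
            x_new = soft_threshold(z, lam / col_sq[j])
            r += A[:, j] * (x[j] - x_new)  # keep residual consistent with x
            x[j] = x_new
    return x
```

For orthogonal columns (e.g. A the identity) the problem decouples and one full sweep already returns the exact solution x_j = soft_threshold(b_j, lam); the paper's method generalizes this idea to arbitrary smooth parts via quadratic models over coordinate blocks.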
Recommendations
- A coordinate gradient descent method for nonsmooth nonseparable minimization
- A coordinate gradient descent method for \(\ell_{1}\)-regularized convex minimization
- On the iteration complexity of cyclic coordinate gradient descent methods
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- A block coordinate gradient descent method for regularized convex separable optimization and covariance selection
MSC Classification
- Numerical mathematical programming methods (65K05)
- Convex programming (90C25)
- Methods of successive quadratic programming type (90C55)
- Large-scale problems in mathematical programming (90C06)
- Nonconvex programming, global optimization (90C26)
- Nonlinear programming (90C30)
- Numerical methods based on nonlinear programming (49M37)
- Decomposition methods (49M27)
Cites Work
- Testing Unconstrained Optimization Software
- Algorithm 778: L-BFGS-B
- CUTEr and SifDec
- Numerical Optimization
- Ideal spatial adaptation by wavelet shrinkage
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Adapting to Unknown Smoothness via Wavelet Shrinkage
- Atomic Decomposition by Basis Pursuit
- Variational Analysis
- Model Selection and Estimation in Regression with Grouped Variables
- The Group Lasso for Logistic Regression
- Updating Quasi-Newton Matrices with Limited Storage
- Convex Analysis
- A coordinate gradient descent method for nonsmooth separable minimization
- A method for minimizing the sum of a convex function and a continuously differentiable function
- A minimization method for the sum of a convex function and a continuously differentiable function
- A successive quadratic programming method for a class of constrained nonsmooth optimization problems
- Linear convergence of epsilon-subgradient descent methods for a class of convex functions
- On the convergence of the block nonlinear Gauss-Seidel method under convex constraints
- Some continuity properties of polyhedral multifunctions
- A generalized proximal point algorithm for certain non-convex minimization problems
- Trust Region Methods
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- On the Accurate Identification of Active Constraints
- Iterative Solution of Nonlinear Equations in Several Variables
- On search directions for minimization algorithms
- Error bounds and convergence analysis of feasible descent methods: A general approach
- On the Linear Convergence of Descent Methods for Convex Essentially Smooth Minimization
- Descent methods for composite nondifferentiable optimization problems
- Error Bound and Convergence Analysis of Matrix Splitting Algorithms for the Affine Variational Inequality Problem
- Mathematical Programming for Data Mining: Formulations and Challenges
- On the Solution of Large Quadratic Programming Problems with Bound Constraints
- Parallel Variable Transformation in Unconstrained Optimization
- Parallel Gradient Distribution in Unconstrained Optimization
- Parallel Variable Distribution
- On the Convergence Rate of Dual Ascent Methods for Linearly Constrained Convex Minimization
- Dual coordinate ascent methods for non-strictly convex minimization
- A model algorithm for composite nondifferentiable optimization problems
- Large scale kernel regression via linear programming
- Sparsity-preserving SOR algorithms for separable quadratic and linear programming
- On the Statistical Analysis of Smoothing by Maximizing Dirty Markov Random Field Posterior Distributions
- Parallel gradient projection successive overrelaxation for symmetric linear complementarity problems and linear programs
- On the Rate of Convergence of a Partially Asynchronous Gradient Projection Algorithm
Cited In (first 100 items shown)
- Non-overlapping domain decomposition methods for dual total variation based image denoising
- Structured regularization for conditional Gaussian graphical models
- Blocks of coordinates, stochastic programming, and markets
- Stochastic block-coordinate gradient projection algorithms for submodular maximization
- Alternating direction method of multipliers with variable metric indefinite proximal terms for convex optimization
- A parallel line search subspace correction method for composite convex optimization
- Asynchronous parallel primal-dual block coordinate update methods for affinely constrained convex programs
- Block decomposition methods for total variation by primal-dual stitching
- SOR- and Jacobi-type iterative methods for solving \(\ell_1 - \ell_2\) problems by way of Fenchel duality
- Calculus of the exponent of Kurdyka-Łojasiewicz inequality and its applications to linear convergence of first-order methods
- On the iteration complexity of cyclic coordinate gradient descent methods
- Inexact successive quadratic approximation for regularized optimization
- Sequential threshold control in descent splitting methods for decomposable optimization problems
- A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo-Tseng error bound property
- Extended ADMM and BCD for nonseparable convex minimization models with quadratic coupling terms: convergence analysis and insights
- Oracle inequalities for sparse additive quantile regression in reproducing kernel Hilbert space
- Nonparametric additive model with grouped Lasso and maximizing area under the ROC curve
- A fast active set block coordinate descent algorithm for \(\ell_1\)-regularized least squares
- A truncated Newton algorithm for nonconvex sparse recovery
- A Stochastic Semismooth Newton Method for Nonsmooth Nonconvex Optimization
- A new spectral method for \(l_1\)-regularized minimization
- Distributed Block Coordinate Descent for Minimizing Partially Separable Functions
- Gradient-based method with active set strategy for \(\ell_1\) optimization
- Further properties of the forward-backward envelope with applications to difference-of-convex programming
- A flexible coordinate descent method
- Non-concave penalization in linear mixed-effect models and regularized selection of fixed effects
- On the convergence of asynchronous parallel iteration with unbounded delays
- (Robust) edge-based semidefinite programming relaxation of sensor network localization
- A First-Order Optimization Algorithm for Statistical Learning with Hierarchical Sparsity Structure
- Low Complexity Regularization of Linear Inverse Problems
- Visualizing the effects of a changing distance on data using continuous embeddings
- Activity Identification and Local Linear Convergence of Forward-Backward-type Methods
- A globally convergent algorithm for nonconvex optimization based on block coordinate update
- A block coordinate variable metric linesearch based proximal gradient method
- Linearity identification for general partial linear single-index models
- A regularized semi-smooth Newton method with projection steps for composite convex programs
- Linear Convergence of Proximal Gradient Algorithm with Extrapolation for a Class of Nonconvex Nonsmooth Minimization Problems
- On the linear convergence of forward-backward splitting method. I: Convergence analysis
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
- Error bounds for non-polyhedral convex optimization and applications to linear convergence of FDM and PGM
- On Faster Convergence of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization
- A unified approach to error bounds for structured convex optimization problems
- A Highly Efficient Semismooth Newton Augmented Lagrangian Method for Solving Lasso Problems
- Hybrid Jacobian and Gauss-Seidel Proximal Block Coordinate Update Methods for Linearly Constrained Convex Programming
- On proximal gradient method for the convex problems regularized with the group reproducing kernel norm
- Orbital minimization method with \(\ell^{1}\) regularization
- Nonmonotone gradient methods for vector optimization with a portfolio optimization application
- A block coordinate gradient descent method for regularized convex separable optimization and covariance selection
- Variable metric inexact line-search-based methods for nonsmooth optimization
- Distributed block-diagonal approximation methods for regularized empirical risk minimization
- Optimization Methods for Large-Scale Machine Learning
- A Coordinate-Descent Primal-Dual Algorithm with Large Step Size and Possibly Nonseparable Functions
- Combining line search and trust-region methods for \(\ell_1\)-minimization
- Distributionally Robust Inverse Covariance Estimation: The Wasserstein Shrinkage Estimator
- A block symmetric Gauss-Seidel decomposition theorem for convex composite quadratic programming and its applications
- RSG: Beating Subgradient Method without Smoothness and Strong Convexity
- A coordinate gradient descent method for nonsmooth nonseparable minimization
- On stochastic mirror-prox algorithms for stochastic Cartesian variational inequalities: randomized block coordinate and optimal averaging schemes
- Sparse Minimum Discrepancy Approach to Sufficient Dimension Reduction with Simultaneous Variable Selection in Ultrahigh Dimension
- Multi-objective isogeometric integrated optimization for shape control of piezoelectric functionally graded plates
- Nonmonotone spectral gradient method for sparse recovery
- A Block Successive Upper-Bound Minimization Method of Multipliers for Linearly Constrained Convex Optimization
- A second-order method for strongly convex \(\ell _1\)-regularization problems
- Inexact coordinate descent: complexity and preconditioning
- Multi-block Bregman proximal alternating linearized minimization and its application to orthogonal nonnegative matrix factorization
- Majorization-minimization algorithms for nonsmoothly penalized objective functions
- Variable selection for sparse Dirichlet-multinomial regression with an application to microbiome data analysis
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- A new generalized shrinkage conjugate gradient method for sparse recovery
- A generic coordinate descent solver for non-smooth convex optimisation
- Robust Variable Selection With Exponential Squared Loss
- A multilevel framework for sparse optimization with application to inverse covariance estimation and logistic regression
- Sparse group Lasso and high dimensional multinomial classification
- Robust sparse Gaussian graphical modeling
- The 2-coordinate descent method for solving double-sided simplex constrained minimization problems
- General inertial proximal gradient method for a class of nonconvex nonsmooth optimization problems
- Robust Gaussian graphical modeling via \(l_{1}\) penalization
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- Block Coordinate Descent Methods for Semidefinite Programming
- Projection onto a polyhedron that exploits sparsity
- A coordinate gradient descent method for linearly constrained smooth optimization and support vector machines training
- Incrementally updated gradient methods for constrained and regularized optimization
- Fused Multiple Graphical Lasso
- On some steplength approaches for proximal algorithms
- On the convergence of an active-set method for \(\ell_1\) minimization
- Coordinate and subspace optimization methods for linear least squares with non-quadratic regularization
- An efficient inexact ABCD method for least squares semidefinite programming
- Estimation for high-dimensional linear mixed-effects models using \(\ell_1\)-penalization
- Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization
- A coordinate gradient descent method for \(\ell_{1}\)-regularized convex minimization
- Practical inexact proximal quasi-Newton method with global complexity analysis
- The Group Lasso for Logistic Regression
- An iterative approach for cone complementarity problems for nonsmooth dynamics
- On the linear convergence of a proximal gradient method for a class of nonsmooth convex minimization problems
- Solution path clustering with adaptive concave penalty
- On the complexity analysis of randomized block-coordinate descent methods
- The Variable Metric Forward-Backward Splitting Algorithm Under Mild Differentiability Assumptions
- Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties