A coordinate gradient descent method for nonsmooth separable minimization
DOI: 10.1007/s10107-007-0170-0 · zbMATH Open: 1166.90016 · OpenAlex: W2039050532 · MaRDI QID: Q959979 · FDO: Q959979
Authors: Paul Tseng, Sangwoon Yun
Publication date: 16 December 2008
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-007-0170-0
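The publication treats minimization of F(x) = f(x) + c P(x), the sum of a smooth function f and a (block-)separable, possibly nonsmooth convex function P, by updating one coordinate or block at a time using a quadratic model of f. The sketch below is an illustrative simplification under stated assumptions, not the authors' algorithm as published: it takes f(x) = 0.5 ||Ax - b||^2 and P(x) = ||x||_1, uses the exact coordinatewise curvature ||A_j||^2 in place of a general model Hessian, and omits the Armijo line search needed for general smooth f. The names soft_threshold and coordinate_gradient_descent are ours.

import numpy as np

def soft_threshold(z, t):
    # Minimizer of 0.5*(x - z)**2 + t*|x|, i.e. the proximal map of t*|.|.
    return np.sign(z) * max(abs(z) - t, 0.0)

def coordinate_gradient_descent(A, b, c, n_sweeps=100):
    # Illustrative sketch (assumptions above): cyclic coordinate gradient
    # descent for 0.5*||A x - b||**2 + c*||x||_1.
    n = A.shape[1]
    x = np.zeros(n)
    residual = A @ x - b                 # kept up to date so partial gradients are cheap
    curvature = (A ** 2).sum(axis=0)     # ||A_j||^2, coordinatewise curvature of the smooth part
    for _ in range(n_sweeps):
        for j in range(n):
            if curvature[j] == 0.0:
                continue                 # zero column: this coordinate never changes
            grad_j = A[:, j] @ residual  # partial derivative of the smooth part at x
            z = x[j] - grad_j / curvature[j]
            x_new = soft_threshold(z, c / curvature[j])
            if x_new != x[j]:
                residual += A[:, j] * (x_new - x[j])
                x[j] = x_new
    return x

# Example use (hypothetical data):
#   A = np.random.randn(50, 200); b = np.random.randn(50)
#   x_hat = coordinate_gradient_descent(A, b, c=0.1)

The paper's method is more general: it allows blocks of coordinates, a positive definite scaling matrix in the coordinatewise model, and an Armijo rule along the resulting descent direction; the update above is the special case where the model curvature equals the true coordinatewise Hessian of a quadratic f.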
Recommendations
- A coordinate gradient descent method for nonsmooth nonseparable minimization
- A coordinate gradient descent method for \(\ell_{1}\)-regularized convex minimization
- On the iteration complexity of cyclic coordinate gradient descent methods
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- A block coordinate gradient descent method for regularized convex separable optimization and covariance selection
MSC classification
- Numerical mathematical programming methods (65K05)
- Convex programming (90C25)
- Methods of successive quadratic programming type (90C55)
- Large-scale problems in mathematical programming (90C06)
- Nonconvex programming, global optimization (90C26)
- Nonlinear programming (90C30)
- Numerical methods based on nonlinear programming (49M37)
- Decomposition methods (49M27)
Cites Work
- Testing Unconstrained Optimization Software
- Algorithm 778: L-BFGS-B
- CUTEr and SifDec
- Numerical Optimization
- Ideal spatial adaptation by wavelet shrinkage
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Adapting to Unknown Smoothness via Wavelet Shrinkage
- Atomic Decomposition by Basis Pursuit
- Variational Analysis
- Model Selection and Estimation in Regression with Grouped Variables
- The Group Lasso for Logistic Regression
- Updating Quasi-Newton Matrices with Limited Storage
- Title not available
- Convex Analysis
- A coordinate gradient descent method for nonsmooth separable minimization
- Title not available
- Title not available
- A method for minimizing the sum of a convex function and a continuously differentiable function
- A minimization method for the sum of a convex function and a continuously differentiable function
- A successive quadratic programming method for a class of constrained nonsmooth optimization problems
- Linear convergence of epsilon-subgradient descent methods for a class of convex functions
- On the convergence of the block nonlinear Gauss-Seidel method under convex constraints
- Title not available
- Some continuity properties of polyhedral multifunctions
- A generalized proximal point algorithm for certain non-convex minimization problems
- Title not available
- Trust Region Methods
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- Title not available
- Title not available
- On the Accurate Identification of Active Constraints
- Iterative Solution of Nonlinear Equations in Several Variables
- On search directions for minimization algorithms
- Error bounds and convergence analysis of feasible descent methods: A general approach
- On the Linear Convergence of Descent Methods for Convex Essentially Smooth Minimization
- Descent methods for composite nondifferentiable optimization problems
- Error Bound and Convergence Analysis of Matrix Splitting Algorithms for the Affine Variational Inequality Problem
- Mathematical Programming for Data Mining: Formulations and Challenges
- On the Solution of Large Quadratic Programming Problems with Bound Constraints
- Parallel Variable Transformation in Unconstrained Optimization
- Parallel Gradient Distribution in Unconstrained Optimization
- Parallel Variable Distribution
- On the Convergence Rate of Dual Ascent Methods for Linearly Constrained Convex Minimization
- Dual coordinate ascent methods for non-strictly convex minimization
- A model algorithm for composite nondifferentiable optimization problems
- Large scale kernel regression via linear programming
- Sparsity-preserving SOR algorithms for separable quadratic and linear programming
- On the Statistical Analysis of Smoothing by Maximizing Dirty Markov Random Field Posterior Distributions
- Parallel gradient projection successive overrelaxation for symmetric linear complementarity problems and linear programs
- On the Rate of Convergence of a Partially Asynchronous Gradient Projection Algorithm
Cited In (first 100 items)
- Non-overlapping domain decomposition methods for dual total variation based image denoising
- Structured regularization for conditional Gaussian graphical models
- Blocks of coordinates, stochastic programming, and markets
- Stochastic block-coordinate gradient projection algorithms for submodular maximization
- Alternating direction method of multipliers with variable metric indefinite proximal terms for convex optimization
- A parallel line search subspace correction method for composite convex optimization
- Asynchronous parallel primal-dual block coordinate update methods for affinely constrained convex programs
- Block decomposition methods for total variation by primal-dual stitching
- SOR- and Jacobi-type iterative methods for solving \(\ell_1 - \ell_2\) problems by way of Fenchel duality
- Calculus of the exponent of Kurdyka-Łojasiewicz inequality and its applications to linear convergence of first-order methods
- Title not available
- A block-coordinate descent method for linearly constrained minimization problem
- On the iteration complexity of cyclic coordinate gradient descent methods
- Inexact successive quadratic approximation for regularized optimization
- A highly efficient semismooth Newton augmented Lagrangian method for solving lasso problems
- Hybrid Jacobian and Gauss-Seidel proximal block coordinate update methods for linearly constrained convex programming
- On faster convergence of cyclic block coordinate descent-type methods for strongly convex minimization
- Sequential threshold control in descent splitting methods for decomposable optimization problems
- A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo-Tseng error bound property
- Extended ADMM and BCD for nonseparable convex minimization models with quadratic coupling terms: convergence analysis and insights
- Oracle inequalities for sparse additive quantile regression in reproducing kernel Hilbert space
- Distributionally robust inverse covariance estimation: the Wasserstein shrinkage estimator
- Nonparametric additive model with grouped Lasso and maximizing area under the ROC curve
- A fast active set block coordinate descent algorithm for \(\ell_1\)-regularized least squares
- The cyclic block conditional gradient method for convex optimization problems
- Optimization methods for large-scale machine learning
- A coordinate-descent primal-dual algorithm with large step size and possibly nonseparable functions
- A truncated Newton algorithm for nonconvex sparse recovery
- A new spectral method for \(l_1\)-regularized minimization
- An accelerated randomized proximal coordinate gradient method and its application to regularized empirical risk minimization
- Low complexity regularization of linear inverse problems
- Activity identification and local linear convergence of forward-backward-type methods
- Sparse minimum discrepancy approach to sufficient dimension reduction with simultaneous variable selection in ultrahigh dimension
- Linear convergence of proximal gradient algorithm with extrapolation for a class of nonconvex nonsmooth minimization problems
- Further properties of the forward-backward envelope with applications to difference-of-convex programming
- A flexible coordinate descent method
- Non-concave penalization in linear mixed-effect models and regularized selection of fixed effects
- On the convergence of asynchronous parallel iteration with unbounded delays
- (Robust) edge-based semidefinite programming relaxation of sensor network localization
- Block-coordinate primal-dual method for nonsmooth minimization over linear constraints
- A First-Order Optimization Algorithm for Statistical Learning with Hierarchical Sparsity Structure
- Visualizing the effects of a changing distance on data using continuous embeddings
- A globally convergent algorithm for nonconvex optimization based on block coordinate update
- A block coordinate variable metric linesearch based proximal gradient method
- Linearity identification for general partial linear single-index models
- A regularized semi-smooth Newton method with projection steps for composite convex programs
- A stochastic semismooth Newton method for nonsmooth nonconvex optimization
- On the linear convergence of forward-backward splitting method. I: Convergence analysis
- Error bounds for non-polyhedral convex optimization and applications to linear convergence of FDM and PGM
- A unified approach to error bounds for structured convex optimization problems
- On proximal gradient method for the convex problems regularized with the group reproducing kernel norm
- Orbital minimization method with \(\ell^{1}\) regularization
- Nonmonotone gradient methods for vector optimization with a portfolio optimization application
- A block coordinate gradient descent method for regularized convex separable optimization and covariance selection
- Variable metric inexact line-search-based methods for nonsmooth optimization
- Distributed block-diagonal approximation methods for regularized empirical risk minimization
- Combining line search and trust-region methods for \(\ell_1\)-minimization
- Gradient-based method with active set strategy for \(\ell _1\) optimization
- A block symmetric Gauss-Seidel decomposition theorem for convex composite quadratic programming and its applications
- RSG: Beating Subgradient Method without Smoothness and Strong Convexity
- A coordinate gradient descent method for nonsmooth nonseparable minimization
- On stochastic mirror-prox algorithms for stochastic Cartesian variational inequalities: randomized block coordinate and optimal averaging schemes
- Distributed block coordinate descent for minimizing partially separable functions
- An inexact Riemannian proximal gradient method
- An inexact PAM method for computing Wasserstein barycenter with unknown supports
- An alternating direction method of multipliers with the BFGS update for structured convex quadratic optimization
- Inexact variable metric stochastic block-coordinate descent for regularized optimization
- Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems
- Global complexity analysis of inexact successive quadratic approximation methods for regularized optimization under mild assumptions
- A coordinate descent method for total variation minimization
- Primal path algorithm for compositional data analysis
- Accelerating block coordinate descent methods with identification strategies
- Linear convergence of prox-SVRG method for separable non-smooth convex optimization problems under bounded metric subregularity
- On the convergence of the forward-backward splitting method with linesearches
- Fully asynchronous stochastic coordinate descent: a tight lower bound on the parallelism achieving linear speedup
- An inexact successive quadratic approximation method for a class of difference-of-convex optimization problems
- An efficient Peaceman-Rachford splitting method for constrained TGV-shearlet-based MRI reconstruction
- Kurdyka-Łojasiewicz property of zero-norm composite functions
- A coordinate descent homotopy method for linearly constrained nonsmooth convex minimization
- Kurdyka-Łojasiewicz exponent via inf-projection
- Toward optimal fingerprinting in detection and attribution of changes in climate extremes
- Linear convergence of proximal incremental aggregated gradient method for nonconvex nonsmooth minimization problems
- Beetle swarm optimization algorithm: Theory and application
- Variable projection methods for separable nonlinear inverse problems with general-form Tikhonov regularization
- Linear convergence of inexact descent method and inexact proximal gradient algorithms for lower-order regularization problems
- Block stochastic gradient iteration for convex and nonconvex optimization
- An accelerated coordinate gradient descent algorithm for non-separable composite optimization
- Second order semi-smooth proximal Newton methods in Hilbert spaces
- Dynamical modeling for non-Gaussian data with high-dimensional sparse ordinary differential equations
- Globalized inexact proximal Newton-type methods for nonconvex composite functions
- Iteration complexity of a block coordinate gradient descent method for convex optimization
- Bayesian adaptive lasso with variational Bayes for variable selection in high-dimensional generalized linear mixed models
- Block coordinate type methods for optimization and learning
- A Bregman forward-backward linesearch algorithm for nonconvex composite optimization: superlinear convergence to nonisolated local minima
- A proximal interior point algorithm with applications to image processing
- A block inertial Bregman proximal algorithm for nonsmooth nonconvex problems with application to symmetric nonnegative matrix tri-factorization
- Global convergence rate of proximal incremental aggregated gradient methods
- Block mirror stochastic gradient method for stochastic optimization
- Level-set subdifferential error bounds and linear convergence of Bregman proximal gradient method
- Perturbation techniques for convergence analysis of proximal gradient method and other first-order algorithms via variational analysis