A coordinate gradient descent method for nonsmooth separable minimization
DOI: 10.1007/s10107-007-0170-0 · zbMATH Open: 1166.90016 · OpenAlex: W2039050532 · MaRDI QID: Q959979 · FDO: Q959979
Publication date: 16 December 2008
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-007-0170-0
Recommendations
- A coordinate gradient descent method for nonsmooth nonseparable minimization
- A coordinate gradient descent method for \(\ell_{1}\)-regularized convex minimization
- On the iteration complexity of cyclic coordinate gradient descent methods
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- A block coordinate gradient descent method for regularized convex separable optimization and covariance selection
MSC classification
- Numerical mathematical programming methods (65K05)
- Convex programming (90C25)
- Methods of successive quadratic programming type (90C55)
- Large-scale problems in mathematical programming (90C06)
- Nonconvex programming, global optimization (90C26)
- Nonlinear programming (90C30)
- Numerical methods based on nonlinear programming (49M37)
- Decomposition methods (49M27)
Cites Work
- Testing Unconstrained Optimization Software
- Algorithm 778: L-BFGS-B
- CUTEr and SifDec
- Numerical Optimization
- Ideal spatial adaptation by wavelet shrinkage
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Adapting to Unknown Smoothness via Wavelet Shrinkage
- Atomic Decomposition by Basis Pursuit
- Variational Analysis
- Model Selection and Estimation in Regression with Grouped Variables
- The Group Lasso for Logistic Regression
- Updating Quasi-Newton Matrices with Limited Storage
- Convex Analysis
- A coordinate gradient descent method for nonsmooth separable minimization
- A method for minimizing the sum of a convex function and a continuously differentiable function
- A minimization method for the sum of a convex function and a continuously differentiable function
- A successive quadratic programming method for a class of constrained nonsmooth optimization problems
- Linear convergence of epsilon-subgradient descent methods for a class of convex functions
- On the convergence of the block nonlinear Gauss-Seidel method under convex constraints
- Some continuity properties of polyhedral multifunctions
- A generalized proximal point algorithm for certain non-convex minimization problems
- Trust Region Methods
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- On the Accurate Identification of Active Constraints
- Iterative Solution of Nonlinear Equations in Several Variables
- On search directions for minimization algorithms
- Error bounds and convergence analysis of feasible descent methods: A general approach
- On the Linear Convergence of Descent Methods for Convex Essentially Smooth Minimization
- Descent methods for composite nondifferentiable optimization problems
- Error Bound and Convergence Analysis of Matrix Splitting Algorithms for the Affine Variational Inequality Problem
- Mathematical Programming for Data Mining: Formulations and Challenges
- On the Solution of Large Quadratic Programming Problems with Bound Constraints
- Parallel Variable Transformation in Unconstrained Optimization
- Parallel Gradient Distribution in Unconstrained Optimization
- Parallel Variable Distribution
- On the Convergence Rate of Dual Ascent Methods for Linearly Constrained Convex Minimization
- Dual coordinate ascent methods for non-strictly convex minimization
- A model algorithm for composite nondifferentiable optimization problems
- Large scale kernel regression via linear programming
- Sparsity-preserving SOR algorithms for separable quadratic and linear programming
- On the Statistical Analysis of Smoothing by Maximizing Dirty Markov Random Field Posterior Distributions
- Parallel gradient projection successive overrelaxation for symmetric linear complementarity problems and linear programs
- On the Rate of Convergence of a Partially Asynchronous Gradient Projection Algorithm
Cited In (only showing the first 100 items)
- An inexact Riemannian proximal gradient method
- Dykstra's splitting and an approximate proximal point algorithm for minimizing the sum of convex functions
- Toward Optimal Fingerprinting in Detection and Attribution of Changes in Climate Extremes
- Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems
- Global Convergence Rate of Proximal Incremental Aggregated Gradient Methods
- A coordinate descent method for total variation minimization
- Primal path algorithm for compositional data analysis
- On the convergence of the forward–backward splitting method with linesearches
- Linear convergence of prox-SVRG method for separable non-smooth convex optimization problems under bounded metric subregularity
- Iteration Complexity of a Block Coordinate Gradient Descent Method for Convex Optimization
- Fully asynchronous stochastic coordinate descent: a tight lower bound on the parallelism achieving linear speedup
- An inexact successive quadratic approximation method for a class of difference-of-convex optimization problems
- A Bregman Forward-Backward Linesearch Algorithm for Nonconvex Composite Optimization: Superlinear Convergence to Nonisolated Local Minima
- Kurdyka-Łojasiewicz property of zero-norm composite functions
- A coordinate descent homotopy method for linearly constrained nonsmooth convex minimization
- Kurdyka-Łojasiewicz exponent via inf-projection
- Linear convergence of proximal incremental aggregated gradient method for nonconvex nonsmooth minimization problems
- Beetle swarm optimization algorithm: Theory and application
- Variable projection methods for separable nonlinear inverse problems with general-form Tikhonov regularization
- Linear convergence of inexact descent method and inexact proximal gradient algorithms for lower-order regularization problems
- CUSTOM: a calibration region recovery approach for highly subsampled dynamic parallel magnetic resonance imaging
- Second order semi-smooth proximal Newton methods in Hilbert spaces
- Dynamical modeling for non-Gaussian data with high-dimensional sparse ordinary differential equations
- Globalized inexact proximal Newton-type methods for nonconvex composite functions
- Bayesian adaptive lasso with variational Bayes for variable selection in high-dimensional generalized linear mixed models
- Block coordinate type methods for optimization and learning
- A proximal interior point algorithm with applications to image processing
- A block inertial Bregman proximal algorithm for nonsmooth nonconvex problems with application to symmetric nonnegative matrix tri-factorization
- Block mirror stochastic gradient method for stochastic optimization
- Level-set subdifferential error bounds and linear convergence of Bregman proximal gradient method
- Perturbation techniques for convergence analysis of proximal gradient method and other first-order algorithms via variational analysis
- Bregman Finito/MISO for Nonconvex Regularized Finite Sum Minimization without Lipschitz Gradient Continuity
- Block Stochastic Gradient Iteration for Convex and Nonconvex Optimization
- Nonconvex proximal incremental aggregated gradient method with linear convergence
- Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization
- A joint estimation approach to sparse additive ordinary differential equations
- Achieving the oracle property of OEM with nonconvex penalties
- The generalized equivalence of regularization and min-max robustification in linear mixed models
- Variational analysis perspective on linear convergence of some first order methods for nonsmooth convex optimization problems
- A preconditioned conjugate gradient method with active set strategy for \(\ell_1\)-regularized least squares
- An efficient Peaceman–Rachford splitting method for constrained TGV-shearlet-based MRI reconstruction
- Markov chain block coordinate descent
- A new convergence analysis for the Volterra series representation of nonlinear systems
- A fast conjugate gradient algorithm with active set prediction for ℓ1 optimization
- New convergence results for the inexact variable metric forward-backward method
- An active set Barzilai-Borwein algorithm for \(l_0\) regularized optimization
- Metric subregularity and/or calmness of the normal cone mapping to the \(p\)-order conic constraint system
- Convergence rate of block-coordinate maximization Burer-Monteiro method for solving large SDPs
- An elastic net penalized small area model combining unit- and area-level data for regional hypertension prevalence estimation
- Inertial alternating direction method of multipliers for non-convex non-smooth optimization
- Synchronous parallel block coordinate descent method for nonsmooth convex function minimization
- High-performance statistical computing in the computing environments of the 2020s
- Two fast vector-wise update algorithms for orthogonal nonnegative matrix factorization with sparsity constraint
- An inexact PAM method for computing Wasserstein barycenter with unknown supports
- An alternating direction method of multipliers with the BFGS update for structured convex quadratic optimization
- Inexact variable metric stochastic block-coordinate descent for regularized optimization
- Non-overlapping domain decomposition methods for dual total variation based image denoising
- Structured regularization for conditional Gaussian graphical models
- Blocks of coordinates, stochastic programming, and markets
- Stochastic block-coordinate gradient projection algorithms for submodular maximization
- Alternating direction method of multipliers with variable metric indefinite proximal terms for convex optimization
- Global complexity analysis of inexact successive quadratic approximation methods for regularized optimization under mild assumptions
- A parallel line search subspace correction method for composite convex optimization
- Asynchronous parallel primal-dual block coordinate update methods for affinely constrained convex programs
- Block decomposition methods for total variation by primal-dual stitching
- SOR- and Jacobi-type iterative methods for solving \(\ell_1 - \ell_2\) problems by way of Fenchel duality
- Calculus of the exponent of Kurdyka-Łojasiewicz inequality and its applications to linear convergence of first-order methods
- Inexact successive quadratic approximation for regularized optimization
- Accelerating block coordinate descent methods with identification strategies
- Sequential threshold control in descent splitting methods for decomposable optimization problems
- A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo-Tseng error bound property
- Extended ADMM and BCD for nonseparable convex minimization models with quadratic coupling terms: convergence analysis and insights
- Oracle inequalities for sparse additive quantile regression in reproducing kernel Hilbert space
- Nonparametric additive model with grouped Lasso and maximizing area under the ROC curve
- A fast active set block coordinate descent algorithm for \(\ell_1\)-regularized least squares
- A truncated Newton algorithm for nonconvex sparse recovery
- A Stochastic Semismooth Newton Method for Nonsmooth Nonconvex Optimization
- A new spectral method for \(l_1\)-regularized minimization
- Distributed Block Coordinate Descent for Minimizing Partially Separable Functions
- Gradient-based method with active set strategy for \(\ell_1\) optimization
- Further properties of the forward-backward envelope with applications to difference-of-convex programming
- A flexible coordinate descent method
- Non-concave penalization in linear mixed-effect models and regularized selection of fixed effects
- On the convergence of asynchronous parallel iteration with unbounded delays
- A First-Order Optimization Algorithm for Statistical Learning with Hierarchical Sparsity Structure
- Low Complexity Regularization of Linear Inverse Problems
- Visualizing the effects of a changing distance on data using continuous embeddings
- Activity Identification and Local Linear Convergence of Forward--Backward-type Methods
- A globally convergent algorithm for nonconvex optimization based on block coordinate update
- A block coordinate variable metric linesearch based proximal gradient method
- Linearity identification for general partial linear single-index models
- A regularized semi-smooth Newton method with projection steps for composite convex programs
- Linear Convergence of Proximal Gradient Algorithm with Extrapolation for a Class of Nonconvex Nonsmooth Minimization Problems
- On the linear convergence of forward-backward splitting method. I: Convergence analysis
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
- Error bounds for non-polyhedral convex optimization and applications to linear convergence of FDM and PGM
- Overlapping Domain Decomposition Methods for Total Variation Denoising
- On Faster Convergence of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization
- A Highly Efficient Semismooth Newton Augmented Lagrangian Method for Solving Lasso Problems
Uses Software