A coordinate gradient descent method for nonsmooth separable minimization

From MaRDI portal
Publication: 959979

DOI: 10.1007/s10107-007-0170-0 · zbMath: 1166.90016 · OpenAlex: W2039050532 · MaRDI QID: Q959979

Sangwoon Yun, Paul Tseng

Publication date: 16 December 2008

Published in: Mathematical Programming. Series A. Series B

Full work available at URL: https://doi.org/10.1007/s10107-007-0170-0
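For orientation beyond the bibliographic data: the paper treats minimization of the sum of a smooth function and a convex, (block-)separable, possibly nonsmooth function, such as an ℓ1 penalty, by updating one coordinate block at a time. The following is a minimal sketch, not the method of the paper: it applies plain cyclic coordinate descent to the ℓ1-regularized least-squares instance of this problem class, with illustrative function names, a cyclic update order, and a unit step size chosen purely for exposition (the paper itself works with quadratic approximations and an Armijo-type line search over coordinate blocks).

# Minimal illustrative sketch (not the authors' exact algorithm): cyclic
# coordinate descent for the l1-regularized least-squares problem
#     min_x  0.5 * ||A x - b||^2 + lam * ||x||_1,
# a standard instance of "smooth + nonsmooth separable" minimization.
import numpy as np

def soft_threshold(z, t):
    # Closed-form solution of the scalar quadratic-plus-l1 subproblem.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def coordinate_descent_lasso(A, b, lam, n_sweeps=100):
    n = A.shape[1]
    x = np.zeros(n)
    r = b - A @ x                      # residual b - A x, kept up to date
    col_sq = (A ** 2).sum(axis=0)      # per-coordinate curvature (A^T A)_jj
    for _ in range(n_sweeps):
        for j in range(n):
            if col_sq[j] == 0.0:
                continue
            g_j = -A[:, j] @ r         # partial gradient of the smooth part
            z = x[j] - g_j / col_sq[j]
            x_new = soft_threshold(z, lam / col_sq[j])
            r += A[:, j] * (x[j] - x_new)   # incremental residual update
            x[j] = x_new
    return x

# Example usage on synthetic data:
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(50)
print(coordinate_descent_lasso(A, b, lam=0.1))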



Related Items

An inexact successive quadratic approximation method for a class of difference-of-convex optimization problems, Sparse group Lasso and high dimensional multinomial classification, Alternating direction method of multipliers with variable metric indefinite proximal terms for convex optimization, Nonparametric additive model with grouped Lasso and maximizing area under the ROC curve, Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems, On some steplength approaches for proximal algorithms, A convergent decomposition method for box-constrained optimization problems, Inexact coordinate descent: complexity and preconditioning, Second order semi-smooth proximal Newton methods in Hilbert spaces, Dynamical modeling for non-Gaussian data with high-dimensional sparse ordinary differential equations, A truncated Newton algorithm for nonconvex sparse recovery, A flexible coordinate descent method, Split Bregman algorithms for multiple measurement vector problem, Block coordinate proximal gradient methods with variable Bregman functions for nonsmooth separable optimization, Practical inexact proximal quasi-Newton method with global complexity analysis, Kurdyka-Łojasiewicz exponent via inf-projection, Visualizing the effects of a changing distance on data using continuous embeddings, Inertial alternating direction method of multipliers for non-convex non-smooth optimization, A regularized semi-smooth Newton method with projection steps for composite convex programs, A joint estimation approach to sparse additive ordinary differential equations, A unified approach to error bounds for structured convex optimization problems, Approximation accuracy, gradient methods, and error bound for structured convex optimization, A globally convergent algorithm for nonconvex optimization based on block coordinate update, Orbital minimization method with \(\ell^{1}\) regularization, Block coordinate descent algorithms for large-scale sparse multiclass classification, \(\ell_{1}\)-penalization for mixture regression models, Primal path algorithm for compositional data analysis, Nonmonotone gradient methods for vector optimization with a portfolio optimization application, X-ray CT image reconstruction via wavelet frame based regularization and Radon domain inpainting, Global complexity analysis of inexact successive quadratic approximation methods for regularized optimization under mild assumptions, Synchronous parallel block coordinate descent method for nonsmooth convex function minimization, A proximal interior point algorithm with applications to image processing, Variable selection for sparse Dirichlet-multinomial regression with an application to microbiome data analysis, Majorization-minimization algorithms for nonsmoothly penalized objective functions, On stochastic mirror-prox algorithms for stochastic Cartesian variational inequalities: randomized block coordinate and optimal averaging schemes, A block coordinate gradient descent method for regularized convex separable optimization and covariance selection, Robust least square semidefinite programming with applications, On the linear convergence of a proximal gradient method for a class of nonsmooth convex minimization problems, Solution path clustering with adaptive concave penalty, Extended ADMM and BCD for nonseparable convex minimization models with quadratic coupling terms: convergence analysis and insights, A random coordinate descent algorithm for optimization problems with composite objective function 
and linear coupled constraints, The 2-coordinate descent method for solving double-sided simplex constrained minimization problems, (Robust) edge-based semidefinite programming relaxation of sensor network localization, Blocks of coordinates, stochastic programming, and markets, Stochastic block-coordinate gradient projection algorithms for submodular maximization, Nonmonotone Barzilai-Borwein gradient algorithm for \(\ell_1\)-regularized nonsmooth minimization in compressive sensing, Weighted-average alternating minimization method for magnetic resonance image reconstruction based on compressive sensing, A variable fixing version of the two-block nonlinear constrained Gauss-Seidel algorithm for \(\ell_1\)-regularized least-squares, Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization, Asynchronous parallel primal-dual block coordinate update methods for affinely constrained convex programs, On the linear convergence of the approximate proximal splitting method for non-smooth convex optimization, A new generalized shrinkage conjugate gradient method for sparse recovery, On the complexity analysis of randomized block-coordinate descent methods, A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo-Tseng error bound property, Fully asynchronous stochastic coordinate descent: a tight lower bound on the parallelism achieving linear speedup, An inexact PAM method for computing Wasserstein barycenter with unknown supports, An alternating direction method of multipliers with the BFGS update for structured convex quadratic optimization, Inexact variable metric stochastic block-coordinate descent for regularized optimization, Incomplete variables truncated conjugate gradient method for signal reconstruction in compressed sensing, GAITA: a Gauss-Seidel iterative thresholding algorithm for \(\ell_q\) regularized least squares regression, Oracle inequalities for sparse additive quantile regression in reproducing kernel Hilbert space, A coordinate gradient descent method for nonsmooth separable minimization, Incrementally updated gradient methods for constrained and regularized optimization, Iteration complexity analysis of block coordinate descent methods, SOR- and Jacobi-type iterative methods for solving \(\ell_1 - \ell_2\) problems by way of Fenchel duality, A new spectral method for \(l_1\)-regularized minimization, A coordinate gradient descent method for \(\ell_{1}\)-regularized convex minimization, A proximal gradient descent method for the extended second-order cone linear complementarity problem, A coordinate gradient descent method for linearly constrained smooth optimization and support vector machines training, An iterative approach for cone complementarity problems for nonsmooth dynamics, An augmented Lagrangian approach for sparse principal component analysis, Multi-objective isogeometric integrated optimization for shape control of piezoelectric functionally graded plates, Linear convergence of inexact descent method and inexact proximal gradient algorithms for lower-order regularization problems, Nonconvex proximal incremental aggregated gradient method with linear convergence, Kurdyka-Łojasiewicz property of zero-norm composite functions, Calculus of the exponent of Kurdyka-Łojasiewicz inequality and its applications to linear convergence of first-order methods, Error bounds for non-polyhedral convex optimization and applications to linear convergence of FDM and PGM, 
Globalized inexact proximal Newton-type methods for nonconvex composite functions, A Stochastic Quasi-Newton Method for Large-Scale Optimization, On the linear convergence of forward-backward splitting method. I: Convergence analysis, A block coordinate variable metric linesearch based proximal gradient method, Linearity identification for general partial linear single-index models, A new convergence analysis for the Volterra series representation of nonlinear systems, Penalized Estimation of Directed Acyclic Graphs From Discrete Data, Main effects and interactions in mixed and incomplete data frames, Spatial Variable Selection and An Application to Virginia Lyme Disease Emergence, Multi-block Bregman proximal alternating linearized minimization and its application to orthogonal nonnegative matrix factorization, Level-set subdifferential error bounds and linear convergence of Bregman proximal gradient method, A block inertial Bregman proximal algorithm for nonsmooth nonconvex problems with application to symmetric nonnegative matrix tri-factorization, Markov chain block coordinate descent, Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization, The generalized equivalence of regularization and min-max robustification in linear mixed models, Variational analysis perspective on linear convergence of some first order methods for nonsmooth convex optimization problems, Distributed block-diagonal approximation methods for regularized empirical risk minimization, Convergence rate of block-coordinate maximization Burer-Monteiro method for solving large SDPs, High-performance statistical computing in the computing environments of the 2020s, Linear convergence of prox-SVRG method for separable non-smooth convex optimization problems under bounded metric subregularity, Nomonotone spectral gradient method for sparse recovery, Perturbation techniques for convergence analysis of proximal gradient method and other first-order algorithms via variational analysis, A second-order method for strongly convex \(\ell _1\)-regularization problems, An inexact quasi-Newton algorithm for large-scale \(\ell_1\) optimization with box constraints, Model-Based Clustering of High-Dimensional Longitudinal Data via Regularization, Doubly iteratively reweighted algorithm for constrained compressed sensing models, Variable projection methods for separable nonlinear inverse problems with general-form Tikhonov regularization, On the Global Convergence of Randomized Coordinate Gradient Descent for Nonconvex Optimization, Block mirror stochastic gradient method for stochastic optimization, Accelerating inexact successive quadratic approximation for regularized optimization through manifold identification, An approximate Newton-type proximal method using symmetric rank-one updating formula for minimizing the nonsmooth composite functions, A diagonally scaled Newton-type proximal method for minimization of the models with nonsmooth composite cost functions, Convergence Rate Analysis of a Dykstra-Type Projection Algorithm, Achieving the oracle property of OEM with nonconvex penalties, CUSTOM: a calibration region recovery approach for highly subsampled dynamic parallel magnetic resonance imaging, Structured regularization for conditional Gaussian graphical models, Robust sparse regression by modeling noise as a mixture of gaussians, Block Bregman Majorization Minimization with Extrapolation, Effects of depth, width, and initialization: A convergence analysis of layer-wise training for 
deep linear neural networks, Further properties of the forward-backward envelope with applications to difference-of-convex programming, Block coordinate type methods for optimization and learning, A robust and efficient variable selection method for linear regression, Performance of first- and second-order methods for \(\ell_1\)-regularized least squares problems, An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization, Block decomposition methods for total variation by primal-dual stitching, On the convergence of an active-set method for ℓ1 minimization, A proximal block minimization method of multipliers with a substitution procedure, Sequential threshold control in descent splitting methods for decomposable optimization problems, Estimation for High-Dimensional Linear Mixed-Effects Models Using ℓ1-Penalization, Distributed Block Coordinate Descent for Minimizing Partially Separable Functions, Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds, Efficient block-coordinate descent algorithms for the group Lasso, A smoothing stochastic gradient method for composite optimization, Inversion of electromagnetic geosoundings using coordinate descent optimization, Beetle swarm optimization algorithm: Theory and application, An active set Newton-CG method for \(\ell_1\) optimization, A modified multinomial baseline logit model with logit functions having different covariates, Robust sparse Gaussian graphical modeling, A First-Order Optimization Algorithm for Statistical Learning with Hierarchical Sparsity Structure, Bayesian adaptive lasso with variational Bayes for variable selection in high-dimensional generalized linear mixed models, New convergence results for the inexact variable metric forward-backward method, Block Stochastic Gradient Iteration for Convex and Nonconvex Optimization, Quadratic growth conditions and uniqueness of optimal solution to Lasso, A block symmetric Gauss-Seidel decomposition theorem for convex composite quadratic programming and its applications, Accelerating block coordinate descent methods with identification strategies, Inexact successive quadratic approximation for regularized optimization, General inertial proximal gradient method for a class of nonconvex nonsmooth optimization problems, Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties, Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization, A preconditioned conjugate gradient method with active set strategy for \(\ell_1\)-regularized least squares, On the convergence of asynchronous parallel iteration with unbounded delays, Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization, Linear Convergence of Proximal Gradient Algorithm with Extrapolation for a Class of Nonconvex Nonsmooth Minimization Problems, A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming, The Variable Metric Forward-Backward Splitting Algorithm Under Mild Differentiability Assumptions, Activity Identification and Local Linear Convergence of Forward--Backward-type Methods, Gradient-based method with active set strategy for $\ell _1$ optimization, A Highly Efficient Semismooth Newton Augmented Lagrangian Method for Solving Lasso Problems, Error bound and isocost imply linear convergence of DCA-based algorithms to D-stationarity,
Hybrid Jacobian and Gauss--Seidel Proximal Block Coordinate Update Methods for Linearly Constrained Convex Programming, An inexact Riemannian proximal gradient method, On proximal gradient method for the convex problems regularized with the group reproducing kernel norm, Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function, On the complexity of parallel coordinate descent, Global Convergence Rate of Proximal Incremental Aggregated Gradient Methods, Optimization Methods for Large-Scale Machine Learning, A Coordinate-Descent Primal-Dual Algorithm with Large Step Size and Possibly Nonseparable Functions, Robust Gaussian Graphical Modeling Via l1 Penalization, RSG: Beating Subgradient Method without Smoothness and Strong Convexity, On Faster Convergence of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization, Robust Variable Selection With Exponential Squared Loss, A Fast Active Set Block Coordinate Descent Algorithm for $\ell_1$-Regularized Least Squares, Low Complexity Regularization of Linear Inverse Problems, Variable Metric Inexact Line-Search-Based Methods for Nonsmooth Optimization, Block Coordinate Descent Methods for Semidefinite Programming, Non-overlapping domain decomposition methods for dual total variation based image denoising, A parallel line search subspace correction method for composite convex optimization, An Efficient Inexact ABCD Method for Least Squares Semidefinite Programming, A coordinate descent homotopy method for linearly constrained nonsmooth convex minimization, A second-order method for convex ℓ1-regularized optimization with active-set prediction, The Group Lasso for Logistic Regression, Projection onto a Polyhedron that Exploits Sparsity, A Block Successive Upper-Bound Minimization Method of Multipliers for Linearly Constrained Convex Optimization, An active set Barzilar-Borwein algorithm for \(l_0\) regularized optimization, Two fast vector-wise update algorithms for orthogonal nonnegative matrix factorization with sparsity constraint, Metric subregularity and/or calmness of the normal cone mapping to the \(p\)-order conic constraint system, On the convergence of the forward–backward splitting method with linesearches, A Multilevel Framework for Sparse Optimization with Application to Inverse Covariance Estimation and Logistic Regression, Overlapping Domain Decomposition Methods for Total Variation Denoising, Non-concave penalization in linear mixed-effect models and regularized selection of fixed effects, An efficient Peaceman–Rachford splitting method for constrained TGV-shearlet-based MRI reconstruction, A fast conjugate gradient algorithm with active set prediction for ℓ1 optimization, Dykstra's splitting and an approximate proximal point algorithm for minimizing the sum of convex functions, Sparse Minimum Discrepancy Approach to Sufficient Dimension Reduction with Simultaneous Variable Selection in Ultrahigh Dimension, Computing the Best Approximation over the Intersection of a Polyhedral Set and the Doubly Nonnegative Cone, A Stochastic Semismooth Newton Method for Nonsmooth Nonconvex Optimization, Iteration Complexity of a Block Coordinate Gradient Descent Method for Convex Optimization, A Bregman Forward-Backward Linesearch Algorithm for Nonconvex Composite Optimization: Superlinear Convergence to Nonisolated Local Minima, Toward Optimal Fingerprinting in Detection and Attribution of Changes in Climate Extremes, Fused Multiple Graphical Lasso,
An elastic net penalized small area model combining unit- and area-level data for regional hypertension prevalence estimation, The Penalized Analytic Center Estimator, A generic coordinate descent solver for non-smooth convex optimisation, Linear convergence of proximal incremental aggregated gradient method for nonconvex nonsmooth minimization problems, Coordinate descent algorithms, One-Step Estimation with Scaled Proximal Methods, Bregman Finito/MISO for Nonconvex Regularized Finite Sum Minimization without Lipschitz Gradient Continuity, Combining line search and trust-region methods for ℓ1-minimization, Distributionally Robust Inverse Covariance Estimation: The Wasserstein Shrinkage Estimator


Uses Software


Cites Work