A coordinate gradient descent method for nonsmooth separable minimization
From MaRDI portal
Publication:959979
MSC classification
- Numerical mathematical programming methods (65K05)
- Convex programming (90C25)
- Methods of successive quadratic programming type (90C55)
- Large-scale problems in mathematical programming (90C06)
- Nonconvex programming, global optimization (90C26)
- Nonlinear programming (90C30)
- Numerical methods based on nonlinear programming (49M37)
- Decomposition methods (49M27)
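This entry catalogs a coordinate gradient descent method for problems of the form min f(x) + P(x), where f is smooth and P is convex and separable across coordinates. As a rough illustrative sketch only (not the paper's exact algorithm, which uses more general block selection and line-search rules), the idea for the ℓ1-regularized least-squares case can be written as a cyclic coordinate step followed by soft-thresholding:

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t*|.|: shrink z toward zero by t.
    return np.sign(z) * max(abs(z) - t, 0.0)

def coordinate_gradient_descent(A, b, lam, iters=100):
    """Cyclic coordinate descent for min 0.5*||Ax - b||^2 + lam*||x||_1.

    Each coordinate step takes a gradient step on the smooth part and
    then solves the separable nonsmooth subproblem exactly via
    soft-thresholding. Illustrative sketch only.
    """
    m, n = A.shape
    x = np.zeros(n)
    col_sq = (A ** 2).sum(axis=0)        # per-coordinate curvature constants
    r = A @ x - b                        # residual, maintained incrementally
    for _ in range(iters):
        for j in range(n):
            if col_sq[j] == 0.0:
                continue
            g = A[:, j] @ r              # partial gradient of the smooth part
            z = x[j] - g / col_sq[j]     # coordinate gradient step
            x_new = soft_threshold(z, lam / col_sq[j])
            r += A[:, j] * (x_new - x[j])  # keep residual consistent
            x[j] = x_new
    return x
```

For example, with A the identity the method reduces to componentwise soft-thresholding of b, so `coordinate_gradient_descent(np.eye(3), np.array([3.0, 0.5, -2.0]), 1.0)` returns approximately `[2, 0, -1]`.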
Recommendations
- A coordinate gradient descent method for nonsmooth nonseparable minimization
- A coordinate gradient descent method for \(\ell_{1}\)-regularized convex minimization
- On the iteration complexity of cyclic coordinate gradient descent methods
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- A block coordinate gradient descent method for regularized convex separable optimization and covariance selection
Cites work
- scientific article; zbMATH DE number 1818892 (no title available)
- scientific article; zbMATH DE number 3914081 (no title available)
- scientific article; zbMATH DE number 3722433 (no title available)
- scientific article; zbMATH DE number 1243473 (no title available)
- scientific article; zbMATH DE number 1274356 (no title available)
- scientific article; zbMATH DE number 1382772 (no title available)
- scientific article; zbMATH DE number 778130 (no title available)
- A coordinate gradient descent method for nonsmooth separable minimization
- A generalized proximal point algorithm for certain non-convex minimization problems
- A method for minimizing the sum of a convex function and a continuously differentiable function
- A minimization method for the sum of a convex function and a continuously differentiable function
- A model algorithm for composite nondifferentiable optimization problems
- A successive quadratic programming method for a class of constrained nonsmooth optimization problems
- Adapting to Unknown Smoothness via Wavelet Shrinkage
- Algorithm 778: L-BFGS-B
- Atomic Decomposition by Basis Pursuit
- CUTEr and SifDec
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Convex Analysis
- Descent methods for composite nondifferentiable optimization problems
- Dual coordinate ascent methods for non-strictly convex minimization
- Error Bound and Convergence Analysis of Matrix Splitting Algorithms for the Affine Variational Inequality Problem
- Error bounds and convergence analysis of feasible descent methods: A general approach
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- Ideal spatial adaptation by wavelet shrinkage
- Iterative Solution of Nonlinear Equations in Several Variables
- Large scale kernel regression via linear programming
- Linear convergence of epsilon-subgradient descent methods for a class of convex functions
- Mathematical Programming for Data Mining: Formulations and Challenges
- Model Selection and Estimation in Regression with Grouped Variables
- Numerical Optimization
- On search directions for minimization algorithms
- On the Accurate Identification of Active Constraints
- On the Convergence Rate of Dual Ascent Methods for Linearly Constrained Convex Minimization
- On the Linear Convergence of Descent Methods for Convex Essentially Smooth Minimization
- On the Rate of Convergence of a Partially Asynchronous Gradient Projection Algorithm
- On the Solution of Large Quadratic Programming Problems with Bound Constraints
- On the Statistical Analysis of Smoothing by Maximizing Dirty Markov Random Field Posterior Distributions
- On the convergence of the block nonlinear Gauss-Seidel method under convex constraints
- Parallel Gradient Distribution in Unconstrained Optimization
- Parallel Variable Distribution
- Parallel Variable Transformation in Unconstrained Optimization
- Parallel gradient projection successive overrelaxation for symmetric linear complementarity problems and linear programs
- Some continuity properties of polyhedral multifunctions
- Sparsity-preserving SOR algorithms for separable quadratic and linear programming
- Testing Unconstrained Optimization Software
- The Group Lasso for Logistic Regression
- Trust Region Methods
- Updating Quasi-Newton Matrices with Limited Storage
- Variational Analysis
Cited in
(only showing first 100 items)
- Inexact successive quadratic approximation for regularized optimization
- A block coordinate gradient descent method for regularized convex separable optimization and covariance selection
- The cyclic block conditional gradient method for convex optimization problems
- Linear convergence of proximal gradient algorithm with extrapolation for a class of nonconvex nonsmooth minimization problems
- Asynchronous parallel primal-dual block coordinate update methods for affinely constrained convex programs
- Visualizing the effects of a changing distance on data using continuous embeddings
- Block decomposition methods for total variation by primal-dual stitching
- Alternating direction method of multipliers with variable metric indefinite proximal terms for convex optimization
- An accelerated randomized proximal coordinate gradient method and its application to regularized empirical risk minimization
- Variable metric inexact line-search-based methods for nonsmooth optimization
- RSG: Beating Subgradient Method without Smoothness and Strong Convexity
- A highly efficient semismooth Newton augmented Lagrangian method for solving lasso problems
- Hybrid Jacobian and Gauss-Seidel proximal block coordinate update methods for linearly constrained convex programming
- Combining line search and trust-region methods for ℓ1-minimization
- Oracle inequalities for sparse additive quantile regression in reproducing kernel Hilbert space
- Structured regularization for conditional Gaussian graphical models
- A First-Order Optimization Algorithm for Statistical Learning with Hierarchical Sparsity Structure
- Optimization methods for large-scale machine learning
- Distributed block-diagonal approximation methods for regularized empirical risk minimization
- Sparse minimum discrepancy approach to sufficient dimension reduction with simultaneous variable selection in ultrahigh dimension
- Blocks of coordinates, stochastic programming, and markets
- Block-coordinate primal-dual method for nonsmooth minimization over linear constraints
- Stochastic block-coordinate gradient projection algorithms for submodular maximization
- On faster convergence of cyclic block coordinate descent-type methods for strongly convex minimization
- Sequential threshold control in descent splitting methods for decomposable optimization problems
- SOR- and Jacobi-type iterative methods for solving \(\ell_1 - \ell_2\) problems by way of Fenchel duality
- A coordinate-descent primal-dual algorithm with large step size and possibly nonseparable functions
- Error bounds for non-polyhedral convex optimization and applications to linear convergence of FDM and PGM
- Distributionally robust inverse covariance estimation: the Wasserstein shrinkage estimator
- A unified approach to error bounds for structured convex optimization problems
- Nonparametric additive model with grouped Lasso and maximizing area under the ROC curve
- A fast active set block coordinate descent algorithm for \(\ell_1\)-regularized least squares
- Calculus of the exponent of Kurdyka-Łojasiewicz inequality and its applications to linear convergence of first-order methods
- Orbital minimization method with \(\ell^{1}\) regularization
- A truncated Newton algorithm for nonconvex sparse recovery
- A globally convergent algorithm for nonconvex optimization based on block coordinate update
- A parallel line search subspace correction method for composite convex optimization
- Non-concave penalization in linear mixed-effect models and regularized selection of fixed effects
- Gradient-based method with active set strategy for \(\ell _1\) optimization
- On stochastic mirror-prox algorithms for stochastic Cartesian variational inequalities: randomized block coordinate and optimal averaging schemes
- Nonmonotone gradient methods for vector optimization with a portfolio optimization application
- scientific article; zbMATH DE number 4078635 (no title available)
- A coordinate gradient descent method for nonsmooth nonseparable minimization
- Distributed block coordinate descent for minimizing partially separable functions
- A stochastic semismooth Newton method for nonsmooth nonconvex optimization
- A block-coordinate descent method for linearly constrained minimization problem
- Low complexity regularization of linear inverse problems
- Further properties of the forward-backward envelope with applications to difference-of-convex programming
- A block coordinate variable metric linesearch based proximal gradient method
- A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo-Tseng error bound property
- Linearity identification for general partial linear single-index models
- On the iteration complexity of cyclic coordinate gradient descent methods
- On the linear convergence of forward-backward splitting method. I: Convergence analysis
- On proximal gradient method for the convex problems regularized with the group reproducing kernel norm
- A block symmetric Gauss-Seidel decomposition theorem for convex composite quadratic programming and its applications
- On the convergence of asynchronous parallel iteration with unbounded delays
- A flexible coordinate descent method
- A new spectral method for \(l_1\)-regularized minimization
- A regularized semi-smooth Newton method with projection steps for composite convex programs
- (Robust) edge-based semidefinite programming relaxation of sensor network localization
- Activity identification and local linear convergence of forward-backward-type methods
- Non-overlapping domain decomposition methods for dual total variation based image denoising
- Extended ADMM and BCD for nonseparable convex minimization models with quadratic coupling terms: convergence analysis and insights
- Linear convergence of proximal incremental aggregated gradient method for nonconvex nonsmooth minimization problems
- High-performance statistical computing in the computing environments of the 2020s
- A fast conjugate gradient algorithm with active set prediction for ℓ1 optimization
- Block coordinate type methods for optimization and learning
- Markov chain block coordinate descent
- Fully asynchronous stochastic coordinate descent: a tight lower bound on the parallelism achieving linear speedup
- An inexact successive quadratic approximation method for a class of difference-of-convex optimization problems
- The generalized equivalence of regularization and min-max robustification in linear mixed models
- An accelerated coordinate gradient descent algorithm for non-separable composite optimization
- Second order semi-smooth proximal Newton methods in Hilbert spaces
- Dynamical modeling for non-Gaussian data with high-dimensional sparse ordinary differential equations
- Kurdyka-Łojasiewicz property of zero-norm composite functions
- A block inertial Bregman proximal algorithm for nonsmooth nonconvex problems with application to symmetric nonnegative matrix tri-factorization
- Linear convergence of prox-SVRG method for separable non-smooth convex optimization problems under bounded metric subregularity
- Accelerating block coordinate descent methods with identification strategies
- Global complexity analysis of inexact successive quadratic approximation methods for regularized optimization under mild assumptions
- Two fast vector-wise update algorithms for orthogonal nonnegative matrix factorization with sparsity constraint
- A preconditioned conjugate gradient method with active set strategy for \(\ell_1\)-regularized least squares
- An inexact Riemannian proximal gradient method
- Bregman Finito/MISO for nonconvex regularized finite sum minimization without Lipschitz gradient continuity
- Convergence rate of block-coordinate maximization Burer-Monteiro method for solving large SDPs
- A coordinate descent method for total variation minimization
- A Bregman forward-backward linesearch algorithm for nonconvex composite optimization: superlinear convergence to nonisolated local minima
- Globalized inexact proximal Newton-type methods for nonconvex composite functions
- An inexact PAM method for computing Wasserstein barycenter with unknown supports
- An alternating direction method of multipliers with the BFGS update for structured convex quadratic optimization
- An elastic net penalized small area model combining unit- and area-level data for regional hypertension prevalence estimation
- Inexact variable metric stochastic block-coordinate descent for regularized optimization
- Linear convergence of inexact descent method and inexact proximal gradient algorithms for lower-order regularization problems
- New convergence results for the inexact variable metric forward-backward method
- Primal path algorithm for compositional data analysis
- A new convergence analysis for the Volterra series representation of nonlinear systems
- Perturbation techniques for convergence analysis of proximal gradient method and other first-order algorithms via variational analysis
- A coordinate descent homotopy method for linearly constrained nonsmooth convex minimization
- Iteration complexity of a block coordinate gradient descent method for convex optimization
- Randomized block proximal damped Newton method for composite self-concordant minimization
- A joint estimation approach to sparse additive ordinary differential equations