Pages that link to "Item:Q1675267"
The following pages link to A unified approach to error bounds for structured convex optimization problems (Q1675267):
- Convergence rates of subgradient methods for quasi-convex optimization problems (Q782917) (← links)
- A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo-Tseng error bound property (Q1739040) (← links)
- Calculus of the exponent of Kurdyka-Łojasiewicz inequality and its applications to linear convergence of first-order methods (Q1785009) (← links)
- Double fused Lasso penalized LAD for matrix regression (Q2009580) (← links)
- New characterizations of Hoffman constants for systems of linear constraints (Q2020601) (← links)
- Kurdyka-Łojasiewicz property of zero-norm composite functions (Q2026719) (← links)
- On the linear convergence of forward-backward splitting method. I: Convergence analysis (Q2031953) (← links)
- Double fused Lasso regularized regression with both matrix and vector valued predictors (Q2044365) (← links)
- Variational analysis perspective on linear convergence of some first order methods for nonsmooth convex optimization problems (Q2070400) (← links)
- Linear convergence of prox-SVRG method for separable non-smooth convex optimization problems under bounded metric subregularity (Q2115253) (← links)
- Perturbation techniques for convergence analysis of proximal gradient method and other first-order algorithms via variational analysis (Q2116020) (← links)
- Interior quasi-subgradient method with non-Euclidean distances for constrained quasi-convex optimization problems in Hilbert spaces (Q2141725) (← links)
- Augmented Lagrangian methods for convex matrix optimization problems (Q2158112) (← links)
- Kurdyka-Łojasiewicz exponent via inf-projection (Q2162122) (← links)
- Error bounds for non-polyhedral convex optimization and applications to linear convergence of FDM and PGM (Q2279378) (← links)
- New analysis of linear convergence of gradient-type methods via unifying error bound conditions (Q2297652) (← links)
- Faster subgradient methods for functions with Hölderian growth (Q2297653) (← links)
- Metric subregularity and/or calmness of the normal cone mapping to the \(p\)-order conic constraint system (Q2311202) (← links)
- Convergence rates of forward-Douglas-Rachford splitting method (Q2317846) (← links)
- Quadratic optimization with orthogonality constraint: explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods (Q2330648) (← links)
- On the R-superlinear convergence of the KKT residuals generated by the augmented Lagrangian method for convex composite conic programming (Q2330654) (← links)
- Linear convergence of first order methods for non-strongly convex optimization (Q2414900) (← links)
- A block symmetric Gauss-Seidel decomposition theorem for convex composite quadratic programming and its applications (Q2414911) (← links)
- Quadratic growth conditions and uniqueness of optimal solution to Lasso (Q2671439) (← links)
- Convergence of the forward-backward algorithm: beyond the worst-case with the help of geometry (Q2687067) (← links)
- Retraction-based first-order feasible methods for difference-of-convex programs with smooth inequality and simple geometric constraints (Q2692792) (← links)
- Error bound and isocost imply linear convergence of DCA-based algorithms to D-stationarity (Q2697002) (← links)
- An efficient augmented Lagrangian method with semismooth Newton solver for total generalized variation (Q2697370) (← links)
- An inexact Riemannian proximal gradient method (Q2701414) (← links)
- Characterization of the Robust Isolated Calmness for a Class of Conic Programming Problems (Q2957977) (← links)
- RSG: Beating Subgradient Method without Smoothness and Strong Convexity (Q4558142) (← links)
- Quadratic Growth Conditions for Convex Matrix Optimization Problems Associated with Spectral Functions (Q4588864) (← links)
- On the Estimation Performance and Convergence Rate of the Generalized Power Method for Phase Synchronization (Q4602339) (← links)
- A Highly Efficient Semismooth Newton Augmented Lagrangian Method for Solving Lasso Problems (Q4606653) (← links)
- On the Quadratic Convergence of the Cubic Regularization Method under a Local Error Bound Condition (Q4634142) (← links)
- A First-Order Optimization Algorithm for Statistical Learning with Hierarchical Sparsity Structure (Q5086011) (← links)
- Extended Newton-type method for inverse singular value problems with multiple and/or zero singular values (Q5123696) (← links)
- Preconditioned proximal point methods and notions of partial subregularity (Q5144479) (← links)
- On the Simplicity and Conditioning of Low Rank Semidefinite Programs (Q5162653) (← links)
- Spectral Operators of Matrices: Semismoothness and Characterizations of the Generalized Jacobian (Q5217598) (← links)
- Novel Reformulations and Efficient Algorithms for the Generalized Trust Region Subproblem (Q5231678) (← links)
- Accelerate stochastic subgradient method by leveraging local growth condition (Q5236746) (← links)
- On Degenerate Doubly Nonnegative Projection Problems (Q5868954) (← links)
- Hölderian Error Bounds and Kurdyka-Łojasiewicz Inequality for the Trust Region Subproblem (Q5870365) (← links)
- A Unified Analysis of Descent Sequences in Weakly Convex Optimization, Including Convergence Rates for Bundle Methods (Q5883316) (← links)
- Linear convergence of Frank-Wolfe for rank-one matrix recovery without strong convexity (Q6038639) (← links)
- Linearly-convergent FISTA variant for composite optimization with duality (Q6101606) (← links)
- A unified approach to synchronization problems over subgroups of the orthogonal group (Q6117023) (← links)
- An extended Ulm-like method for inverse singular value problems with multiple and/or zero singular values (Q6132998) (← links)
- Linear Convergence of a Proximal Alternating Minimization Method with Extrapolation for \(\boldsymbol{\ell_1}\)-Norm Principal Component Analysis (Q6158000) (← links)