Pages that link to "Item:Q1667162"
From MaRDI portal
The following pages link to A fast dual proximal gradient algorithm for convex minimization and applications (Q1667162):
Displayed 29 items.
- Acceleration of the PDHGM on partially strongly convex functions (Q1703999)
- Rate of convergence analysis of dual-based variables decomposition methods for strongly convex problems (Q1785469)
- The Douglas-Rachford algorithm in the affine-convex case (Q1790199)
- Proximal algorithms in statistics and machine learning (Q1790304)
- The Glowinski-Le Tallec splitting method revisited: a general convergence and convergence rate analysis (Q1983721)
- Finding best approximation pairs for two intersections of closed convex sets (Q2070342)
- A dual Bregman proximal gradient method for relatively-strongly convex optimization (Q2092292)
- A general double-proximal gradient algorithm for d.c. programming (Q2330650)
- On convergence analysis of dual proximal-gradient methods with approximate gradient for a class of nonsmooth convex minimization problems (Q2515292)
- Iteration complexity analysis of dual first-order methods for conic convex programming (Q2815553)
- An Alternating Semiproximal Method for Nonconvex Regularized Structured Total Least Squares Problems (Q2818270)
- Decomposition Methods for Sparse Matrix Nearness Problems (Q3456880)
- A Smooth Primal-Dual Optimization Framework for Nonsmooth Composite Convex Minimization (Q4600841)
- Another Look at the Fast Iterative Shrinkage/Thresholding Algorithm (FISTA) (Q4603039)
- CGIHT: conjugate gradient iterative hard thresholding for compressed sensing and matrix completion (Q4603591)
- FOM – a MATLAB toolbox of first-order methods for solving convex optimization problems (Q4646679)
- On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent (Q4969070)
- Proximal Gradient Methods for Machine Learning and Imaging (Q5028165)
- Primal–dual accelerated gradient methods with small-dimensional relaxation oracle (Q5085262)
- Accelerated Iterative Regularization via Dual Diagonal Descent (Q5853571)
- Dual Space Preconditioning for Gradient Descent (Q5857297)
- Stochastic proximal linear method for structured non-convex problems (Q5858986)
- A dual approach for optimal algorithms in distributed optimization over networks (Q5859014)
- Implicit regularization with strongly convex bias: Stability and acceleration (Q5873931)
- Convergence Analysis of the Proximal Gradient Method in the Presence of the Kurdyka–Łojasiewicz Property Without Global Lipschitz Assumptions (Q6071886)
- Composite optimization with coupling constraints via dual proximal gradient method with applications to asynchronous networks (Q6085458)
- Deautoconvolution in the two-dimensional case (Q6163341)
- Dual gradient method for ill-posed problems using multiple repeated measurement data (Q6165999)
- A Newton-type proximal gradient method for nonlinear multi-objective optimization problems (Q6175566)