Accelerated Iterative Regularization via Dual Diagonal Descent
DOI: 10.1137/19M1308888 · zbMath: 1461.90093 · arXiv: 1912.12153 · OpenAlex: W2997044101 · MaRDI QID: Q5853571
Lorenzo Rosasco, Luca Calatroni, Guillaume Garrigos, Silvia Villa
Publication date: 10 March 2021
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1912.12153
Keywords: duality; acceleration; iterative regularization; forward-backward splitting; stability and convergence analysis; diagonal methods
MSC: Convex programming (90C25) · Computing methodologies for image processing (68U10) · Linear programming (90C05) · Duality theory (optimization) (49N15) · Inverse problems in optimal control (49N45)
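As background for the "acceleration" and "forward-backward splitting" keywords, the cited work of Beck and Teboulle ("A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems", listed under Cites Work below) studies the accelerated forward-backward iteration. A standard textbook form of that iteration, for minimizing $f + g$ with $f$ convex and $L$-smooth and $g$ convex with an easily computable proximal map, is sketched here; this is a generic illustration of the underlying scheme, not the specific accelerated dual diagonal descent method of the present paper.
\[
\begin{aligned}
x_k &= \operatorname{prox}_{g/L}\!\Big(y_k - \tfrac{1}{L}\nabla f(y_k)\Big),\\
t_{k+1} &= \frac{1 + \sqrt{1 + 4t_k^2}}{2},\\
y_{k+1} &= x_k + \frac{t_k - 1}{t_{k+1}}\,(x_k - x_{k-1}),
\end{aligned}
\qquad t_1 = 1,\quad y_1 = x_0 .
\]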
Cites Work
- Nonlinear total variation based noise removal algorithms
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- First-order methods of smooth convex optimization with inexact oracle
- On the convergence of the iterates of the ``fast iterative shrinkage/thresholding algorithm''
- Asymptotic behavior of gradient-like dynamical systems involving inertia and multiscale aspects
- Asymptotics for some vibro-impact problems with a linear dissipation term
- Iterative regularization methods for nonlinear ill-posed problems
- Error estimation for Bregman iterations and inverse scale space methods in image restoration
- Estimation of the mean of a multivariate normal distribution
- Convergence of diagonally stationary sequences in convex optimization
- Image recovery via total variation minimization and related problems
- Introductory lectures on convex optimization. A basic course.
- A fast dual proximal gradient algorithm for convex minimization and applications
- Iterative regularization via dual diagonal descent
- Inertial forward-backward algorithms with perturbations: application to Tikhonov regularization
- Level-set methods for convex optimization
- Mirror descent and nonlinear projected subgradient methods for convex optimization.
- A dynamical approach to convex minimization coupling approximation with the steepest descent method
- Preconditioned iterative regularization in Banach spaces
- Convergence rate of inertial forward-backward algorithm beyond Nesterov's rule
- Asymptotic behavior of coupled dynamical systems with multiscale aspects
- On Nesterov acceleration for Landweber iteration of linear ill-posed problems
- Combining fast inertial dynamics for convex optimization with Tikhonov regularization
- Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity
- Error estimates for general fidelities
- The Rate of Convergence of Nesterov's Accelerated Forward-Backward Method is Actually Faster Than $1/k^2$
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
- Accelerated and Inexact Forward-Backward Algorithms
- Subspace Correction Methods for a Class of Nonsmooth and Nonadditive Convex Variational Problems with Mixed $L^1/L^2$ Data-Fidelity in Image Processing
- Infimal Convolution of Data Discrepancies for Mixed Noise Removal
- Iterative regularization with a general penalty term: theory and application to $L^1$ and TV regularization
- Splitting forward-backward penalty scheme for constrained variational problems
- Coupling Forward-Backward with Penalty Schemes and Parallel Splitting for Constrained Variational Inequalities
- Stability of Over-Relaxations for the Forward-Backward Algorithm, Application to FISTA
- Support Vector Machines
- Probing the Pareto Frontier for Basis Pursuit Solutions
- Iterative methods for nonlinear ill-posed problems in Banach spaces: convergence and applications to parameter identification problems
- Iterative total variation schemes for nonlinear inverse problems
- On the Minimizing Property of a Second Order Dissipative System in Hilbert Spaces
- Asymptotic behavior of nonautonomous monotone and subgradient evolution equations
- The Differential Inclusion Modeling FISTA Algorithm and Optimality of Convergence Rate in the Case $b \leq 3$
- Viscosity Solutions of Minimization Problems
- Convergence rates for an iteratively regularized Newton–Landweber iteration in Banach space
- A Dynamical Approach to an Inertial Forward-Backward Algorithm for Convex Minimization
- Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3
- A Guide to the TV Zoo
- Stein Unbiased GrAdient estimator of the Risk (SUGAR) for Multiple Parameter Selection
- Modern regularization methods for inverse problems
- Regularization and Variable Selection Via the Elastic Net
- Model Selection and Estimation in Regression with Grouped Variables
- Signal Recovery by Proximal Forward-Backward Splitting
- Total Generalized Variation
- Convex analysis and monotone operator theory in Hilbert spaces