Fast iterative regularization by reusing data
Publication: 6583087
DOI: 10.1515/JIIP-2023-0009
MaRDI QID: Q6583087
Authors: Cristian Vega, Cesare Molinari, Lorenzo Rosasco, Silvia Villa
Publication date: 6 August 2024
Published in: Journal of Inverse and Ill-posed Problems
Keywords: iterative regularization; Landweber method; early stopping; stability and convergence analysis; primal-dual splitting algorithms
Mathematics Subject Classification: Numerical optimization and variational techniques (65K10); Convex programming (90C25); Numerical methods involving duality (49M29)
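The keywords above name the Landweber method with early stopping, the classical example of iterative regularization. A minimal illustrative sketch of that idea (not the paper's own algorithm) is the iteration x_{k+1} = x_k + τ Aᵀ(y − A x_k), stopped early by the discrepancy principle when the noise level δ is known:

```python
import numpy as np

def landweber(A, y, step=None, delta=0.0, tau=1.1, max_iter=500):
    """Landweber iteration with early stopping (illustrative sketch).

    Iterates x_{k+1} = x_k + step * A.T @ (y - A @ x_k) and, when a noise
    level `delta` is given, stops as soon as the residual norm drops to
    tau * delta (the discrepancy principle). All parameter names here are
    illustrative choices, not notation from the paper.
    """
    if step is None:
        # Convergence requires step < 2 / ||A||_2^2; use 1 / ||A||_2^2.
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for k in range(max_iter):
        residual = y - A @ x
        # Discrepancy principle: stop once the residual matches the noise.
        if delta > 0 and np.linalg.norm(residual) <= tau * delta:
            break
        x = x + step * A.T @ residual
    return x, k
```

On noiseless consistent data the iteration converges to a least-squares solution; with noisy data, stopping early at the discrepancy level plays the role of the regularization parameter.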
Cites Work
- NESTA: A fast and accurate first-order method for sparse recovery
- Nonlinear total variation based noise removal algorithms
- A randomized Kaczmarz algorithm with exponential convergence
- Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
- Generalized Cross-Validation as a Method for Choosing a Good Ridge Parameter
- Regularization and Variable Selection Via the Elastic Net
- A Singular Value Thresholding Algorithm for Matrix Completion
- Support Vector Machines
- Boosting with early stopping: convergence and consistency
- On early stopping in gradient descent learning
- Exact matrix completion via convex optimization
- Mirror descent and nonlinear projected subgradient methods for convex optimization.
- Feature-Oriented Image Enhancement Using Shock Filters
- Understanding machine learning. From theory to algorithms
- On the adaptive elastic net with a diverging number of parameters
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Compressed sensing
- Signal Recovery by Proximal Forward-Backward Splitting
- Image recovery via total variation minimization and related problems
- An algorithm for total variation minimization and applications
- A first-order primal-dual algorithm for convex problems with applications to imaging
- A mathematical introduction to compressive sensing
- Bregman Iterative Algorithms for $\ell_1$-Minimization with Applications to Compressed Sensing
- An Iterative Regularization Method for Total Variation-Based Image Restoration
- Iterative regularization methods for nonlinear ill-posed problems
- A modified Landweber iteration for solving parameter estimation problems
- A splitting algorithm for dual monotone inclusions involving cocoercive operators
- Dual averaging methods for regularized stochastic learning and online optimization
- A primal-dual splitting method for convex optimization involving Lipschitzian, proximable and linear composite terms
- An Iteration Formula for Fredholm Integral Equations of the First Kind
- Linearized Bregman Iterations for Frame-Based Image Deblurring
- Efficient online and batch learning using forward backward splitting
- Forward-Douglas–Rachford splitting and forward-partial inverse method for solving monotone inclusions
- A Douglas-Rachford splitting method for solving equilibrium problems
- Convex analysis and monotone operator theory in Hilbert spaces
- Extensions of compressed sensing
- Early stopping and non-parametric regression: an optimal data-dependent stopping rule
- Iterative total variation schemes for nonlinear inverse problems
- On regularization algorithms in learning theory
- A generalized forward-backward splitting
- Geometric approach to error-correcting codes and reconstruction of signals
- Elastic-net regularization in learning theory
- Analysis and generalizations of the linearized Bregman method
- Fast linearized Bregman iteration for compressive sensing and sparse denoising
- Inexact first-order primal-dual algorithms
- A projected primal-dual method for solving constrained monotone inclusions
- Error estimation for Bregman iterations and inverse scale space methods in image restoration
- Convergence rates and source conditions for Tikhonov regularization with sparsity constraints
- AdaBoost is consistent
- A stochastic Bregman primal-dual splitting algorithm for composite optimization
- Convergence of diagonally stationary sequences in convex optimization
- Elastic-net regularization: error estimates and active set methods
- Lagrangian penalization scheme with parallel forward-backward splitting
- Stochastic Primal-Dual Hybrid Gradient Algorithm with Arbitrary Sampling and Imaging Applications
- Convergence properties of a randomized primal-dual algorithm with applications to parallel MRI
- On Nesterov acceleration for Landweber iteration of linear ill-posed problems
- Iterative regularization via dual diagonal descent
- Iterative regularization with a general penalty term-theory and application to \(L^{1}\) and \(TV\) regularization
- Linear convergence of the randomized sparse Kaczmarz method
- Modern regularization methods for inverse problems
- Error estimates for general fidelities
- Random activations in primal-dual splittings for monotone inclusions with a priori information
- Alternating forward-backward splitting for linearly constrained optimization problems
- Convergence rates of forward-Douglas-Rachford splitting method
- On the convergence of stochastic primal-dual hybrid gradient
- Accelerated iterative regularization via dual diagonal descent
- Generalized conditional gradient with augmented Lagrangian for composite minimization
- Inexact and stochastic generalized conditional gradient with augmented Lagrangian and proximal step
Cited In (1)
This page was built for publication: Fast iterative regularization by reusing data