Linear convergence of iterative soft-thresholding
From MaRDI portal
Publication: 734951
DOI: 10.1007/s00041-008-9041-1 · zbMath: 1175.65061 · arXiv: 0709.1598 · OpenAlex: W3101710822 · Wikidata: Q57562038 · Scholia: Q57562038 · MaRDI QID: Q734951
Dirk A. Lorenz, Kristian Bredies
Publication date: 14 October 2009
Published in: The Journal of Fourier Analysis and Applications
Full work available at URL: https://arxiv.org/abs/0709.1598
Keywords: convergence; Hilbert spaces; linear operator equations; gradient projection method; generalized gradient methods; iterative soft-thresholding algorithms
Numerical solutions to equations with linear operators (65J10); Equations and inequalities involving linear operators, with vector unknowns (47A50)
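The paper catalogued here concerns the iterative soft-thresholding algorithm (ISTA) for linear operator equations with sparsity constraints. As context for the entry, here is a minimal finite-dimensional sketch of the standard iteration for minimizing \(\tfrac12\|Ax-b\|^2 + \lambda\|x\|_1\); the function names and defaults are illustrative, not taken from the paper:

```python
import numpy as np

def soft_threshold(x, t):
    # Componentwise soft-thresholding: S_t(x) = sign(x) * max(|x| - t, 0)
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, b, lam, tau=None, iters=500):
    # Iterative soft-thresholding for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    # The step size tau must satisfy tau < 2 / ||A||^2 (spectral norm);
    # tau = 1 / ||A||^2 is a common safe default.
    if tau is None:
        tau = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        # Gradient step on the smooth part, then proximal (shrinkage) step
        x = soft_threshold(x - tau * A.T @ (A @ x - b), tau * lam)
    return x
```

For orthonormal \(A\) the minimizer is reached in one step, since the iteration reduces to soft-thresholding of \(A^{T}b\); the publication analyzes when the general iteration converges linearly.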
Related Items
- Proximal Splitting Methods in Signal Processing
- Sparse Online Variational Bayesian Regression
- Recursive SURE for iterative reweighted least square algorithms
- Non-convex sparse regularisation
- Sensitivity Analysis for Mirror-Stratifiable Convex Functions
- Iterative positive thresholding algorithm for non-negative sparse optimization
- Influence factors of sparse microwave imaging radar system performance: approaches to waveform design and platform motion analysis
- Iterative methods based on soft thresholding of hierarchical tensors
- Adaptive wavelet methods and sparsity reconstruction for inverse heat conduction problems
- Quadratic growth conditions and uniqueness of optimal solution to Lasso
- Dualization of signal recovery problems
- Stability of the elastic net estimator
- Lagrangian approach and shape gradient for inverse problem of breaking line identification in solid: contact with adhesion
- Convergence rates of gradient methods for convex optimization in the space of measures
- Thresholding gradient methods in Hilbert spaces: support identification and linear convergence
- A simple homotopy proximal mapping algorithm for compressive sensing
- Convergence Rate Analysis of Accelerated Forward-Backward Algorithm with Generalized Nesterov Momentum Scheme
- Convergence of the forward-backward algorithm: beyond the worst-case with the help of geometry
- Projected gradient iteration for nonlinear operator equation
- Activity Identification and Local Linear Convergence of Forward--Backward-type Methods
- Descent gradient methods for nonsmooth minimization problems in ill-posed problems
- Flexible sparse regularization
- Optimization methods for regularization-based ill-posed problems: a survey and a multi-objective framework
- Optimal convergence rates for sparsity promoting wavelet-regularization in Besov spaces
- Low-rank representation-based object tracking using multitask feature learning with joint sparsity
- Joint super-resolution image reconstruction and parameter identification in imaging operator: analysis of bilinear operator equations, numerical solution, and application to magnetic particle imaging
- An ADMM algorithm for second-order TV-based MR image reconstruction
- A note on the minimization of a Tikhonov functional with ℓ1-penalty
- A Barzilai-Borwein type method for minimizing composite functions
- Fast thresholding algorithms with feedbacks for sparse signal recovery
- A Projection Proximal-Point Algorithm for ℓ1 Minimization
- Measurement matrix optimization via mutual coherence minimization for compressively sensed signals reconstruction
- IMRO: A Proximal Quasi-Newton Method for Solving $\ell_1$-Regularized Least Squares Problems
- An introduction to continuous optimization for imaging
- Gradient iteration with \(\ell_p\)-norm constraints
- Local linear convergence analysis of Primal–Dual splitting methods
- Local and global convergence of a general inertial proximal splitting scheme for minimizing composite functions
- Iterative thresholding compressed sensing MRI based on contourlet transform
- Linear convergence of inexact descent method and inexact proximal gradient algorithms for lower-order regularization problems
- On the linear convergence of forward-backward splitting method. I: Convergence analysis
- Modified Fejér sequences and applications
- An iterative thresholding-like algorithm for inverse problems with sparsity constraints in Banach space
- Far Field Splitting by Iteratively Reweighted $\ell^1$ Minimization
- Low Complexity Regularization of Linear Inverse Problems
- Necessary and sufficient conditions for linear convergence of ℓ1-regularization
- Elastic-Net Regularization: Iterative Algorithms and Asymptotic Behavior of Solutions
- Variable Metric Inexact Line-Search-Based Methods for Nonsmooth Optimization
- Efficient dictionary learning with sparseness-enforcing projections
- Modified iterations for data-sparse solution of linear systems
- Sparse frame DOA estimations via a rank-one correlation model for low SNR and limited snapshots
- Fast global convergence of gradient methods for high-dimensional statistical recovery
- Sparse group fused Lasso for model segmentation: a hybrid approach
- Scaling Techniques for $\epsilon$-Subgradient Methods
- Resonator Networks, 2: Factorization Performance and Capacity Compared to Optimization-Based Methods
- Convergence rates of forward-Douglas-Rachford splitting method
- Frame soft shrinkage operators are proximity operators
- Eigenvalue-free iterative shrinkage-thresholding algorithm for solving the linear inverse problems
- Multilevel preconditioning and adaptive sparse solution of inverse problems
- Deep Neural Networks for Inverse Problems with Pseudodifferential Operators: An Application to Limited-Angle Tomography
- Minimization of non-smooth, non-convex functionals by iterative thresholding
- An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization
Cites Work
- Accelerated projected gradient method for linear inverse problems with sparsity constraints
- A generalized conditional gradient method and its connection to an iterative shrinkage method
- Highly sparse representations from dictionaries are unique and independent of the sparseness measure
- Characteristic inequalities of uniformly convex and uniformly smooth Banach spaces
- Least angle regression. (With discussion)
- Global uniqueness for a two-dimensional inverse boundary value problem
- Coordinate and subspace optimization methods for linear least squares with non-quadratic regularization
- Nonlinear iterative methods for linear ill-posed problems in Banach spaces
- A semismooth Newton method for Tikhonov functionals with sparsity constraints
- Convergence rates and source conditions for Tikhonov regularization with sparsity constraints
- Stable recovery of sparse overcomplete representations in the presence of noise
- Recovery of Exact Sparse Representations in the Presence of Bounded Noise
- Iterated Hard Shrinkage for Minimization Problems with Sparsity Constraints
- Recovery Algorithms for Vector-Valued Data with Joint Sparsity Constraints
- Global and Asymptotic Convergence Rate Estimates for a Class of Projected Gradient Processes
- The Interior Radon Transform
- Bregman Monotone Optimization Algorithms
- A new approach to variable selection in least squares problems
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- An iterative algorithm for nonlinear inverse problems with joint sparsity constraints in vector-valued regimes and an application to color image inpainting
- Signal Recovery by Proximal Forward-Backward Splitting
- Convex programming in Hilbert space
- Regularization of ill-posed problems in Banach spaces: convergence rates