Iterative hard thresholding for compressed sensing

From MaRDI portal

Publication: 734323


DOI: 10.1016/j.acha.2009.04.002
zbMath: 1174.94008
arXiv: 0805.0510
MaRDI QID: Q734323

Thomas Blumensath, Michael E. Davies

Publication date: 20 October 2009

Published in: Applied and Computational Harmonic Analysis

Full work available at URL: https://arxiv.org/abs/0805.0510


94A13: Detection theory in information and communication theory

94A20: Sampling theory in information and communication theory
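
For orientation, the algorithm studied in this paper is the iterative hard thresholding (IHT) iteration \(x^{n+1} = H_s(x^n + \Phi^T(y - \Phi x^n))\), where \(H_s\) keeps the \(s\) largest-magnitude entries and zeroes the rest, and the convergence analysis assumes the sensing matrix is rescaled so that its spectral norm is below one. The NumPy sketch below is only an illustration of that update under those assumptions; the function name, iteration count, and synthetic test problem are illustrative choices, not taken from the paper or from this portal entry.

import numpy as np

def iht(Phi, y, s, n_iter=300):
    """Minimal iterative hard thresholding sketch: x <- H_s(x + Phi^T (y - Phi x))."""
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        x = x + Phi.T @ (y - Phi @ x)                # unit-step gradient step on 0.5*||y - Phi x||^2
        small = np.argpartition(np.abs(x), -s)[:-s]  # indices of all but the s largest magnitudes
        x[small] = 0.0                               # hard threshold: keep an s-sparse iterate
    return x

# Illustrative synthetic test (hypothetical sizes): recover a 3-sparse vector from 32 measurements.
rng = np.random.default_rng(0)
m, n, s = 32, 64, 3
Phi = rng.standard_normal((m, n))
Phi /= 1.01 * np.linalg.norm(Phi, 2)   # rescale so ||Phi||_2 < 1, as the analysis assumes
x_true = np.zeros(n)
x_true[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s)
y = Phi @ x_true
x_hat = iht(Phi, y, s)
print("recovery error:", np.linalg.norm(x_hat - x_true))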


Related Items

Multicompartment magnetic resonance fingerprinting, CGIHT: conjugate gradient iterative hard thresholding for compressed sensing and matrix completion, Weighted \(\ell_1\)-minimization for sparse recovery under arbitrary prior information, Near-optimal estimation of simultaneously sparse and low-rank matrices from nested linear measurements, Polynomial approximation via compressed sensing of high-dimensional functions on lower sets, A Tight Bound of Hard Thresholding, Fusion frames and distributed sparsity, Spectral Compressed Sensing via Projected Gradient Descent, The Sparse MLE for Ultrahigh-Dimensional Feature Screening, An adaptive inverse scale space method for compressed sensing, \(\ell_0\) Minimization for wavelet frame based image restoration, Proximal Heterogeneous Block Implicit-Explicit Method and Application to Blind Ptychographic Diffraction Imaging, Quasi-linear Compressed Sensing, Minimization of \(\ell_{1-2}\) for Compressed Sensing, A proximal point method for the sum of maximal monotone operators, A Survey of Compressed Sensing, Quantization and Compressive Sensing, Compressive Gaussian Mixture Estimation, Structured Sparsity: Discrete and Convex Approaches, Tensor Completion in Hierarchical Tensor Representations, Sparse estimation of Cox proportional hazards models via approximated information criteria, A note on the complexity of proximal iterative hard thresholding algorithm, Best subset selection via a modern optimization lens, Relationship between the optimal solutions of least squares regularized with \(\ell_0\)-norm and constrained by \(k\)-sparsity, Minimum \(n\)-rank approximation via iterative hard thresholding, Recent development of dual-dictionary learning approach in medical image analysis and reconstruction, Conjugate gradient acceleration of iteratively re-weighted least squares methods, Design of wideband fractional-order differentiator using interlaced sampling method, Error bounds and stability in the \(\ell_0\) regularized for CT reconstruction from small projections, Robust sparse phase retrieval made easy, Reweighted \(\ell_1\) minimization method for stochastic elliptic differential equations, The essential ability of sparse reconstruction of different compressive sensing strategies, Sparse microwave imaging: principles and applications, Compressed sensing SAR imaging based on sparse representation in fractional Fourier domain, Restricted isometries for partial random circulant matrices, Sparse Legendre expansions via \(\ell_1\)-minimization, Compressive sensing of analog signals using discrete prolate spheroidal sequences, Iterative hard thresholding methods for \(l_0\) regularized convex cone programming, Iterative reweighted minimization methods for \(l_p\) regularized unconstrained nonlinear programming, Optimal computational and statistical rates of convergence for sparse nonconvex learning problems, Tensor networks and hierarchical tensors for the solution of high-dimensional partial differential equations, Sparse signal recovery using a new class of random matrices, Convergence of fixed-point continuation algorithms for matrix rank minimization, Gradient iteration with \(\ell_p\)-norm constraints, A non-adapted sparse approximation of PDEs with stochastic inputs, Compressed sensing with coherent and redundant dictionaries, Phase transitions for greedy sparse approximation algorithms, Projected gradient iteration for nonlinear operator equation, Democracy in action: quantization, saturation, and compressive sensing, A note on the complexity of \(L_p\) minimization, On support sizes of restricted isometry constants, Iterative hard thresholding for compressed sensing, Iterative hard thresholding based on randomized Kaczmarz method, Hard thresholding pursuit algorithms: number of iterations, GPU accelerated greedy algorithms for compressed sensing, A primal dual active set with continuation algorithm for the \(\ell^0\)-regularized optimization problem, Efficient nonconvex sparse group feature selection via continuous and discrete optimization, Adaptive projected gradient thresholding methods for constrained \(l_0\) problems, Sampling in the analysis transform domain, Interpolation via weighted \(\ell_1\) minimization, MOEA/D with chain-based random local search for sparse optimization, Observable dictionary learning for high-dimensional statistical inference, Expander \(\ell_0\)-decoding, Recovery of block sparse signals under the conditions on block RIC and ROC by BOMP and BOMMP, Existence and convergence analysis of \(\ell_0\) and \(\ell_2\) regularizations for limited-angle CT reconstruction, A globally convergent algorithm for nonconvex optimization based on block coordinate update, An iterative support shrinking algorithm for non-Lipschitz optimization in image restoration, A new piecewise quadratic approximation approach for \(L_0\) norm minimization problem, Efficient projected gradient methods for cardinality constrained optimization, Learning semidefinite regularizers, Convergence radius and sample complexity of ITKM algorithms for dictionary learning, A non-smooth and non-convex regularization method for limited-angle CT image reconstruction, Capped \(\ell_p\) approximations for the composite \(\ell_0\) regularization problem, Approximately normalized iterative hard thresholding for nonlinear compressive sensing, Compressive sensing in signal processing: algorithms and transform domain formulations, Broken adaptive ridge regression and its asymptotic properties, Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods, Sparse signal recovery via ECME thresholding pursuits, Fast and provable algorithms for spectrally sparse signal reconstruction via low-rank Hankel matrix completion, Sparse signal inversion with impulsive noise by dual spectral projected gradient method, Matrix recipes for hard thresholding methods, An evaluation of the sparsity degree for sparse recovery with deterministic measurement matrices, Spectral compressive sensing, Compressed sensing with sparse binary matrices: instance optimal error guarantees in near-optimal time, Convergence of projected Landweber iteration for matrix rank minimization, Fast thresholding algorithms with feedbacks for sparse signal recovery, Sparse recovery with coherent tight frames via analysis Dantzig selector and analysis LASSO, Greedy approximation in convex optimization, Near oracle performance and block analysis of signal space greedy methods, Greedy signal space methods for incoherence and beyond, Sparsity optimization in design of multidimensional filter networks, Improved sparse Fourier approximation results: Faster implementations and stronger guarantees, Low rank tensor recovery via iterative hard thresholding, Adaptive step-size matching pursuit algorithm for practical sparse reconstruction, Quantization of compressive samples with stable and robust recovery, Generalized sparse recovery model and its neural dynamical optimization method for compressed sensing, Submodular functions: from discrete to continuous domains, Non-iterative CS recovery algorithm for surveillance applications: subjective and real-time experience, Greedy-like algorithms for the cosparse analysis model, Bounds of restricted isometry constants in extreme asymptotics: formulae for Gaussian matrices, Wavelet optimal estimations for a density with some additive noises, Optimized projections for compressed sensing via rank-constrained nearest correlation matrix, Fast and RIP-optimal transforms, Convergence analysis of projected gradient descent for Schatten-\(p\) nonconvex matrix recovery, Nonconvex sorted \(\ell_1\) minimization for sparse approximation, Nonlinear regularization techniques for seismic tomography, Compressive Sensing, Duality and Convex Programming, Low Complexity Regularization of Linear Inverse Problems, Noise-Shaping Quantization Methods for Frame-Based and Compressive Sampling Systems, On the Minimization Over Sparse Symmetric Sets: Projections, Optimality Conditions, and Algorithms, Guarantees of Riemannian Optimization for Low Rank Matrix Recovery, Suprema of Chaos Processes and the Restricted Isometry Property, Performance comparisons of greedy algorithms in compressed sensing, A Generalized Class of Hard Thresholding Algorithms for Sparse Signal Recovery, Average Performance of the Sparsest Approximation Using a General Dictionary, Multi-receivers and sparse-pixel pseudo-thermal light source for compressive ghost imaging against turbulence, Fast Phase Retrieval from Local Correlation Measurements, Fast sparse reconstruction: Greedy inverse scale space flows, Sparsity Based Methods for Overparameterized Variational Problems


Uses Software


Cites Work