DOI: 10.1002/cpa.20132 | zbMath: 1113.15004 | OpenAlex: W2050834445 | Wikidata: Q29392029 | Scholia: Q29392029 | MaRDI QID: Q5471032
David L. Donoho
Publication date: 6 June 2006
Published in: Communications on Pure and Applied Mathematics
Full work available at URL: https://doi.org/10.1002/cpa.20132
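As context for the related items below (this sketch is not part of the zbMATH/MaRDI record): the indexed paper concerns when the minimal \(\ell_1\)-norm solution of an underdetermined system \(Ax = b\) is also its sparsest solution. The following minimal Python sketch recasts that \(\ell_1\)-minimization (basis pursuit) problem as a linear program and solves it with SciPy's linprog; the dimensions and random instance are illustrative assumptions only.

    # Illustrative sketch (assumed example, not from the record): minimal l1-norm
    # solution of an underdetermined system A x = b, via the linear program
    #   min sum(u) + sum(v)  s.t.  A (u - v) = b,  u >= 0,  v >= 0,  x = u - v.
    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(0)
    m, n, k = 20, 80, 3                      # m equations, n unknowns, k-sparse ground truth
    A = rng.standard_normal((m, n))
    x_true = np.zeros(n)
    x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
    b = A @ x_true

    # Variables z = [u; v]; the objective equals ||x||_1 because u, v are nonnegative.
    c = np.ones(2 * n)
    A_eq = np.hstack([A, -A])
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None), method="highs")
    x_hat = res.x[:n] - res.x[n:]

    print("max recovery error:", np.max(np.abs(x_hat - x_true)))

With Gaussian measurements and a sufficiently sparse ground truth, the \(\ell_1\) solution typically coincides with the sparse one, which is the phenomenon the citing works below build on.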
Related Items:
Sparse network identifiability via compressed sensing,
Two-stage convex relaxation approach to least squares loss constrained low-rank plus sparsity optimization problems,
Best subset selection via a modern optimization lens,
Sparse solutions to an underdetermined system of linear equations via penalized Huber loss,
Rational sphere maps, linear programming, and compressed sensing,
Regularized linear system identification using atomic, nuclear and kernel-based norms: the role of the stability constraint,
Minimizers of sparsity regularized Huber loss function,
High-dimensional variable selection,
Image reconstruction from undersampled Fourier data using the polynomial annihilation transform,
Stable high-order cubature formulas for experimental data,
Computation of sparse and dense equilibrium strategies of evolutionary games,
Sparsity in penalized empirical risk minimization,
Comparison of support-vector machine and sparse representation using a modified rule-based method for automated myocardial ischemia detection,
Matrix sparsification and the sparse null space problem,
Binomiality testing and computing sparse polynomials via witness sets,
Regularity properties of non-negative sparsity sets,
Random sampling of sparse trigonometric polynomials,
Data shared Lasso: a novel tool to discover uplift,
Best subset selection, persistence in high-dimensional statistical learning and optimization under \(l_1\) constraint,
Linearized alternating direction method with adaptive penalty and warm starts for fast solving transform invariant low-rank textures,
Extended randomized Kaczmarz method for sparse least squares and impulsive noise problems,
Recovery of high-dimensional sparse signals via \(\ell_1\)-minimization,
How can we identify the sparsity structure pattern of high-dimensional data: an elementary statistical analysis to interpretable machine learning,
ParNes: A rapidly convergent algorithm for accurate recovery of sparse and approximately sparse signals,
The Dantzig selector and sparsity oracle inequalities,
Approximation accuracy, gradient methods, and error bound for structured convex optimization,
Noise-robust semi-supervised learning via fast sparse coding,
Robust alternating low-rank representation by joint \(L_p\)- and \(L_{2,p}\)-norm minimization,
The convex geometry of linear inverse problems,
Feature selection when there are many influential features,
Transformed \(\ell_1\) regularization for learning sparse deep neural networks,
A selective overview of feature screening for ultrahigh-dimensional data,
A smoothing method for sparse optimization over convex sets,
Geometric separation of singularities using combined multiscale dictionaries,
Compressed sensing and matrix completion with constant proportion of corruptions,
Non-negative and sparse spectral clustering,
Image reconstruction from Fourier data using sparsity of edges,
Discussion: Latent variable graphical model selection via convex optimization,
Rejoinder: Latent variable graphical model selection via convex optimization,
Domain adaptation for face recognition: targetize source domain bridged by common subspace,
The residual method for regularizing ill-posed problems,
Sharp support recovery from noisy random measurements by \(\ell_1\)-minimization,
On discrete \(\ell ^{1}\)-regularization,
Smoothed \(\ell_1\)-regularization-based line search for sparse signal recovery,
Criterion for signal extraction in underdetermined mixtures of bounded support,
The Lasso problem and uniqueness,
Limiting laws of coherence of random matrices with applications to testing covariance structure and construction of compressed sensing matrices,
New nonsmooth equations-based algorithms for \(\ell_1\)-norm minimization and applications,
Geometric separation in \(\mathbb{R}^3\),
A note on the complexity of \(L _{p }\) minimization,
Compressed sensing of color images,
An iterative support shrinking algorithm for non-Lipschitz optimization in image restoration,
Equivalence of minimal \(\ell _{0}\)- and \(\ell _{p }\)-norm solutions of linear equalities, inequalities and linear programs for sufficiently small \(p\),
Beyond sparsity: the role of \(L_{1}\)-optimizer in pattern classification,
Unified approach to coefficient-based regularized regression,
Nonmonotone adaptive Barzilai-Borwein gradient algorithm for compressed sensing,
Robust face recognition via block sparse Bayesian learning,
\(l_p\)-recovery of the most significant subspace among multiple subspaces with outliers,
Bregman iteration algorithm for sparse nonnegative matrix factorizations via alternating \(l_1\)-norm minimization,
An overview on the applications of matrix theory in wireless communications and signal processing,
Blind identification of fully observed linear time-varying systems via sparse recovery,
Learning semidefinite regularizers,
A domain decomposition Fourier continuation method for enhanced \(L_1\) regularization using sparsity of edges in reconstructing Fourier data,
Stable recovery of low-dimensional cones in Hilbert spaces: one RIP to rule them all,
A probabilistic learning algorithm for robust modeling using neural networks with random weights,
Dictionary evaluation and optimization for sparse coding based speech processing,
Sparse decomposition by iterating Lipschitzian-type mappings,
Reconstructed error and linear representation coefficients restricted by \(\ell_1\)-minimization for face recognition under different illumination and occlusion,
The noncooperative transportation problem and linear generalized Nash games,
An iteratively approximated gradient projection algorithm for sparse signal reconstruction,
Identification of switched linear systems via sparse optimization,
The generalized Lasso problem and uniqueness,
Restricted \(p\)-isometry property and its application for nonconvex compressive sensing,
A novel robust principal component analysis algorithm of nonconvex rank approximation,
Sparse approximate solution of partial differential equations,
Consistent tuning parameter selection in high dimensional sparse linear regression,
Some empirical advances in matrix completion,
Signal recovery by discrete approximation and a Prony-like method,
Complexity and applications of the homotopy principle for uniformly constrained sparse minimization,
Parameter selection for nonnegative $l_1$ matrix/tensor sparse decomposition,
Gelfand numbers related to structured sparsity and Besov space embeddings with small mixed smoothness,
Latent variable graphical model selection via convex optimization,
Consistencies and rates of convergence of jump-penalized least squares estimators,
Lasso-type recovery of sparse representations for high-dimensional data,
Analysis of basis pursuit via capacity sets,
An optimization problem arising in CR geometry,
Augmented sparse reconstruction of protein signaling networks,
Overcomplete blind source separation based on generalized Gaussian function and SL0 norm,
Efficient dictionary learning with sparseness-enforcing projections,
Image reconstruction for diffuse optical tomography based on radiative transfer equation,
A Laplacian approach to \(\ell_1\)-norm minimization,
Sparse recovery in convex hulls via entropy penalization,
Sampling from non-smooth distributions through Langevin diffusion,
Random sampling of sparse trigonometric polynomials. II: Orthogonal matching pursuit versus basis pursuit,
Random projections of smooth manifolds,
Robust sparse recovery via a novel convex model,
On a simple derivation of the complementary matching pursuit,
Information criteria bias correction for group selection,
Atmospheric radar imaging improvements using compressed sensing and MIMO,
The first-order necessary conditions for sparsity constrained optimization,
A regularized interior-point method for constrained linear least squares,
Gibbs phenomena for \(L^q\)-best approximation in finite element spaces,
Convergence and regularization results for optimal control problems with sparsity functional,
A general theory for subspace-sparse recovery,
Compressive Sensing with Cross-Validation and Stop-Sampling for Sparse Polynomial Chaos Expansions,
Efficient Blind Compressed Sensing Using Sparsifying Transforms with Convergence Guarantees and Application to Magnetic Resonance Imaging,
The asymptotic distribution and Berry-Esseen bound of a new test for independence in high dimension with an application to stochastic optimization,
Deep Learning--Based Dictionary Learning and Tomographic Image Reconstruction,
A component lasso,
Model Selection With Lasso-Zero: Adding Straw to the Haystack to Better Find Needles,
Fiber Orientation Distribution Estimation Using a Peaceman--Rachford Splitting Method,
Classifiers of support vector machine type with \(\ell_1\) complexity regularization,
Sparsity Constrained Estimation in Image Processing and Computer Vision,
A Continuous Exact $\ell_0$ Penalty (CEL0) for Least Squares Regularized Problem,
Extracting Sparse High-Dimensional Dynamics from Limited Data,
Nonlinear regularization techniques for seismic tomography,
Error bounds for compressed sensing algorithms with group sparsity: A unified approach,
A microscopic-view infection model based on linear systems,
An inexact continuation accelerated proximal gradient algorithm for low \(n\)-rank tensor recovery,
The degrees of freedom of partly smooth regularizers,
A Dai-Liao-type projection method for monotone nonlinear equations and signal processing,
Proximal linearization methods for Schatten \(p\)-quasi-norm minimization,
Adaptive spatial-spectral dictionary learning for hyperspectral image restoration,
Dynamic behavior analysis via structured rank minimization,
Flexible semi-supervised embedding based on adaptive loss regression: application to image categorization,
High-dimensional generalized linear models and the lasso,
A Unified View of Exact Continuous Penalties for $\ell_2$-$\ell_0$ Minimization,
Sparse estimation via lower-order penalty optimization methods in high-dimensional linear regression,
Statistical learning and selective inference,
A Barzilai-Borwein gradient projection method for sparse signal and blurred image restoration,
New insights on the optimality conditions of the \(\ell_2-\ell_0\) minimization problem,
Algorithms for Sparse Nonnegative Tucker Decompositions,
Compressive Sensing,
Sparse eigenbasis approximation: multiple feature extraction across spatiotemporal scales with application to coherent set identification,
Flavors of Compressive Sensing,
Inverse potential problems for divergence of measures with total variation regularization,
On a Scalable Entropic Breaching of the Overfitting Barrier for Small Data Problems in Machine Learning,
An unbiased approach to compressed sensing,
Concentration of \(S\)-largest mutilated vectors with \(\ell_p\)-quasinorm for \(0<p\leq 1\) and its applications,
Signal separation under coherent dictionaries and \(\ell_p\)-bounded noise,
A look at robustness and stability of \(\ell_1\)-versus \(\ell_0\)-regularization: discussion of papers by Bertsimas et al. and Hastie et al.,
Accelerating Sparse Recovery by Reducing Chatter,
Parameter estimation and signal reconstruction,
Consistency Analysis for Massively Inconsistent Datasets in Bound-to-Bound Data Collaboration,
Sparsity Promoting Hybrid Solvers for Hierarchical Bayesian Inverse Problems,
Leveraging Sparsity and Compressive Sensing for Reduced Order Modeling,
Microlocal Analysis of the Geometric Separation Problem,
Statistical challenges of high-dimensional data,
Gaussian averages of interpolated bodies and applications to approximate reconstruction,
The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).,
Rodeo: Sparse, greedy nonparametric regression,
Geometric separation by single-pass alternating thresholding,
Sparse recovery with coherent tight frames via analysis Dantzig selector and analysis LASSO,
Ways to sparse representation: An overview,
Data science, big data and statistics,
On convex envelopes and regularization of non-convex functionals without moving global minima,
Introduction,
A novel hierarchical fingerprint matching approach,
Sparse regularization for semi-supervised classification,
Regression based automatic face annotation for deformable model building,
A Regularized Correntropy Framework for Robust Pattern Recognition,
OperA: Operator-Based Annihilation for Finite-Rate-of-Innovation Signal Sampling,
A gradient descent based algorithm for \(\ell_p\) minimization,
Lazy lasso for local regression,
The stochastic properties of \(\ell^1\)-regularized spherical Gaussian fields,
Sparse reconstructions from few noisy data: analysis of hierarchical Bayesian models with generalized gamma hyperpriors,
Tensor Least Angle Regression for Sparse Representations of Multidimensional Signals,
Sharp oracle inequalities for low-complexity priors,
An inexact augmented Lagrangian method for computing strongly orthogonal decompositions of tensors,
An active set Barzilai-Borwein algorithm for \(l_0\) regularized optimization,
Performance of algebraic multigrid methods for non-symmetric matrices arising in particle methods,
Sparse Sensor Placement Optimization for Classification,
Critical behavior and universality classes for an algorithmic phase transition in sparse reconstruction,
Regularized ranking with convex losses and \(\ell^1\)-penalty,
Alternating direction and Taylor expansion minimization algorithms for unconstrained nuclear norm optimization,
Identification of sparse FIR systems using a general quantisation scheme,
On perturbed steepest descent methods with inexact line search for bilevel convex optimization,
Sharp sufficient conditions for stable recovery of block sparse signals by block orthogonal matching pursuit,
Iterative Hard-Thresholding Applied to Optimal Control Problems with $L^0(\Omega)$ Control Cost,
Necessary and Sufficient Conditions for Noiseless Sparse Recovery via Convex Quadratic Splines,
Orthogonal matched wavelets with vanishing moments: a sparsity design approach,
Outlier deletion based improvement on the StOMP algorithm for sparse solution of large-scale underdetermined problems,
Discussion: Latent variable graphical model selection via convex optimization,
Discussion: Latent variable graphical model selection via convex optimization,
Discussion: Latent variable graphical model selection via convex optimization,
Discussion: Latent variable graphical model selection via convex optimization,
Discussion: Latent variable graphical model selection via convex optimization,
Generalized cross validation in variable selection with and without shrinkage,
An accelerated majorization-minimization algorithm with convergence guarantee for non-Lipschitz wavelet synthesis model,
Splitting methods with variable metric for Kurdyka-Łojasiewicz functions and general convergence rates,
KERNEL METHODS FOR INDEPENDENCE MEASUREMENT WITH COEFFICIENT CONSTRAINTS,
Decoding from Pooled Data: Sharp Information-Theoretic Bounds,
Optimal \(D\)-RIP bounds in compressed sensing,
Improved combination of RPCA and MEL for sparse representation-based face recognition,
Low rank matrix recovery with adversarial sparse noise,
Sparse feedback stabilisation of linear delay systems by proximal gradient method,
Discussion on: "A Scale-Free Approach for False Discovery Rate Control in Generalized Linear Models" by Dai, Lin, Xing, Liu,
Error Correction for Discrete Tomography,
CECM: a continuous empirical cubature method with application to the dimensional hyperreduction of parameterized finite element models,
Linear inverse problems with Hessian-Schatten total variation,
DRIP: deep regularizers for inverse problems,
Compressive phase retrieval: Optimal sample complexity with deep generative priors,
Sparse basis pursuit for compliance minimization in the vanishing volume ratio limit,
An Algebraic Estimator for Large Spectral Density Matrices,
Second order self-adaptive dynamical system for sparse signal reconstruction and applications to image recovery,
Inducing sparsity via the horseshoe prior in imaging problems