Pages that link to "Item:Q1928276"
From MaRDI portal
The following pages link to The convex geometry of linear inverse problems (Q1928276):
Displaying 50 items.
- Convex optimization on Banach spaces (Q285434)
- A geometrical stability condition for compressed sensing (Q286166)
- Regularized linear system identification using atomic, nuclear and kernel-based norms: the role of the stability constraint (Q286265)
- Self-scaled bounds for atomic cone ranks: applications to nonnegative rank and cp-rank (Q304255)
- Geometric inference for general high-dimensional linear inverse problems (Q309721)
- Sharp MSE bounds for proximal denoising (Q330102)
- Low rank matrix recovery from rank one measurements (Q347516)
- Simple bounds for recovering low-complexity models (Q378116)
- Sharp recovery bounds for convex demixing, with applications (Q404302)
- Kernel methods in system identification, machine learning and function estimation: a survey (Q462325)
- A new perspective on least squares under convex constraint (Q482891)
- Greedy expansions in convex optimization (Q483395)
- On the solution uniqueness characterization in the \(\ell_1\) norm and polyhedral gauge recovery (Q511961)
- Decomposable norm minimization with proximal-gradient homotopy algorithm (Q513723)
- Subspace clustering by \((k,k)\)-sparse matrix factorization (Q524727)
- Tightness of the maximum likelihood semidefinite relaxation for angular synchronization (Q526833)
- Robust recovery of complex exponential signals from random Gaussian projections via low rank Hankel matrix reconstruction (Q739472)
- Generalization bounds for learning with linear, polygonal, quadratic and conic side knowledge (Q747246)
- Preserving injectivity under subgaussian mappings and its application to compressed sensing (Q778017)
- Optimizing optimization: accurate detection of hidden interactions in active body systems from noisy data (Q783431)
- Tuning complexity in regularized kernel-based regression and linear system identification: the robustness of the marginal likelihood estimator (Q895270)
- Analysis \(\ell_1\)-recovery with frames and Gaussian measurements (Q904283)
- From compression to compressed sensing (Q905909)
- A perturbation inequality for concave functions of singular values and its applications in low-rank matrix recovery (Q905912)
- Improved bounds for sparse recovery from subsampled random convolutions (Q1634177)
- Compressed sensing of data with a known distribution (Q1669057)
- The minimal measurement number for low-rank matrix recovery (Q1690711)
- Linear regression with sparsely permuted data (Q1711600)
- Optimal rates of statistical seriation (Q1715546)
- Learning semidefinite regularizers (Q1740575)
- Stable recovery of low-dimensional cones in Hilbert spaces: one RIP to rule them all (Q1748256)
- Sharp oracle inequalities for least squares estimators in shape restricted regression (Q1750286)
- System identification using kernel-based regularization: new insights on stability and consistency issues (Q1797024)
- A convex variational model for learning convolutional image atoms from incomplete data (Q1988356)
- Stable separation and super-resolution of mixture models (Q1990966)
- Energy on spheres and discreteness of minimizing measures (Q2020072)
- Riemannian gradient descent methods for graph-regularized matrix completion (Q2029849)
- Tensor theta norms and low rank recovery (Q2048814)
- Provably optimal sparse solutions to overdetermined linear systems with non-negativity constraints in a least-squares sense by implicit enumeration (Q2069147)
- Efficient proximal mapping computation for low-rank inducing norms (Q2073049)
- Super-resolution for doubly-dispersive channel estimation (Q2073137)
- On the robustness of minimum norm interpolators and regularized empirical risk minimizers (Q2091842)
- Proof methods for robust low-rank matrix recovery (Q2106469)
- Analysis of sparse recovery algorithms via the replica method (Q2106475)
- Hierarchical isometry properties of hierarchical measurements (Q2118397)
- Fundamental barriers to high-dimensional regression with convex penalties (Q2119224)
- Noisy tensor completion via the sum-of-squares hierarchy (Q2144539)
- The restricted isometry property of block diagonal matrices for group-sparse signal recovery (Q2155809)
- Biorthogonal greedy algorithms in convex optimization (Q2155817)
- Screening for a reweighted penalized conditional gradient method (Q2165597)