Pages that link to "Item:Q834335"
The following pages link to Near-ideal model selection by \(\ell _{1}\) minimization (Q834335):
Displaying 50 items.
- Best subset selection via a modern optimization lens (Q282479)
- SLOPE is adaptive to unknown sparsity and asymptotically minimax (Q292875)
- Conjugate gradient acceleration of iteratively re-weighted least squares methods (Q316180)
- Deterministic convolutional compressed sensing matrices (Q324274)
- Non-negative least squares for high-dimensional linear models: consistency and sparse recovery without regularization (Q391843)
- Phase transition in limiting distributions of coherence of high-dimensional random matrices (Q413738)
- Sharp support recovery from noisy random measurements by \(\ell_1\)-minimization (Q427066)
- Two are better than one: fundamental parameters of frame coherence (Q427070)
- Invertibility of random submatrices via tail-decoupling and a matrix Chernoff inequality (Q449436)
- UPS delivers optimal phase diagram in high-dimensional variable selection (Q450021)
- Covariate assisted screening and estimation (Q482879)
- Normalized and standard Dantzig estimators: two approaches (Q491397)
- Reconstructing DNA copy number by penalized estimation and imputation (Q542936)
- Compressed sensing with coherent and redundant dictionaries (Q544040)
- \(\ell_{1}\)-penalization for mixture regression models (Q619141)
- Adaptive Dantzig density estimation (Q629798)
- Limiting laws of coherence of random matrices with applications to testing covariance structure and construction of compressed sensing matrices (Q638800)
- A necessary and sufficient condition for exact sparse recovery by \(\ell_1\) minimization (Q664941)
- Analysis of sparse MIMO radar (Q741258)
- Group sparse optimization for learning predictive state representations (Q778374)
- Controlling the false discovery rate via knockoffs (Q888503)
- Optimal dual certificates for noise robustness bounds in compressive sensing (Q892815)
- High-dimensional Gaussian model selection on a Gaussian design (Q985331)
- LOL selection in high dimension (Q1621355)
- On the sensitivity of the Lasso to the number of predictor variables (Q1790389)
- Space alternating penalized Kullback proximal point algorithms for maximizing likelihood with nondifferentiable penalty (Q1926006)
- \(\ell _{1}\)-regularized linear regression: persistence and oracle inequalities (Q1930861)
- Compressed sensing and matrix completion with constant proportion of corruptions (Q1939501)
- Minimax risks for sparse regressions: ultra-high dimensional phenomenons (Q1950804)
- The Lasso problem and uniqueness (Q1951165)
- Honest variable selection in linear and logistic regression models via \(\ell _{1}\) and \(\ell _{1}+\ell _{2}\) penalization (Q1951794)
- On the conditions used to prove oracle results for the Lasso (Q1952029)
- The adaptive and the thresholded Lasso for potentially misspecified models (and a lower bound for the Lasso) (Q1952206)
- The generalized Lasso problem and uniqueness (Q2002568)
- Sampling from non-smooth distributions through Langevin diffusion (Q2065460)
- Inadequacy of linear methods for minimal sensor placement and feature selection in nonlinear systems: a new approach using secants (Q2163754)
- Adaptive decomposition-based evolutionary approach for multiobjective sparse reconstruction (Q2198239)
- Statistical analysis of sparse approximate factor models (Q2199708)
- A significance test for the lasso (Q2249837)
- Discussion: ``A significance test for the lasso'' (Q2249838)
- Rejoinder: ``A significance test for the lasso'' (Q2249839)
- Pivotal estimation via square-root lasso in nonparametric regression (Q2249850)
- A global homogeneity test for high-dimensional linear regression (Q2263711)
- Prediction error bounds for linear regression with the TREX (Q2273161)
- Sharp oracle inequalities for low-complexity priors (Q2304249)
- Localized Gaussian width of \(M\)-convex hulls with applications to Lasso and convex aggregation (Q2325349)
- Refined analysis of sparse MIMO radar (Q2360571)
- High-dimensional change-point estimation: combining filtering with convex optimization (Q2397167)
- Error bounds for compressed sensing algorithms with group sparsity: a unified approach (Q2399645)
- The degrees of freedom of partly smooth regularizers (Q2409395)