Sharp Oracle Inequalities for Square Root Regularization
From MaRDI portal
Publication: 4636972
zbMath: 1441.62188 · arXiv: 1509.04093 · MaRDI QID: Q4636972
Benjamin Stucky, Sara van de Geer
Publication date: 17 April 2018
Full work available at URL: https://arxiv.org/abs/1509.04093
Ridge regression; shrinkage estimators (Lasso) (62J07) · Linear regression; mixed models (62J05) · Inequalities; stochastic orderings (60E15)
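For context on the paper's topic: the square-root Lasso (introduced in "Square-root lasso: pivotal recovery of sparse signals via conic programming", cited below) minimizes the root mean squared residual plus an ℓ1 penalty. Because that objective is homogeneous of degree 1 in the response and the coefficients, the tuning parameter can be chosen without knowing the noise level, which is the "pivotal" property the square-root regularization literature exploits. The sketch below only illustrates this scaling identity; the design matrix, coefficients, and λ are arbitrary made-up values, not taken from the paper.

```python
import math
import random

def sqrt_lasso_objective(X, y, beta, lam):
    """Square-root Lasso objective: ||y - X beta||_2 / sqrt(n) + lam * ||beta||_1.

    Unlike the ordinary Lasso, the fit term is the root mean squared
    residual, so scaling y and beta jointly by c scales the whole
    objective by c. The minimizer therefore scales with the data and
    lam needs no adjustment to the (unknown) noise level.
    """
    n = len(y)
    residual_sq = 0.0
    for i in range(n):
        pred = sum(X[i][j] * beta[j] for j in range(len(beta)))
        residual_sq += (y[i] - pred) ** 2
    return math.sqrt(residual_sq / n) + lam * sum(abs(b) for b in beta)

# Arbitrary small example: sparse coefficients plus Gaussian noise.
random.seed(0)
n, p = 20, 5
X = [[random.gauss(0, 1) for _ in range(p)] for _ in range(n)]
beta = [1.0, -2.0, 0.0, 0.0, 0.5]
y = [sum(X[i][j] * beta[j] for j in range(p)) + random.gauss(0, 0.1)
     for i in range(n)]

lam, c = 0.1, 3.0
f = sqrt_lasso_objective(X, y, beta, lam)
f_scaled = sqrt_lasso_objective(X, [c * yi for yi in y],
                                [c * b for b in beta], lam)
print(abs(f_scaled - c * f) < 1e-9)  # degree-1 homogeneity of the objective
```

The ordinary Lasso lacks this property: its squared-loss term scales as c² while the penalty scales as c, so its optimal λ must track the noise scale σ.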
Related Items
- Group sparse recovery via group square-root elastic net and the iterative multivariate thresholding-based algorithm
- A dual semismooth Newton based augmented Lagrangian method for large-scale linearly constrained sparse group square-root Lasso problems
- Correcting for unknown errors in sparse high-dimensional function approximation
- A two-stage regularization method for variable selection and forecasting in high-order interaction model
- Oracle Inequalities for Local and Global Empirical Risk Minimizers
- Improved bounds for square-root Lasso and square-root slope
Cites Work
- Estimation and testing under sparsity. École d'Été de Probabilités de Saint-Flour XLV -- 2015
- SLOPE is adaptive to unknown sparsity and asymptotically minimax
- Statistics for high-dimensional data. Methods, theory and applications.
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008.
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- SLOPE-adaptive variable selection via convex optimization
- Adaptive estimation of a quadratic functional by model selection.
- Regularizers for structured sparsity
- Pivotal estimation via square-root lasso in nonparametric regression
- Simultaneous analysis of Lasso and Dantzig selector
- Noisy low-rank matrix completion with general sampling distribution
- Aggregation for Gaussian regression
- Optimization with Sparsity-Inducing Penalties
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Scaled sparse linear regression
- Aggregation and Sparsity Via ℓ1 Penalized Least Squares
- The Group Square-Root Lasso: Theoretical Properties and Fast Algorithms
- Weakly decomposable regularization penalties and structured sparsity
- The Lasso, correlated design, and improved oracle inequalities
- Convex analysis and monotone operator theory in Hilbert spaces