\(\ell _{1}\)-regularized linear regression: persistence and oracle inequalities
From MaRDI portal
Publication: 1930861
DOI: 10.1007/s00440-011-0367-2
zbMath: 1395.62207
OpenAlex: W2114529990
Wikidata: Q105583438 (Scholia: Q105583438)
MaRDI QID: Q1930861
Shahar Mendelson, Joseph Neeman, Peter L. Bartlett
Publication date: 14 January 2013
Published in: Probability Theory and Related Fields
Full work available at URL: https://doi.org/10.1007/s00440-011-0367-2
Nonparametric regression and quantile regression (62G08)
Ridge regression; shrinkage estimators (Lasso) (62J07)
Asymptotic properties of nonparametric inference (62G20)
Related Items
- Greedy algorithms for prediction
- Oracle inequalities for the Lasso in the high-dimensional Aalen multiplicative intensity model
- Regularization in kernel learning
- Empirical processes with a bounded \(\psi_1\) diameter
- Non-negative least squares for high-dimensional linear models: consistency and sparse recovery without regularization
- Sample average approximation with heavier tails. I: Non-asymptotic bounds with weak assumptions and stochastic constraints
- Sample average approximation with heavier tails II: localization in stochastic convex optimization and persistence results for the Lasso
- Non-asymptotic oracle inequalities for the Lasso and Group Lasso in high dimensional logistic model
- On the optimality of the empirical risk minimization procedure for the convex aggregation problem
- Robust regression using biased objectives
- Empirical risk minimization is optimal for the convex aggregation problem
- The Lasso as an \(\ell _{1}\)-ball model selection procedure
- Kullback-Leibler aggregation and misspecified generalized linear models
- General nonexact oracle inequalities for classes with a subexponential envelope
- On the uniform convergence of empirical norms and inner products, with application to causal inference
- A new perspective on least squares under convex constraint
- \(L_1\)-penalization in functional linear regression with subgaussian design
- Learning without Concentration
- Concentration Inequalities for Statistical Inference
- Oracle inequalities for weighted group Lasso in high-dimensional misspecified Cox models
- Regularization and the small-ball method II: complexity dependent error rates
- Localized Gaussian width of \(M\)-convex hulls with applications to Lasso and convex aggregation
Cites Work
- General nonexact oracle inequalities for classes with a subexponential envelope
- Regularity of Gaussian processes
- Some limit theorems for empirical processes (with discussion)
- Some sharp performance bounds for least squares regression with \(L_1\) regularization
- Near-ideal model selection by \(\ell _{1}\) minimization
- Sparsity in penalized empirical risk minimization
- Regularization in kernel learning
- Concentration of mass on convex bodies
- Best subset selection, persistence in high-dimensional statistical learning and optimization under \(l_1\) constraint
- Subspaces and orthogonal decompositions generated by bounded orthogonal systems
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Majorizing measures and proportional subsets of bounded orthonormal systems
- Lasso-type recovery of sparse representations for high-dimensional data
- Inequalities of Bernstein-Jackson-type and the degree of compactness of operators in Banach spaces
- Global versus local asymptotic theories of finite-dimensional normed spaces
- Persistence in high-dimensional linear predictor selection and the virtue of overparametrization
- Statistical learning theory and stochastic optimization. École d'Été de Probabilités de Saint-Flour XXXI -- 2001.
- Weak convergence and empirical processes. With applications to statistics
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- Sparsity oracle inequalities for the Lasso
- Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators
- Gaussian averages of interpolated bodies and applications to approximate reconstruction
- Empirical minimization
- High-dimensional graphs and variable selection with the Lasso
- Stable recovery of sparse overcomplete representations in the presence of noise
- Fast rates for estimation error and oracle inequalities for model selection
- Ideal spatial adaptation by wavelet shrinkage
- The Generic Chaining
- Improving the sample complexity using global data
- DOI: 10.1162/1532443041424337
- Learning Theory and Kernel Machines
- Aggregation and sparsity via \(\ell_1\) penalized least squares
- Probability Inequalities for Sums of Bounded Random Variables
- Convexity, Classification, and Risk Bounds