A new perspective on least squares under convex constraint
Publication: 482891
DOI: 10.1214/14-AOS1254
zbMath: 1302.62053
arXiv: 1402.0830
MaRDI QID: Q482891
Publication date: 6 January 2015
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1402.0830
Mathematics Subject Classification:
- Asymptotic properties of parametric estimators (62F12)
- Nonparametric regression and quantile regression (62G08)
- Point estimation (62F10)
- Parametric inference under constraints (62F30)
Related Items
- High-dimensional asymptotics of likelihood ratio tests in the Gaussian sequence model under convex constraints
- An improved global risk bound in concave regression
- Adaptation in multivariate log-concave density estimation
- The phase transition for the existence of the maximum likelihood estimate in high-dimensional logistic regression
- Slope heuristics and V-Fold model selection in heteroscedastic regression using strongly localized bases
- Estimating piecewise monotone signals
- The limiting behavior of isotonic and convex regression estimators when the model is misspecified
- On concentration for (regularized) empirical risk minimization
- Discussion of "On concentration for (regularized) empirical risk minimization" by Sara van de Geer and Martin Wainwright
- Discussion of the paper "On concentration for (regularized) empirical risk minimization"
- Concentration behavior of the penalized least squares estimator
- Noisy linear inverse problems under convex constraints: exact risk asymptotics in high dimensions
- From Gauss to Kolmogorov: localized measures of complexity for ellipses
- High-dimensional estimation with geometric constraints
- Estimation of Monge matrices
- Adaptive risk bounds in unimodal regression
- Optimal rates of statistical seriation
- Nonparametric shape-restricted regression
- Bayesian fractional posteriors
- The geometry of hypothesis testing over convex cones: generalized likelihood ratio tests and minimax radii
- Isotonic regression meets Lasso
- Oracle inequalities for high-dimensional prediction
- Sharp oracle inequalities for least squares estimators in shape restricted regression
- On the sensitivity of the Lasso to the number of predictor variables
- Estimation in High Dimensions: A Geometric Perspective
- Estimating a density, a hazard rate, and a transition intensity via the \(\rho\)-estimation method
- Inverse Optimization with Noisy Data
- Set structured global empirical risk minimizers are rate optimal in general dimensions
- Hypothesis testing for densities and high-dimensional multinomials: sharp local minimax rates
- Localized Gaussian width of \(M\)-convex hulls with applications to Lasso and convex aggregation
- Isotonic regression in general dimensions
- Stratified incomplete local simplex tests for curvature of nonparametric multiple regression
- Suboptimality of constrained least squares and improvements via non-linear predictors
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Sharp MSE bounds for proximal denoising
- Degrees of freedom in lasso problems
- Adaptivity and optimality of the monotone least-squares estimator
- Statistics for high-dimensional data. Methods, theory and applications.
- Exponential screening and optimal rates of sparse estimation
- A new approach to least-squares estimation, with applications
- The Dantzig selector and sparsity oracle inequalities
- From Steiner formulas for cones to concentration of intrinsic volumes
- Concentration inequalities and model selection. Ecole d'Eté de Probabilités de Saint-Flour XXXIII -- 2003.
- Estimating a regression function
- Lasso-type recovery of sparse representations for high-dimensional data
- Asymptotic normality of statistics based on the convex minorants of empirical distribution functions
- The Brunn-Minkowski inequality in Gauss space
- Extremal properties of half-spaces for spherically invariant measures
- Rates of convergence for minimum contrast estimators
- Consistency for the least squares estimator in nonparametric regression
- Sharp asymptotics for isotonic regression
- Persistence in high-dimensional linear predictor selection and the virtue of overparametrization
- On the degrees of freedom in shape-restricted regression.
- Asymptotics for Lasso-type estimators.
- Risk bounds in isotonic regression
- Least angle regression. (With discussion)
- Weak convergence and empirical processes. With applications to statistics
- The convex geometry of linear inverse problems
- \(\ell _{1}\)-regularized linear regression: persistence and oracle inequalities
- Convergence of linear functionals of the Grenander estimator under misspecification
- Hellinger-consistency of certain nonparametric maximum likelihood estimators
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- Sparsity oracle inequalities for the Lasso
- On the "degrees of freedom" of the lasso
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- High-dimensional graphs and variable selection with the Lasso
- On risk bounds in isotonic and other shape restricted regression problems
- The sizes of compact subsets of Hilbert space and continuity of Gaussian processes
- An Empirical Distribution Function for Sampling with Incomplete Information
- On sparse reconstruction from Fourier and Gaussian measurements
- Decoding by Linear Programming
- Stable recovery of sparse overcomplete representations in the presence of noise
- Unified LASSO Estimation by Least Squares Approximation
- Ideal spatial adaptation by wavelet shrinkage
- Uncertainty principles and ideal atomic decomposition
- Asymptotic behavior of the Grenander estimator at density flat regions
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Computational and statistical tradeoffs via convex relaxation
- Living on the edge: phase transitions in convex programs with random data
- The \(L_2\) risk of an isotonic estimate
- Corrupted Sensing: Novel Guarantees for Separating Structured Signals
- Optimally sparse representation in general (nonorthogonal) dictionaries via \(\ell_1\) minimization
- For most large underdetermined systems of equations, the minimal \(\ell_1\)-norm near-solution approximates the sparsest near-solution
- The Lasso, correlated design, and improved oracle inequalities
- Probability
- Convergence of stochastic processes