A new perspective on least squares under convex constraint
Publication: Q482891
DOI: 10.1214/14-AOS1254 · zbMATH Open: 1302.62053 · arXiv: 1402.0830 · MaRDI QID: Q482891
Authors: Sourav Chatterjee
Publication date: 6 January 2015
Published in: The Annals of Statistics
Abstract: Consider the problem of estimating the mean of a Gaussian random vector when the mean vector is assumed to be in a given convex set. The most natural solution is to take the Euclidean projection of the data vector onto this convex set; in other words, performing "least squares under a convex constraint." Many problems in modern statistics and statistical signal processing theory are special cases of this general situation. Examples include the lasso and other high-dimensional regression techniques, function estimation problems, matrix estimation and completion, shape-restricted regression, constrained denoising, linear inverse problems, etc. This paper presents three general results about this problem, namely, (a) an exact computation of the main term in the estimation error by relating it to expected maxima of Gaussian processes (existing results only give upper bounds), (b) a theorem showing that the least squares estimator is always admissible up to a universal constant in any problem of the above kind, and (c) a counterexample showing that the least squares estimator may not always be minimax rate-optimal. The result from part (a) is then used to compute the error of the least squares estimator in two examples of contemporary interest.
Full work available at URL: https://arxiv.org/abs/1402.0830
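The abstract's setup can be written compactly. With data \(Y = \theta^* + Z\), where \(Z\) is a standard Gaussian vector in \(\mathbb{R}^n\) and the mean \(\theta^*\) lies in a closed convex set \(K \subseteq \mathbb{R}^n\), the least squares estimator is the Euclidean projection
\[ \hat{\theta} = \operatorname*{argmin}_{\theta \in K} \|Y - \theta\|^2 . \]
Result (a), up to details best checked against the paper itself, locates the estimation error at the maximizer \(t_*\) of the concave function
\[ f_{\theta^*}(t) = \mathbb{E}\Big[ \sup_{\theta \in K,\; \|\theta - \theta^*\| \le t} \langle Z, \theta - \theta^* \rangle \Big] - \frac{t^2}{2}, \qquad t \ge 0, \]
in the sense that \(\|\hat{\theta} - \theta^*\|\) concentrates around \(t_*\); as the abstract notes, existing results give only upper bounds of this type.

As a minimal, self-contained illustration (a sketch under the assumptions above, not code from the paper), the following Python snippet carries out the projection for one of the examples named in the abstract, shape-restricted regression: it projects a noisy observation onto the monotone cone \(K = \{\theta : \theta_1 \le \cdots \le \theta_n\}\) via the pool-adjacent-violators algorithm and reports the per-coordinate risk.

import numpy as np

def project_monotone_cone(y):
    """Euclidean projection of y onto {theta : theta_1 <= ... <= theta_n},
    computed by the pool-adjacent-violators algorithm (PAVA)."""
    means, sizes = [], []  # current blocks: (block mean, block size)
    for v in y:
        means.append(float(v))
        sizes.append(1)
        # Merge adjacent blocks while the monotonicity constraint is violated.
        while len(means) > 1 and means[-2] > means[-1]:
            m2, s2 = means.pop(), sizes.pop()
            m1, s1 = means.pop(), sizes.pop()
            means.append((m1 * s1 + m2 * s2) / (s1 + s2))
            sizes.append(s1 + s2)
    return np.concatenate([np.full(s, m) for m, s in zip(means, sizes)])

rng = np.random.default_rng(0)
n = 500
theta_star = np.sort(rng.uniform(0.0, 5.0, size=n))  # true monotone mean
y = theta_star + rng.standard_normal(n)              # Y = theta* + Z
theta_hat = project_monotone_cone(y)                 # LSE over the cone
print("per-coordinate risk:", np.mean((theta_hat - theta_star) ** 2))

For the monotone cone the per-coordinate risk is known to decay at roughly the \(n^{-2/3}\) rate, which is exactly the kind of main term that result (a) pins down.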
Recommendations
- On the risk of convex-constrained least squares estimators under misspecification
- On convex least squares estimation when the truth is linear
- A note on the approximate admissibility of regularized estimators in the Gaussian sequence model
- The least squares estimator of random variables under convex operators on \(L_{\mathcal{F}}^\infty (\mu)\) space
- Sharp oracle inequalities for least squares estimators in shape restricted regression
Mathematics Subject Classification:
- Point estimation (62F10)
- Asymptotic properties of parametric estimators (62F12)
- Nonparametric regression and quantile regression (62G08)
- Parametric inference under constraints (62F30)
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Least angle regression. (With discussion)
- Weak convergence and empirical processes. With applications to statistics
- Ideal spatial adaptation by wavelet shrinkage
- Statistics for high-dimensional data. Methods, theory and applications.
- Lasso-type recovery of sparse representations for high-dimensional data
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- High-dimensional graphs and variable selection with the Lasso
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using \(\ell_1\)-Constrained Quadratic Programming (Lasso)
- Persistence in high-dimensional linear predictor selection and the virtue of overparametrization
- On the degrees of freedom in shape-restricted regression.
- Asymptotics for Lasso-type estimators.
- Sparsity oracle inequalities for the Lasso
- Unified LASSO Estimation by Least Squares Approximation
- Degrees of freedom in lasso problems
- Optimally sparse representation in general (nonorthogonal) dictionaries via \(\ell_1\) minimization
- Convergence of stochastic processes
- An Empirical Distribution Function for Sampling with Incomplete Information
- Decoding by Linear Programming
- Stable recovery of sparse overcomplete representations in the presence of noise
- Probability. Theory and examples.
- On the ``degrees of freedom'' of the lasso
- The concentration of measure phenomenon
- The Dantzig selector and sparsity oracle inequalities
- Concentration inequalities and model selection. École d'Été de Probabilités de Saint-Flour XXXIII -- 2003.
- Uncertainty principles and ideal atomic decomposition
- The Brunn-Minkowski inequality in Gauss space
- \(\ell _{1}\)-regularized linear regression: persistence and oracle inequalities
- Exponential screening and optimal rates of sparse estimation
- On sparse reconstruction from Fourier and Gaussian measurements
- The convex geometry of linear inverse problems
- Living on the edge: phase transitions in convex programs with random data
- Rates of convergence for minimum contrast estimators
- Hellinger-consistency of certain nonparametric maximum likelihood estimators
- Risk bounds in isotonic regression
- On risk bounds in isotonic and other shape restricted regression problems
- The sizes of compact subsets of Hilbert space and continuity of Gaussian processes
- Estimating a regression function
- From Steiner formulas for cones to concentration of intrinsic volumes
- Sharp MSE bounds for proximal denoising
- Computational and statistical tradeoffs via convex relaxation
- Corrupted Sensing: Novel Guarantees for Separating Structured Signals
- The Lasso, correlated design, and improved oracle inequalities
- For most large underdetermined systems of equations, the minimal \(\ell_1\)-norm near-solution approximates the sparsest near-solution
- Extremal properties of half-spaces for spherically invariant measures
- Consistency for the least squares estimator in nonparametric regression
- Sharp asymptotics for isotonic regression
- Adaptivity and optimality of the monotone least-squares estimator
- Asymptotic normality of statistics based on the convex minorants of empirical distribution functions
- Convergence of linear functionals of the Grenander estimator under misspecification
- Asymptotic behavior of the Grenander estimator at density flat regions
- The \(L_2\) risk of an isotonic estimate
- A new approach to least-squares estimation, with applications
Cited In (34)
- Bayesian fractional posteriors
- Localized Gaussian width of \(M\)-convex hulls with applications to Lasso and convex aggregation
- Nonparametric shape-restricted regression
- Stratified incomplete local simplex tests for curvature of nonparametric multiple regression
- Suboptimality of constrained least squares and improvements via non-linear predictors
- Estimation in High Dimensions: A Geometric Perspective
- High-dimensional asymptotics of likelihood ratio tests in the Gaussian sequence model under convex constraints
- Adaptation in multivariate log-concave density estimation
- Oracle inequalities for high-dimensional prediction
- Hypothesis testing for densities and high-dimensional multinomials: sharp local minimax rates
- Concentration behavior of the penalized least squares estimator
- Estimating piecewise monotone signals
- Isotonic regression in general dimensions
- On the constrained mock-Chebyshev least-squares
- An improved global risk bound in concave regression
- High-dimensional estimation with geometric constraints
- Noisy linear inverse problems under convex constraints: exact risk asymptotics in high dimensions
- Estimating a density, a hazard rate, and a transition intensity via the \(\rho\)-estimation method
- The phase transition for the existence of the maximum likelihood estimate in high-dimensional logistic regression
- On the sensitivity of the Lasso to the number of predictor variables
- Set structured global empirical risk minimizers are rate optimal in general dimensions
- On concentration for (regularized) empirical risk minimization
- Adaptive risk bounds in unimodal regression
- Discussion of ``On concentration for (regularized) empirical risk minimization'' by Sara van de Geer and Martin Wainwright
- Discussion of the paper ``On concentration for (regularized) empirical risk minimization''
- Slope heuristics and V-Fold model selection in heteroscedastic regression using strongly localized bases
- The limiting behavior of isotonic and convex regression estimators when the model is misspecified
- Inverse Optimization with Noisy Data
- Sharp oracle inequalities for least squares estimators in shape restricted regression
- From Gauss to Kolmogorov: localized measures of complexity for ellipses
- The geometry of hypothesis testing over convex cones: generalized likelihood ratio tests and minimax radii
- Isotonic regression meets Lasso
- Optimal rates of statistical seriation
- Estimation of Monge matrices