A new perspective on least squares under convex constraint
Publication: Q482891
DOI: 10.1214/14-AOS1254
zbMATH Open: 1302.62053
MaRDI QID: Q482891
Authors: Sourav Chatterjee
Publication date: 6 January 2015
Published in: The Annals of Statistics
Abstract: Consider the problem of estimating the mean of a Gaussian random vector when the mean vector is assumed to lie in a given convex set. The most natural solution is to take the Euclidean projection of the data vector onto this convex set; in other words, to perform "least squares under a convex constraint." Many problems in modern statistics and statistical signal processing theory are special cases of this general situation. Examples include the lasso and other high-dimensional regression techniques, function estimation problems, matrix estimation and completion, shape-restricted regression, constrained denoising, linear inverse problems, etc. This paper presents three general results about this problem, namely, (a) an exact computation of the main term in the estimation error by relating it to expected maxima of Gaussian processes (existing results only give upper bounds), (b) a theorem showing that the least squares estimator is always admissible up to a universal constant in any problem of the above kind and (c) a counterexample showing that the least squares estimator may not always be minimax rate-optimal. The result from part (a) is then used to compute the error of the least squares estimator in two examples of contemporary interest.
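The estimator discussed in the abstract is simply the Euclidean projection of the observed vector onto the constraint set. A minimal sketch, taking the monotone (nondecreasing) cone as the convex set so that the projection is isotonic regression, computed here with the pool-adjacent-violators algorithm; the function name `project_monotone` and the specific example are illustrative, not from the paper:

```python
import numpy as np

def project_monotone(y):
    """Euclidean projection of y onto the cone of nondecreasing vectors,
    computed by the pool-adjacent-violators algorithm (PAVA)."""
    sums, counts = [], []          # running blocks of pooled values
    for v in y:
        sums.append(float(v))
        counts.append(1)
        # merge adjacent blocks while their means violate monotonicity
        while len(sums) > 1 and sums[-2] / counts[-2] > sums[-1] / counts[-1]:
            sums[-2] += sums[-1]
            counts[-2] += counts[-1]
            sums.pop()
            counts.pop()
    out = []
    for s, c in zip(sums, counts):
        out.extend([s / c] * c)    # each block contributes its mean
    return np.array(out)

rng = np.random.default_rng(0)
theta = np.linspace(0.0, 2.0, 50)      # true mean, inside the monotone cone
y = theta + rng.normal(size=50)        # Gaussian observation y = theta + noise
theta_hat = project_monotone(y)        # least squares under the convex constraint
```

Because projection onto a convex set is a contraction toward every point of that set, `theta_hat` is never farther from `theta` than the raw data `y` is; the paper's results quantify how much closer it gets in terms of expected maxima of Gaussian processes.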
Full work available at URL: https://arxiv.org/abs/1402.0830
Recommendations
- On the risk of convex-constrained least squares estimators under misspecification
- On convex least squares estimation when the truth is linear
- A note on the approximate admissibility of regularized estimators in the Gaussian sequence model
- The least squares estimator of random variables under convex operators on \(L_{\mathcal{F}}^\infty (\mu)\) space
- Sharp oracle inequalities for least squares estimators in shape restricted regression
Mathematics Subject Classification
- Point estimation (62F10)
- Asymptotic properties of parametric estimators (62F12)
- Nonparametric regression and quantile regression (62G08)
- Parametric inference under constraints (62F30)
Cites Work
- scientific article; zbMATH DE number 5957408
- scientific article; zbMATH DE number 3122730
- scientific article; zbMATH DE number 3126936
- scientific article; zbMATH DE number 5654889
- scientific article; zbMATH DE number 3786012
- scientific article; zbMATH DE number 193111
- scientific article; zbMATH DE number 3560419
- scientific article; zbMATH DE number 739533
- scientific article; zbMATH DE number 845714
- scientific article; zbMATH DE number 3288992
- A new approach to least-squares estimation, with applications
- Adaptivity and optimality of the monotone least-squares estimator
- An Empirical Distribution Function for Sampling with Incomplete Information
- Asymptotic behavior of the Grenander estimator at density flat regions
- Asymptotic normality of statistics based on the convex minorants of empirical distribution functions
- Asymptotics for Lasso-type estimators.
- Computational and statistical tradeoffs via convex relaxation
- Concentration inequalities and model selection. Ecole d'Eté de Probabilités de Saint-Flour XXXIII -- 2003.
- Consistency for the least squares estimator in nonparametric regression
- Convergence of linear functionals of the Grenander estimator under misspecification
- Convergence of stochastic processes
- Corrupted Sensing: Novel Guarantees for Separating Structured Signals
- Decoding by Linear Programming
- Degrees of freedom in lasso problems
- Estimating a regression function
- Exponential screening and optimal rates of sparse estimation
- Extremal properties of half-spaces for spherically invariant measures
- For most large underdetermined systems of equations, the minimal \(\ell_1\)-norm near-solution approximates the sparsest near-solution
- From Steiner formulas for cones to concentration of intrinsic volumes
- Hellinger-consistency of certain nonparametric maximum likelihood estimators
- High-dimensional generalized linear models and the lasso
- High-dimensional graphs and variable selection with the Lasso
- Ideal spatial adaptation by wavelet shrinkage
- Lasso-type recovery of sparse representations for high-dimensional data
- Least angle regression. (With discussion)
- Living on the edge: phase transitions in convex programs with random data
- On risk bounds in isotonic and other shape restricted regression problems
- On sparse reconstruction from Fourier and Gaussian measurements
- On the ``degrees of freedom'' of the lasso
- On the degrees of freedom in shape-restricted regression.
- Optimally sparse representation in general (nonorthogonal) dictionaries via \(\ell_1\) minimization
- Persistence in high-dimensional linear predictor selection and the virtue of overparametrization
- Probability. Theory and examples.
- Rates of convergence for minimum contrast estimators
- Risk bounds in isotonic regression
- Sharp MSE bounds for proximal denoising
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Sharp asymptotics for isotonic regression
- Simultaneous analysis of Lasso and Dantzig selector
- Sparsity oracle inequalities for the Lasso
- Stable recovery of sparse overcomplete representations in the presence of noise
- Statistics for high-dimensional data. Methods, theory and applications.
- The Adaptive Lasso and Its Oracle Properties
- The Brunn-Minkowski inequality in Gauss space
- The Dantzig selector and sparsity oracle inequalities
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- The \(L_2\) risk of an isotonic estimate
- The Lasso, correlated design, and improved oracle inequalities
- The concentration of measure phenomenon
- The convex geometry of linear inverse problems
- The sizes of compact subsets of Hilbert space and continuity of Gaussian processes
- Uncertainty principles and ideal atomic decomposition
- Unified LASSO Estimation by Least Squares Approximation
- Weak convergence and empirical processes. With applications to statistics
- \(\ell _{1}\)-regularized linear regression: persistence and oracle inequalities
Cited In (39)
- Bayesian fractional posteriors
- Localized Gaussian width of \(M\)-convex hulls with applications to Lasso and convex aggregation
- Nonparametric shape-restricted regression
- Stratified incomplete local simplex tests for curvature of nonparametric multiple regression
- Suboptimality of constrained least squares and improvements via non-linear predictors
- High-dimensional asymptotics of likelihood ratio tests in the Gaussian sequence model under convex constraints
- Optimization landscape in the simplest constrained random least-square problem
- Adaptation in multivariate log-concave density estimation
- Finite sample performance of linear least squares estimation
- Inverse optimization with noisy data
- Oracle inequalities for high-dimensional prediction
- Slope heuristics and V-fold model selection in heteroscedastic regression using strongly localized bases
- Hypothesis testing for densities and high-dimensional multinomials: sharp local minimax rates
- Concentration behavior of the penalized least squares estimator
- Estimating piecewise monotone signals
- Isotonic regression in general dimensions
- On the constrained mock-Chebyshev least-squares
- An improved global risk bound in concave regression
- Noisy linear inverse problems under convex constraints: exact risk asymptotics in high dimensions
- Estimating a density, a hazard rate, and a transition intensity via the \(\rho\)-estimation method
- The phase transition for the existence of the maximum likelihood estimate in high-dimensional logistic regression
- Sharp MSE bounds for proximal denoising
- On the sensitivity of the Lasso to the number of predictor variables
- Set structured global empirical risk minimizers are rate optimal in general dimensions
- On concentration for (regularized) empirical risk minimization
- A note on the approximate admissibility of regularized estimators in the Gaussian sequence model
- Adaptive risk bounds in unimodal regression
- Discussion of ``On concentration for (regularized) empirical risk minimization'' by Sara van de Geer and Martin Wainwright
- Discussion of the paper ``On concentration for (regularized) empirical risk minimization''
- The least squares estimator of random variables under convex operators on \(L_{\mathcal{F}}^\infty (\mu)\) space
- The limiting behavior of isotonic and convex regression estimators when the model is misspecified
- Sharp oracle inequalities for least squares estimators in shape restricted regression
- From Gauss to Kolmogorov: localized measures of complexity for ellipses
- Estimation in high dimensions: a geometric perspective
- High-dimensional estimation with geometric constraints
- The geometry of hypothesis testing over convex cones: generalized likelihood ratio tests and minimax radii
- Isotonic regression meets Lasso
- Optimal rates of statistical seriation
- Estimation of Monge matrices