Non-asymptotic bounds for the \(\ell_{\infty}\) estimator in linear regression with uniform noise
Publication: 6178575
DOI: 10.3150/23-BEJ1607
arXiv: 2108.07630
OpenAlex: W4388513456
MaRDI QID: Q6178575
FDO: Q6178575
Authors: Yufei Yi, Matey Neykov
Publication date: 16 January 2024
Published in: Bernoulli
Abstract: The Chebyshev or \(\ell_{\infty}\) estimator is an unconventional alternative to ordinary least squares for solving linear regressions. It is defined as the minimizer of the objective function \begin{align*} \hat{\boldsymbol{\beta}} := \operatorname{argmin}_{\boldsymbol{\beta}} \|\boldsymbol{Y} - \mathbf{X}\boldsymbol{\beta}\|_{\infty}. \end{align*} The asymptotic distribution of the Chebyshev estimator under a fixed number of covariates was recently studied (Knight, 2020), yet finite sample guarantees and generalizations to high-dimensional settings remain open. In this paper, we develop non-asymptotic upper bounds on the estimation error for a Chebyshev estimator \(\hat{\boldsymbol{\beta}}\), in a regression setting with uniformly distributed noise \(\varepsilon_i \sim U[-a, a]\) where \(a\) is either known or unknown. With relatively mild assumptions on the (random) design matrix \(\mathbf{X}\), we can bound the error rate by \(\frac{C_p}{n}\) with high probability, for some constant \(C_p\) depending on the dimension \(p\) and the law of the design. Furthermore, we illustrate that there exist designs for which the Chebyshev estimator is (nearly) minimax optimal. On the other hand, we also argue that there exist designs for which this estimator behaves sub-optimally in terms of the constant \(C_p\)'s dependence on \(p\). In addition, we show that "Chebyshev's LASSO" has advantages over the regular LASSO in high-dimensional situations, provided that the noise is uniform. Specifically, we argue that it achieves a much faster rate of estimation under certain assumptions on the growth rate of the sparsity level and the ambient dimension with respect to the sample size.
Full work available at URL: https://arxiv.org/abs/2108.07630
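The \(\ell_{\infty}\) minimization in the abstract can be computed exactly as a linear program: introduce a slack variable \(t\) and minimize \(t\) subject to \(-t \le Y_i - \mathbf{x}_i^\top \boldsymbol{\beta} \le t\) for all \(i\). The sketch below illustrates this standard reformulation; it is not the authors' code, and the function name chebyshev_estimator and the use of SciPy's linprog are illustrative assumptions.

    import numpy as np
    from scipy.optimize import linprog

    def chebyshev_estimator(X, y):
        """Chebyshev / ell-infinity estimator (illustrative sketch).

        Solves argmin_beta ||y - X beta||_inf as an LP over z = (beta, t):
        minimize t subject to  X beta - t <= y  and  -X beta - t <= -y.
        """
        n, p = X.shape
        # Objective: minimize the slack t (last coordinate of z).
        c = np.zeros(p + 1)
        c[-1] = 1.0
        ones = np.ones((n, 1))
        # |y_i - x_i^T beta| <= t, written as two one-sided constraints.
        A_ub = np.vstack([np.hstack([X, -ones]),    # x_i^T beta - t <= y_i
                          np.hstack([-X, -ones])])  # -x_i^T beta - t <= -y_i
        b_ub = np.concatenate([y, -y])
        bounds = [(None, None)] * p + [(0, None)]   # beta free, t >= 0
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
        return res.x[:p], res.x[-1]  # (beta_hat, ||y - X beta_hat||_inf)

    # Usage on simulated data with uniform noise, the setting of the paper:
    rng = np.random.default_rng(0)
    n, p, a = 200, 5, 0.5
    X = rng.standard_normal((n, p))
    beta_star = rng.standard_normal(p)
    y = X @ beta_star + rng.uniform(-a, a, size=n)
    beta_hat, resid_inf = chebyshev_estimator(X, y)

At the optimum, \(t = \|\boldsymbol{Y} - \mathbf{X}\hat{\boldsymbol{\beta}}\|_{\infty}\), which under the paper's uniform-noise model also serves as a natural estimate of the noise half-width \(a\) when it is unknown.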
Cites Work
- Simultaneous analysis of Lasso and Dantzig selector
- High-Dimensional Statistics
- Using the least squares estimator in Chebyshev estimation
- Absolute continuity for some one-dimensional processes
- Lower bounds on the smallest eigenvalue of a sample covariance matrix
- Bounding the Smallest Singular Value of a Random Matrix Without Concentration
- Covariance estimation for distributions with \({2+\varepsilon}\) moments
- Spectral norm of products of random and deterministic matrices
- Non-asymptotic theory of random matrices: extreme singular values
- Rate of Convergence of Lawson's Algorithm
- On L1 and Chebyshev estimation
- Estimating the parameters in regression with uniformly distributed errors
- Empirical processes with a bounded \(\psi_1\) diameter
- Coherent risk measures and good-deal bounds
- Regularization in Regression with Bounded Noise: A Chebyshev Center Approach
- Estimation theory and uncertainty intervals evaluation in presence of unknown but bounded errors: Linear families of models and estimators
- Optimal asymptotic identification under bounded disturbances
- Title not available
- Approximating the centroid is hard
- Two Linear Programming Algorithms for Unbiased Estimation of Linear Models
- On the choice of norms in system identification
- On lower bounds for tail probabilities
- Arbitrage bounds for the term structure of interest rates
- Least absolute value and Chebychev estimation utilizing least squares results
- A dual method for discrete Chebychev curve fitting
- Combined regression models
- Universal inference
- Exact minimax risk for linear least squares, and the lower tail of sample covariance matrices
- On the geometry of polytopes generated by heavy-tailed random vectors
- Minimum risk equivariant estimator in linear regression model