Non-asymptotic bounds for the \(\ell_{\infty}\) estimator in linear regression with uniform noise
Publication:6178575
Abstract: The Chebyshev or \(\ell_{\infty}\) estimator is an unconventional alternative to ordinary least squares for solving linear regressions. It is defined as the minimizer of the \(\ell_{\infty}\) norm of the residuals, \begin{align*} \hat{\boldsymbol{\beta}} := \operatorname{argmin}_{\boldsymbol{\beta}} \|\boldsymbol{Y} - \mathbf{X}\boldsymbol{\beta}\|_{\infty}. \end{align*} The asymptotic distribution of the Chebyshev estimator under a fixed number of covariates was recently studied (Knight, 2020), yet finite-sample guarantees and generalizations to high-dimensional settings remain open. In this paper, we develop non-asymptotic upper bounds on the estimation error \(\|\hat{\boldsymbol{\beta}} - \boldsymbol{\beta}^*\|_2\) for the Chebyshev estimator \(\hat{\boldsymbol{\beta}}\), in a regression setting with uniformly distributed noise where the noise level \(a\) is either known or unknown. With relatively mild assumptions on the (random) design matrix \(\mathbf{X}\), we can bound the error rate by \(C_p/n\) with high probability, for some constant \(C_p\) depending on the dimension \(p\) and the law of the design. Furthermore, we illustrate that there exist designs for which the Chebyshev estimator is (nearly) minimax optimal. On the other hand, we also argue that there exist designs for which this estimator behaves sub-optimally in terms of the constant \(C_p\)'s dependence on \(p\). In addition, we show that "Chebyshev's LASSO" has advantages over the regular LASSO in high-dimensional situations, provided that the noise is uniform. Specifically, we argue that it achieves a much faster rate of estimation under certain assumptions on the growth rate of the sparsity level and the ambient dimension with respect to the sample size \(n\).
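The objective in the display above can be computed exactly as a linear program: minimize \(t\) subject to \(|Y_i - \boldsymbol{x}_i^{\top}\boldsymbol{\beta}| \le t\) for all \(i\). Below is a minimal sketch of this reduction, assuming NumPy and SciPy are available; the design, noise level \(a\), dimensions, and seed are illustrative choices, not taken from the paper.

```python
# Minimal sketch: Chebyshev (ell-infinity) estimator via linear programming,
# compared against OLS under uniform noise. Illustrative setup only.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, p, a = 500, 5, 1.0                       # sample size, dimension, noise level
X = rng.standard_normal((n, p))             # random design matrix
beta_star = rng.standard_normal(p)          # true coefficient vector
Y = X @ beta_star + rng.uniform(-a, a, n)   # uniform noise on [-a, a]

# LP variables z = (beta, t); the objective picks out t.
c = np.r_[np.zeros(p), 1.0]
A_ub = np.block([[X, -np.ones((n, 1))],     #  x_i^T beta - t <=  Y_i
                 [-X, -np.ones((n, 1))]])   # -x_i^T beta - t <= -Y_i
b_ub = np.r_[Y, -Y]
res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * p + [(0, None)], method="highs")
beta_cheb = res.x[:p]

beta_ols = np.linalg.lstsq(X, Y, rcond=None)[0]
print("Chebyshev error:", np.linalg.norm(beta_cheb - beta_star))
print("OLS error:      ", np.linalg.norm(beta_ols - beta_star))
```

Adding an \(\ell_1\) penalty \(\lambda \|\boldsymbol{\beta}\|_1\) to the objective gives the "Chebyshev's LASSO" discussed in the abstract, which remains expressible as a linear program after the standard splitting of \(\boldsymbol{\beta}\) into positive and negative parts.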
Cites work
- Untitled scientific article (zbMATH DE number 4113780)
- A dual method for discrete Chebychev curve fitting
- Absolute continuity for some one-dimensional processes
- Approximating the centroid is hard
- Arbitrage bounds for the term structure of interest rates
- Bounding the smallest singular value of a random matrix without concentration
- Coherent risk measures and good-deal bounds
- Combined regression models
- Covariance estimation for distributions with \({2+\varepsilon}\) moments
- Empirical processes with a bounded \(\psi_1\) diameter
- Estimating the parameters in regression with uniformly distributed errors
- Estimation theory and uncertainty intervals evaluation in presence of unknown but bounded errors: Linear families of models and estimators
- Exact minimax risk for linear least squares, and the lower tail of sample covariance matrices
- High-dimensional statistics. A non-asymptotic viewpoint
- Least absolute value and Chebychev estimation utilizing least squares results
- Lower bounds on the smallest eigenvalue of a sample covariance matrix
- Minimum risk equivariant estimator in linear regression model
- Non-asymptotic theory of random matrices: extreme singular values
- On \(L_1\) and Chebyshev estimation
- On lower bounds for tail probabilities
- On the choice of norms in system identification
- On the geometry of polytopes generated by heavy-tailed random vectors
- Optimal asymptotic identification under bounded disturbances
- Rate of Convergence of Lawson's Algorithm
- Regularization in Regression with Bounded Noise: A Chebyshev Center Approach
- Simultaneous analysis of Lasso and Dantzig selector
- Spectral norm of products of random and deterministic matrices
- Two Linear Programming Algorithms for Unbiased Estimation of Linear Models
- Universal inference
- Using the least squares estimator in Chebyshev estimation