Finite sample performance of linear least squares estimation
From MaRDI portal
Publication:2235406
Abstract: Linear least squares is a well-known technique for parameter estimation that is used even when sub-optimal, because of its very low computational cost and because it does not require exact knowledge of the noise statistics. Surprisingly, bounding the probability of large errors with finitely many samples has remained open, especially when dealing with correlated noise with unknown covariance. In this paper we analyze the finite sample performance of the linear least squares estimator and obtain accurate bounds on the tail of the estimator's distribution. We show fast exponential convergence in the number of samples required to ensure a given accuracy with high probability. We analyze a sub-Gaussian setting with either a fixed or a random design matrix in the linear least squares problem, and we extend the results to the case of a martingale difference noise sequence. Our analysis method is simple, relying on elementary tail bounds on the estimation error, and we also provide probabilistic finite sample bounds on the norm of the estimation error. The tightness of the bounds is tested through simulation, and we demonstrate that our results are tighter than previously proposed bounds on the error norm. The proposed bounds make it possible to predict the number of samples required for least squares estimation even when least squares is sub-optimal and is used for computational simplicity.
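The tail behavior described in the abstract can be illustrated empirically. The sketch below (an assumption-laden Monte Carlo illustration, not the paper's analytic bounds) draws a random Gaussian design with unit-variance Gaussian noise, computes the linear least squares estimate, and estimates the probability that the error norm exceeds a threshold; the hypothetical function name `lls_error_tail` and all parameter values are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

def lls_error_tail(n_samples, eps, d=3, trials=2000):
    """Monte Carlo estimate of P(||beta_hat - beta||_2 > eps) for the
    linear least squares estimator, under a random standard-normal
    design matrix and i.i.d. unit-variance Gaussian noise (a simple
    instance of the sub-Gaussian setting). Illustrative only."""
    beta = np.ones(d)  # true parameter vector
    exceed = 0
    for _ in range(trials):
        X = rng.standard_normal((n_samples, d))        # random design
        y = X @ beta + rng.standard_normal(n_samples)  # noisy observations
        beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
        if np.linalg.norm(beta_hat - beta) > eps:
            exceed += 1
    return exceed / trials

# The empirical tail probability shrinks rapidly as the sample size grows,
# consistent with the fast convergence the paper quantifies analytically.
for n in (10, 20, 40, 80):
    print(n, lls_error_tail(n, eps=0.5))
```

Such a simulation is how the tightness of analytic tail bounds is typically checked: the bound should sit just above the empirical exceedance frequency across sample sizes.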
Recommendations
- Error bounds for computed least squares estimators
- Finite impulse response models: a non-asymptotic analysis of the least squares estimator
- Loss minimization and parameter estimation with heavy tails
- A new perspective on least squares under convex constraint
- Finite sample properties of linear model identification
Cites work
- scientific article; zbMATH DE number 1158743
- A General Class of Outage Error Probability Lower Bounds in Bayesian Parameter Estimation
- A concentration bound for stochastic approximation via Alekseev's formula
- A large deviation result for parameter estimators and its application to nonlinear regression analysis
- A large deviation result for the least squares estimators in nonlinear regression
- A note on strong consistency of least squares estimators in regression models with martingale difference errors
- A tail inequality for quadratic forms of subgaussian random vectors
- An improvement of convergence rate estimates in the Lyapunov theorem
- Asymptotic properties of general autoregressive models and strong consistency of least-squares estimates of their parameters
- Asymptotic theory of nonlinear least squares estimation
- Autoregressive Conditional Heteroscedasticity with Estimates of the Variance of United Kingdom Inflation
- Concentration of Measure for the Analysis of Randomized Algorithms
- Confidence sets in sparse regression
- Direction-of-arrival estimation for constant modulus signals
- Finite Sample Efficiency of Ordinary Least Squares in the Linear Regression Model with Autocorrelated Errors
- High-dimensional statistics. A non-asymptotic viewpoint
- Inequalities for quantiles of the chi-square distribution
- Least squares estimates in stochastic regression models with applications to identification and control of dynamic systems
- Linear Least Squares Approach for Accurate Received Signal Strength Based Source Localization
- Non-parametric likelihood based channel estimator for Gaussian mixture noise
- On asymptotically optimal confidence regions and tests for high-dimensional models
- On non-asymptotic bounds for estimation in generalized linear models with highly correlated design
- On the Asymptotic Theory of Fixed-Size Sequential Confidence Bounds for Linear Regression Parameters
- On the exponential rate of convergence of the least squares estimator in the nonlinear regression model with Gaussian errors
- One-bit compressed sensing with non-Gaussian measurements
- Optimal rates for the regularized least-squares algorithm
- Random design analysis of ridge regression
- Recursive consistent estimation with bounded noise
- Robust Soft-Decision Interpolation Using Weighted Least Squares
- Robust linear least squares regression
- Sequential parameter estimation of time-varying non-Gaussian autoregressive processes
- Simultaneous analysis of Lasso and Dantzig selector
- Strong consistency of least squares estimators in linear regression models
- The lower tail of random quadratic forms with applications to ordinary least squares
- The rate of convergence of the least squares estimator in a non-linear regression model with dependent errors
- The sample complexity of learning linear predictors with the squared loss
- User-friendly tail bounds for sums of random matrices
- What is a Martingale?
- Wiener Filters in Gaussian Mixture Signal Estimation With $\ell_\infty$-Norm Error
Cited in (5)
- Finite sample stability properties of the least median of squares estimator
- Performance Analysis of Linear-Equality-Constrained Least-Squares Estimation
- Data-driven fault detection for Lipschitz nonlinear systems: from open to closed-loop systems
- Finite impulse response models: a non-asymptotic analysis of the least squares estimator
- Analyzing the number of samples required for an approximate Monte-Carlo LMS line estimator