Random design analysis of ridge regression
Abstract: This work gives a simultaneous analysis of both the ordinary least squares estimator and the ridge regression estimator in the random design setting under mild assumptions on the covariate/response distributions. In particular, the analysis provides sharp results on the "out-of-sample" prediction error, as opposed to the "in-sample" (fixed design) error. The analysis also reveals the effect of errors in the estimated covariance structure, as well as the effect of modeling errors, neither of which is present in the fixed design setting. The proofs of the main results are based on a simple decomposition lemma combined with concentration inequalities for random vectors and matrices.
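The following is a minimal sketch, not taken from the paper, of the random design setup the abstract describes: covariates drawn i.i.d. from a distribution, a ridge estimator fitted on the sample, and the out-of-sample prediction error (measured on fresh covariates from the same distribution) contrasted with the in-sample, fixed design error. All constants, dimensions, and the noise level are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the paper's experiments): ridge regression
# in a random design, comparing in-sample vs. out-of-sample prediction error.
import numpy as np

rng = np.random.default_rng(0)
n, d, lam = 200, 20, 1.0           # sample size, dimension, ridge parameter (assumed values)

beta_true = rng.normal(size=d) / np.sqrt(d)
X = rng.normal(size=(n, d))        # random design: covariates drawn i.i.d.
y = X @ beta_true + 0.5 * rng.normal(size=n)

# Ridge estimator: beta_hat = (X'X + lam * I)^{-1} X'y
beta_hat = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# In-sample (fixed design) error: measured on the same covariates X.
in_sample = np.mean((X @ beta_hat - X @ beta_true) ** 2)

# Out-of-sample (random design) prediction error: measured on fresh covariates
# from the same distribution, approximating E_x[(x'beta_hat - x'beta_true)^2].
X_new = rng.normal(size=(10_000, d))
out_of_sample = np.mean((X_new @ beta_hat - X_new @ beta_true) ** 2)

print(f"in-sample error:     {in_sample:.4f}")
print(f"out-of-sample error: {out_of_sample:.4f}")
```

The gap between the two quantities reflects the covariance estimation error that, as the abstract notes, appears only in the random design setting.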
Recommendations
- An elementary analysis of ridge regression with random design
- scientific article; zbMATH DE number 418915
- Ridge rerandomization: an experimental design strategy in the presence of covariate collinearity
- scientific article; zbMATH DE number 4068102
- A note on a commonly used ridge regression Monte Carlo design
- Ridge Regression – A Simulation Study
- Ridge regression revisited
- Regression with random design: a minimax study
- Random-design regression under long-range dependent errors
Cites work
- scientific article; zbMATH DE number 47363
- scientific article; zbMATH DE number 1220667
- A fast randomized algorithm for overdetermined linear least-squares regression
- A tail inequality for quadratic forms of subgaussian random vectors
- Adaptive estimation of a quadratic functional by model selection.
- Faster least squares approximation
- Learning Bounds for Kernel Regression Using Effective Data Dimensionality
- Learning theory estimates via integral operators and their approximations
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Loss minimization and parameter estimation with heavy tails
- Matrix Analysis
- Optimal global rates of convergence for nonparametric regression
- Optimal rates for the regularized least-squares algorithm
- Robust linear least squares regression
- Statistical learning theory and stochastic optimization. École d'Été de Probabilités de Saint-Flour XXXI -- 2001.
- Tail inequalities for sums of random matrices that depend on the intrinsic dimension
- The fast Johnson-Lindenstrauss transform and approximate nearest neighbors
Cited in (40)
- Recovery of partly sparse and dense signals
- Adaptive metric dimensionality reduction
- scientific article; zbMATH DE number 7370615
- Nonparametric stochastic approximation with large step-sizes
- On the asymptotic risk of ridge regression with many predictors
- Finite sample performance of linear least squares estimation
- Finite impulse response models: a non-asymptotic analysis of the least squares estimator
- Convergences of regularized algorithms and stochastic gradient methods with random projections
- High-Dimensional Factor Regression for Heterogeneous Subpopulations
- A risk comparison of ordinary least squares vs ridge regression
- Optimal rates for spectral algorithms with least-squares regression over Hilbert spaces
- Control variate selection for Monte Carlo integration
- High-dimensional linear models: a random matrix perspective
- High-resolution signal recovery via generalized sampling and functional principal component analysis
- Convergence guarantees for forward gradient descent in the linear regression model
- Regression Analysis with a Stochastic Design Variable
- scientific article; zbMATH DE number 7415120
- Finite-sample analysis of \(M\)-estimators using self-concordance
- New equivalences between interpolation and SVMs: kernels and structured features
- Harder, Better, Faster, Stronger Convergence Rates for Least-Squares Regression
- Suboptimality of constrained least squares and improvements via non-linear predictors
- scientific article; zbMATH DE number 7625184
- Risk bounds when learning infinitely many response functions by ordinary linear regression
- Robust regression using biased objectives
- Exact minimax risk for linear least squares, and the lower tail of sample covariance matrices
- Learning theory of distributed spectral algorithms
- Sketched ridge regression: optimization perspective, statistical perspective, and model averaging
- Adaptive estimation in multivariate response regression with hidden variables
- High-dimensional asymptotics of prediction: ridge regression and classification
- An elementary analysis of ridge regression with random design
- From Fixed-X to Random-X Regression: Bias-Variance Decompositions, Covariance Penalties, and Prediction Error Estimation
- scientific article; zbMATH DE number 7626738
- Folded concave penalized sparse linear regression: sparsity, statistical performance, and algorithmic theory for local solutions
- Inference in High-Dimensional Multivariate Response Regression with Hidden Variables
- Bootstrapping and sample splitting for high-dimensional, assumption-lean inference
- scientific article; zbMATH DE number 7306853
- Utilizing second order information in minibatch stochastic variance reduced proximal iterations
- Rates of Bootstrap Approximation for Eigenvalues in High-Dimensional PCA
- "Ridge Analysis" of Response Surfaces
- Convergence of entropy-regularized natural policy gradient with linear function approximation
This page was built for publication: Random design analysis of ridge regression