Random design analysis of ridge regression
DOI: 10.1007/S10208-014-9192-1
zbMATH Open: 1298.62120
arXiv: 1106.2363
OpenAlex: W2054434031
MaRDI QID: Q404306
Daniel Hsu, Sham M. Kakade, Tong Zhang
Publication date: 4 September 2014
Published in: Foundations of Computational Mathematics
Abstract: This work gives a simultaneous analysis of both the ordinary least squares estimator and the ridge regression estimator in the random design setting under mild assumptions on the covariate/response distributions. In particular, the analysis provides sharp results on the "out-of-sample" prediction error, as opposed to the "in-sample" (fixed design) error. The analysis also reveals the effect of errors in the estimated covariance structure, as well as the effect of modeling errors, neither of which is present in the fixed design setting. The proofs of the main results are based on a simple decomposition lemma combined with concentration inequalities for random vectors and matrices.
Full work available at URL: https://arxiv.org/abs/1106.2363
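The following is a minimal simulation sketch, not taken from the paper, that illustrates the distinction the abstract draws: the in-sample (fixed design) error averages over the observed covariates only, while the out-of-sample (random design) prediction error averages over a fresh draw from the covariate distribution. The sample size n, dimension d, penalty lam, and noise level sigma are illustrative assumptions, as is the standard Gaussian covariate distribution.

    import numpy as np

    rng = np.random.default_rng(0)
    n, d, lam, sigma = 200, 50, 1.0, 0.5        # illustrative choices
    beta = rng.normal(size=d) / np.sqrt(d)      # true coefficient vector

    # Random design: covariate rows drawn i.i.d. from N(0, I).
    X = rng.normal(size=(n, d))
    y = X @ beta + sigma * rng.normal(size=n)

    # Ridge estimator: (X'X + lam * I)^{-1} X'y.
    beta_hat = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)
    delta = beta_hat - beta

    # In-sample (fixed design) excess error: averaged over the observed X only.
    in_sample = np.mean((X @ delta) ** 2)

    # Out-of-sample (random design) excess error: E[(x' delta)^2] over a fresh
    # covariate x ~ N(0, I); with identity covariance this equals ||delta||^2.
    out_of_sample = np.sum(delta ** 2)

    print(f"in-sample excess error:     {in_sample:.4f}")
    print(f"out-of-sample excess error: {out_of_sample:.4f}")

The two quantities differ because the estimator depends on the sampled X; controlling that gap is what the random design analysis addresses and what has no counterpart in the fixed design setting.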
Recommendations
- An elementary analysis of ridge regression with random design
- scientific article; zbMATH DE number 418915
- Ridge rerandomization: an experimental design strategy in the presence of covariate collinearity
- scientific article
- A note on a commonly used ridge regression Monte Carlo design
- Ridge Regression – A Simulation Study
- Ridge regression revisited
- Regression with random design: a minimax study
- Random-design regression under long-range dependent errors
Cites Work
- Matrix Analysis
- Title not available
- Title not available
- Optimal global rates of convergence for nonparametric regression
- Loss minimization and parameter estimation with heavy tails
- Robust linear least squares regression
- A tail inequality for quadratic forms of subgaussian random vectors
- Adaptive estimation of a quadratic functional by model selection.
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Learning Bounds for Kernel Regression Using Effective Data Dimensionality
- Faster least squares approximation
- Optimal rates for the regularized least-squares algorithm
- Statistical learning theory and stochastic optimization. École d'Été de Probabilités de Saint-Flour XXXI -- 2001.
- Learning theory estimates via integral operators and their approximations
- A fast randomized algorithm for overdetermined linear least-squares regression
- Tail inequalities for sums of random matrices that depend on the intrinsic dimension
- The Fast Johnson–Lindenstrauss Transform and Approximate Nearest Neighbors
Cited In (39)
- High-resolution signal recovery via generalized sampling and functional principal component analysis
- Title not available
- Robust regression using biased objectives
- Recovery of partly sparse and dense signals
- Finite-sample analysis of M-estimators using self-concordance
- Suboptimality of constrained least squares and improvements via non-linear predictors
- Learning theory of distributed spectral algorithms
- Bootstrapping and sample splitting for high-dimensional, assumption-lean inference
- Adaptive metric dimensionality reduction
- High-dimensional asymptotics of prediction: ridge regression and classification
- Finite sample performance of linear least squares estimation
- Convergence guarantees for forward gradient descent in the linear regression model
- Folded concave penalized sparse linear regression: sparsity, statistical performance, and algorithmic theory for local solutions
- Convergence of entropy-regularized natural policy gradient with linear function approximation
- "Ridge Analysis" of Response Surfaces
- High-Dimensional Factor Regression for Heterogeneous Subpopulations
- Optimal rates for spectral algorithms with least-squares regression over Hilbert spaces
- Adaptive estimation in multivariate response regression with hidden variables
- Title not available
- Title not available
- An elementary analysis of ridge regression with random design
- Regression Analysis with a Stochastic Design Variable
- Title not available
- Title not available
- New equivalences between interpolation and SVMs: kernels and structured features
- Rates of Bootstrap Approximation for Eigenvalues in High-Dimensional PCA
- On the asymptotic risk of ridge regression with many predictors
- Finite impulse response models: a non-asymptotic analysis of the least squares estimator
- Title not available
- Inference in High-Dimensional Multivariate Response Regression with Hidden Variables
- Control variate selection for Monte Carlo integration
- High-dimensional linear models: a random matrix perspective
- Nonparametric stochastic approximation with large step-sizes
- Risk bounds when learning infinitely many response functions by ordinary linear regression
- Title not available
- Title not available
- Exact minimax risk for linear least squares, and the lower tail of sample covariance matrices
- From Fixed-X to Random-X Regression: Bias-Variance Decompositions, Covariance Penalties, and Prediction Error Estimation
- Harder, Better, Faster, Stronger Convergence Rates for Least-Squares Regression