Random design analysis of ridge regression
From MaRDI portal
Publication: Q404306
DOI: 10.1007/S10208-014-9192-1
zbMath: 1298.62120
arXiv: 1106.2363
OpenAlex: W2054434031
MaRDI QID: Q404306
Daniel Hsu, Sham M. Kakade, Tong Zhang
Publication date: 4 September 2014
Published in: Foundations of Computational Mathematics
Full work available at URL: https://arxiv.org/abs/1106.2363
Related Items (32)
- High-resolution signal recovery via generalized sampling and functional principal component analysis
- Adaptive estimation in multivariate response regression with hidden variables
- Nonparametric stochastic approximation with large step-sizes
- Folded concave penalized sparse linear regression: sparsity, statistical performance, and algorithmic theory for local solutions
- Learning theory of distributed spectral algorithms
- High-Dimensional Factor Regression for Heterogeneous Subpopulations
- Rates of Bootstrap Approximation for Eigenvalues in High-Dimensional PCA
- Risk bounds when learning infinitely many response functions by ordinary linear regression
- Robust regression using biased objectives
- Recovery of partly sparse and dense signals
- Finite-sample analysis of \(M\)-estimators using self-concordance
- From Fixed-X to Random-X Regression: Bias-Variance Decompositions, Covariance Penalties, and Prediction Error Estimation
- Finite sample performance of linear least squares estimation
- High-dimensional asymptotics of prediction: ridge regression and classification
- Bootstrapping and sample splitting for high-dimensional, assumption-lean inference
- Adaptive metric dimensionality reduction
- Finite impulse response models: a non-asymptotic analysis of the least squares estimator
- Optimal rates for spectral algorithms with least-squares regression over Hilbert spaces
- High-dimensional linear models: a random matrix perspective
- Control variate selection for Monte Carlo integration
- Harder, Better, Faster, Stronger Convergence Rates for Least-Squares Regression
- An elementary analysis of ridge regression with random design
- Exact minimax risk for linear least squares, and the lower tail of sample covariance matrices
- Suboptimality of constrained least squares and improvements via non-linear predictors
Cites Work
- Tail inequalities for sums of random matrices that depend on the intrinsic dimension
- Faster least squares approximation
- Robust linear least squares regression
- A tail inequality for quadratic forms of subgaussian random vectors
- Optimal global rates of convergence for nonparametric regression
- Adaptive estimation of a quadratic functional by model selection.
- Statistical learning theory and stochastic optimization. École d'Été de Probabilités de Saint-Flour XXXI -- 2001
- Local Rademacher complexities and oracle inequalities in risk minimization (2004 IMS Medallion Lecture, with discussions and rejoinder)
- Optimal rates for the regularized least-squares algorithm
- Learning theory estimates via integral operators and their approximations
- A fast randomized algorithm for overdetermined linear least-squares regression
- Matrix Analysis
- The Fast Johnson–Lindenstrauss Transform and Approximate Nearest Neighbors
- Learning Bounds for Kernel Regression Using Effective Data Dimensionality