Random design analysis of ridge regression

From MaRDI portal

Publication: 404306

DOI: 10.1007/S10208-014-9192-1
zbMath: 1298.62120
arXiv: 1106.2363
OpenAlex: W2054434031
MaRDI QID: Q404306

Daniel Hsu, Sham M. Kakade, Tong Zhang

Publication date: 4 September 2014

Published in: Foundations of Computational Mathematics

Full work available at URL: https://arxiv.org/abs/1106.2363




Related Items (32)

High-resolution signal recovery via generalized sampling and functional principal component analysis
Adaptive estimation in multivariate response regression with hidden variables
Nonparametric stochastic approximation with large step-sizes
Unnamed Item
Folded concave penalized sparse linear regression: sparsity, statistical performance, and algorithmic theory for local solutions
Learning theory of distributed spectral algorithms
High-Dimensional Factor Regression for Heterogeneous Subpopulations
Rates of Bootstrap Approximation for Eigenvalues in High-Dimensional PCA
Unnamed Item
Risk bounds when learning infinitely many response functions by ordinary linear regression
Robust regression using biased objectives
Recovery of partly sparse and dense signals
Finite-sample analysis of \(M\)-estimators using self-concordance
From Fixed-X to Random-X Regression: Bias-Variance Decompositions, Covariance Penalties, and Prediction Error Estimation
Finite sample performance of linear least squares estimation
Unnamed Item
High-dimensional asymptotics of prediction: ridge regression and classification
Unnamed Item
Unnamed Item
Bootstrapping and sample splitting for high-dimensional, assumption-lean inference
Adaptive metric dimensionality reduction
Finite impulse response models: a non-asymptotic analysis of the least squares estimator
Optimal rates for spectral algorithms with least-squares regression over Hilbert spaces
High-dimensional linear models: a random matrix perspective
Control variate selection for Monte Carlo integration
Harder, Better, Faster, Stronger Convergence Rates for Least-Squares Regression
An elementary analysis of ridge regression with random design
Unnamed Item
Exact minimax risk for linear least squares, and the lower tail of sample covariance matrices
Unnamed Item
Unnamed Item
Suboptimality of constrained least squares and improvements via non-linear predictors




Cites Work




This page was built for publication: Random design analysis of ridge regression