Recursive ridge regression using second-order stochastic algorithms
From MaRDI portal
Publication: 6071711
DOI: 10.1016/j.csda.2023.107854
OpenAlex: W4367832610
MaRDI QID: Q6071711
Wei Lu, Bruno Portier, Antoine Godichon-Baggioni
Publication date: 28 November 2023
Published in: Computational Statistics and Data Analysis
Full work available at URL: https://doi.org/10.1016/j.csda.2023.107854
Keywords: stochastic optimization; ridge regression; machine learning; recursive estimation; stochastic Newton algorithm
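The keywords point at recursive (streaming) estimation of a ridge regression with a second-order, Newton-type update. As a minimal sketch of that idea, and not necessarily the paper's exact algorithm, the classic recursive least-squares recursion with a ridge penalty maintains the inverse regularized Hessian (lam*I + X^T X)^{-1} via the Sherman-Morrison rank-one update, so each observation is processed in O(d^2) time:

```python
import numpy as np

def recursive_ridge(X, y, lam=1.0):
    """Streaming ridge regression via a second-order (Newton-type) recursion.

    Illustrative sketch only: this is the standard regularized recursive
    least-squares update, related to but not claimed to be the algorithm
    of the cited publication.
    """
    n, d = X.shape
    theta = np.zeros(d)
    P = np.eye(d) / lam                  # P = (lam * I)^{-1} before any data
    for x, yi in zip(X, y):
        Px = P @ x
        # Sherman-Morrison: update P to (lam*I + sum x_i x_i^T)^{-1}
        P -= np.outer(Px, Px) / (1.0 + x @ Px)
        # Newton-type step: precondition the innovation by the inverse Hessian
        theta += P @ x * (yi - x @ theta)
    return theta
```

Because the recursion is algebraically exact, after the last observation `theta` matches the closed-form ridge estimate `(X.T @ X + lam * I)^{-1} X.T @ y` up to floating point.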
Cites Work
- Minimizing finite sums with the stochastic average gradient
- On the asymptotic rate of convergence of stochastic Newton algorithms and their weighted averaged versions
- Efficient training of neural nets for nonlinear adaptive filtering using a recursive Levenberg-Marquardt algorithm
- Generalized Cross-Validation as a Method for Choosing a Good Ridge Parameter
- Ridge Regression in Practice
- Competitive On-line Statistics
- Ridge Estimators in Logistic Regression
- An Efficient Stochastic Newton Algorithm for Parameter Estimation in Logistic Regressions
- Kernel Ridge Regression
- Ridge Regression: Biased Estimation for Nonorthogonal Problems
- A Stochastic Approximation Method