An Efficient Stochastic Newton Algorithm for Parameter Estimation in Logistic Regressions
Publication:5212017
DOI: 10.1137/19M1261717 · zbMath: 1435.62285 · arXiv: 1904.07908 · OpenAlex: W2939929204 · Wikidata: Q126304926 · Scholia: Q126304926 · MaRDI QID: Q5212017
Authors: Bruno Portier, Antoine Godichon, Bernard Bercu
Publication date: 24 January 2020
Published in: SIAM Journal on Control and Optimization
Full work available at URL: https://arxiv.org/abs/1904.07908
Mathematics Subject Classification:
- Time series, auto-correlation, regression, etc. in statistics (GARCH) (62M10)
- Point estimation (62F10)
- Generalized linear models (logistic models) (62J12)
- Numerical optimization and variational techniques (65K10)
- Sequential estimation (62L12)
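The record itself does not reproduce the paper's algorithm. As a rough illustration of the kind of method the title refers to, the following is a minimal generic sketch of a stochastic Newton-type recursion for logistic regression, in which the inverse of a running Hessian estimate is updated rank one at a time via the Sherman-Morrison formula. The weighting, truncation, and averaging choices of the published algorithm are not given in this record and may differ.

```python
# Generic sketch of a stochastic Newton-type recursion for logistic regression.
# Illustrative only: the paper's exact step weights and safeguards may differ.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def stochastic_newton_logistic(X, y, theta0=None):
    """Online estimation of theta in P(y = 1 | x) = sigmoid(x^T theta).

    The inverse of the running Hessian estimate is updated with the
    Sherman-Morrison formula, so no matrix inversion is needed per step.
    """
    n, d = X.shape
    theta = np.zeros(d) if theta0 is None else theta0.copy()
    # Start from the identity as the initial inverse-Hessian estimate.
    A_inv = np.eye(d)
    for k in range(n):
        x_k, y_k = X[k], y[k]
        p = sigmoid(x_k @ theta)
        w = p * (1.0 - p)          # curvature weight of the logistic loss
        # Sherman-Morrison update of (A + w x x^T)^{-1}
        Ax = A_inv @ x_k
        A_inv -= np.outer(Ax, Ax) * (w / (1.0 + w * (x_k @ Ax)))
        # Newton-type correction using the score of the current observation
        theta += A_inv @ (x_k * (y_k - p))
    return theta

# Small usage example on synthetic data
rng = np.random.default_rng(0)
theta_true = np.array([1.0, -2.0, 0.5])
X = rng.normal(size=(5000, 3))
y = rng.binomial(1, sigmoid(X @ theta_true))
print(stochastic_newton_logistic(X, y))
```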
Related Items
- Inversion-free subsampling Newton's method for large sample logistic regression
- Recursive ridge regression using second-order stochastic algorithms
- On the asymptotic rate of convergence of stochastic Newton algorithms and their weighted averaged versions
Cites Work
- A Stochastic Quasi-Newton Method for Large-Scale Optimization
- System identification with quantized observations
- Online estimation of the asymptotic variance for averaged stochastic gradient algorithms
- Optimal non-asymptotic analysis of the Ruppert-Polyak averaging stochastic algorithm
- Adaptivity of averaged stochastic gradient descent to local strong convexity for logistic regression
- Acceleration of Stochastic Approximation by Averaging
- Asymptotic Almost Sure Efficiency of Averaged Stochastic Algorithms
- RES: Regularized Stochastic BFGS Algorithm
- Applied Logistic Regression
- Optimal survey schemes for stochastic gradient descent with applications to M-estimation
- A Stochastic Approximation Method
- L^p and almost sure rates of convergence of averaged stochastic gradient algorithms: locally strongly convex objective