Variable regularized least-squares algorithm: one-step-ahead cost function with equivalent optimality
From MaRDI portal
Recommendations
- Convergence and consistency of recursive least squares with variable-rate forgetting
- Regularized fast recursive least squares algorithms for finite memory filtering
- A variable-step-size NLMS algorithm using statistics of channel response
- Variable step-size normalized LMS algorithm by approximating correlation matrix of estimation error
- Gradient based variable forgetting factor RLS algorithm
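The recommendations above cluster around recursive least-squares (RLS) variants with forgetting and regularization. As background only, here is a minimal sketch of a standard exponentially weighted, diagonally regularized RLS update — not the specific variable-regularization algorithm of this publication; the names `regularized_rls`, `lam`, and `delta` are illustrative:

```python
import numpy as np

def regularized_rls(X, d, lam=0.99, delta=1.0):
    """Exponentially weighted RLS with a regularized initialization.

    X     : (T, n) matrix of input vectors, one per time step
    d     : (T,) desired responses
    lam   : forgetting factor, 0 < lam <= 1
    delta : regularization constant initializing the inverse correlation matrix
    """
    n = X.shape[1]
    w = np.zeros(n)
    P = np.eye(n) / delta              # inverse of the regularized correlation matrix
    for x, y in zip(X, d):
        Px = P @ x
        k = Px / (lam + x @ Px)        # gain vector
        e = y - w @ x                  # a priori estimation error
        w = w + k * e                  # weight update
        P = (P - np.outer(k, Px)) / lam
    return w
```

For example, feeding noiseless data generated by a fixed 2-tap filter recovers its coefficients; the variable-regularization and variable-forgetting schemes listed above adapt `delta` or `lam` over time instead of fixing them.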
Cites work
- Scientific article; zbMATH DE number 44406 (no title available)
- Scientific article; zbMATH DE number 50447 (no title available)
- A noise resilient variable step-size LMS algorithm
- Array algorithms for H∞ estimation
- Displacement Structure: Theory and Applications
- Fast, recursive-least-squares transversal filters for adaptive filtering
- Numerically stable fast transversal filters for recursive least squares adaptive filtering
This page was built for publication: Variable regularized least-squares algorithm: one-step-ahead cost function with equivalent optimality (MaRDI item Q548900)