Variable regularized least-squares algorithm: one-step-ahead cost function with equivalent optimality
From MaRDI portal
Publication: 548900
DOI: 10.1016/j.sigpro.2010.12.004
zbMATH Open: 1219.94019
OpenAlex: W2053292491
MaRDI QID: Q548900
Authors: NamWoong Kong, Moon-Soo Chang, Poogyeon Park
Publication date: 30 June 2011
Published in: Signal Processing
Full work available at URL: https://doi.org/10.1016/j.sigpro.2010.12.004
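The title refers to a regularized least-squares cost whose regularization weight is chosen via a one-step-ahead criterion. As background only, a minimal sketch of plain regularized (ridge) least squares is shown below; the function name, the toy data, and the fixed regularization constant `lam` are assumptions for illustration, not the authors' variable-regularization scheme.

```python
import numpy as np

def regularized_ls(X, d, lam=0.1):
    """Regularized (ridge) least squares: w = (X^T X + lam*I)^{-1} X^T d.
    The cited paper varies the regularization term adaptively; here lam
    is simply a fixed constant (an assumption for this sketch)."""
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ d)

# Toy system-identification example with hypothetical data.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -0.5, 0.25])          # unknown filter coefficients
X = rng.standard_normal((200, 3))             # input regressors
d = X @ w_true + 0.01 * rng.standard_normal(200)  # noisy desired signal
w_hat = regularized_ls(X, d, lam=0.1)
```

With enough samples and small noise, `w_hat` recovers `w_true` closely; the regularizer mainly matters when the regressor matrix is ill-conditioned or data are scarce.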
Recommendations
- Convergence and consistency of recursive least squares with variable-rate forgetting
- Regularized fast recursive least squares algorithms for finite memory filtering
- A variable-step-size NLMS algorithm using statistics of channel response
- Variable step-size normalized LMS algorithm by approximating correlation matrix of estimation error
- Gradient based variable forgetting factor RLS algorithm
Signal theory (characterization, reconstruction, filtering, etc.) (94A12); System identification (93B30)
Cites Work
- Title not available
- Title not available
- Displacement Structure: Theory and Applications
- Array algorithms for H∞ estimation
- Numerically stable fast transversal filters for recursive least squares adaptive filtering
- A noise resilient variable step-size LMS algorithm
- Fast, recursive-least-squares transversal filters for adaptive filtering