Analysis of regularized least squares for functional linear regression model
Publication: 1791683
DOI: 10.1016/j.jco.2018.08.001
zbMath: 1402.62158
OpenAlex: W2887727090
Wikidata: Q129404139 (Scholia: Q129404139)
MaRDI QID: Q1791683
Publication date: 11 October 2018
Published in: Journal of Complexity
Full work available at URL: https://doi.org/10.1016/j.jco.2018.08.001
MSC classification:
- Linear regression; mixed models (62J05)
- Learning and adaptive systems in artificial intelligence (68T05)
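Going by the title and the MSC classes, the paper analyzes regularized (Tikhonov) least squares for the functional linear regression model via a reproducing kernel Hilbert space approach. A minimal sketch of that standard setup, in generic notation (the symbols \alpha_0, \beta_0, H_K and \lambda below are our choices, not taken from the paper):

Y_i = \alpha_0 + \int_0^1 X_i(t)\,\beta_0(t)\,dt + \varepsilon_i, \qquad i = 1,\dots,n,

\hat{\beta}_\lambda = \operatorname*{arg\,min}_{\beta \in H_K} \; \frac{1}{n}\sum_{i=1}^n \Bigl( Y_i - \int_0^1 X_i(t)\,\beta(t)\,dt \Bigr)^2 + \lambda \,\|\beta\|_K^2,

where H_K is the RKHS induced by a kernel K and \lambda > 0 is the regularization parameter whose choice governs the convergence rates typically studied in this line of work.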
Related Items
- Non-asymptotic error bound for optimal prediction of function-on-function regression by RKHS approach
- Partially linear functional quantile regression in a reproducing kernel Hilbert space
- Locally sparse quantile estimation for a partially functional interaction model
- Optimal prediction for high-dimensional functional quantile regression in reproducing kernel Hilbert spaces
- Partially functional linear regression with quadratic regularization
- Convergence rates of support vector machines regression for functional data
- Regression analysis of stochastic fatigue crack growth model in a martingale difference framework
- Functional linear regression with Huber loss
- Distributed least squares prediction for functional linear regression*
Cites Work
- Unnamed Item
- Functional linear regression analysis for longitudinal data
- A reproducing kernel Hilbert space approach to functional linear regression
- Prediction in functional linear regression
- Methodology and convergence rates for functional linear regression
- Support vector machines are universally consistent
- Optimum bounds for the distributions of martingales in Banach spaces
- Balancing principle in supervised learning for a general regularization scheme
- Optimal rates for the regularized least-squares algorithm
- Learning theory estimates via integral operators and their approximations
- DOI: 10.1162/153244302760185252
- An Introduction to the Theory of Reproducing Kernel Hilbert Spaces
- Minimax and Adaptive Prediction for Functional Linear Regression
- Learning theory of distributed spectral algorithms
- Learning Bounds for Kernel Regression Using Effective Data Dimensionality