scientific article

From MaRDI portal
Publication:3093282

zbMath: 1222.68180 · MaRDI QID: Q3093282

Umberto de Giovannini, Francesca Odone, Lorenzo Rosasco, Ernesto De Vito, Andrea Caponnetto

Publication date: 12 October 2011

Full work available at URL: http://www.jmlr.org/papers/v6/devito05a.html

Title: Learning from examples as an inverse problem



Related Items

Machine learning with kernels for portfolio valuation and risk management
Complexity control in statistical learning
Feasibility-based fixed point networks
Multi-penalty regularization in learning theory
Two-Layer Neural Networks with Values in a Banach Space
Geometry on probability spaces
Learning from non-random data in Hilbert spaces: an optimal recovery perspective
Wasserstein-Based Projections with Applications to Inverse Problems
On regularization algorithms in learning theory
Manifold regularization based on Nyström type subsampling
Multi-task learning via linear functional strategy
On spectral windows in supervised learning from data
Error analysis on regularized regression based on the maximum correntropy criterion
Generalized Kalman smoothing: modeling and algorithms
Nyström type subsampling analyzed as a regularized projection
Learning regularization parameters for general-form Tikhonov
Learning theory of distributed spectral algorithms
Learning spectral windowing parameters for regularization using unbiased predictive risk and generalized cross validation techniques for multiple data sets
Optimal filters from calibration data for image deconvolution with data acquisition error
Spectral algorithms for learning with dependent observations
A note on the prediction error of principal component regression in high dimensions
Convergence Rates for Learning Linear Operators from Noisy Data
Convex regularization in statistical inverse learning problems
Regularized Nyström Subsampling in Covariate Shift Domain Adaptation Problems
Statistical performance of support vector machines
A Study on Regularization for Discrete Inverse Problems with Model-Dependent Noise
Representation and reconstruction of covariance operators in linear inverse problems
Multi-output learning via spectral filtering
Smoothed residual stopping for statistical inverse problems via truncated SVD estimation
Adaptive kernel methods using the balancing principle
A multiscale support vector regression method on spheres with data compression
Kernel methods in system identification, machine learning and function estimation: a survey
Mini-workshop: Deep learning and inverse problems. Abstracts from the mini-workshop held March 4--10, 2018
Kernel regression, minimax rates and effective dimensionality: Beyond the regular case
Optimal rates for regularization of statistical inverse learning problems
Kernel variable selection for multicategory support vector machines
Consistent learning by composite proximal thresholding
Efficient regularized least-squares algorithms for conditional ranking on relational data
Diffusion maps for changing data
Ensemble Kalman inversion: a derivative-free technique for machine learning tasks
Convergence Rates of Spectral Regularization Methods: A Comparison between Ill-Posed Inverse Problems and Statistical Kernel Learning
Random discretization of the finite Fourier transform and related kernel random matrices
Convergence rates of Kernel Conjugate Gradient for random design regression
Kernel partial least squares for stationary data
On a regularization of unsupervised domain adaptation in RKHS
An elementary analysis of ridge regression with random design
Regression learning based on incomplete relationships between attributes
Estimating adsorption isotherm parameters in chromatography via a virtual injection promoting double feed-forward neural network
A consistent and numerically efficient variable selection method for sparse Poisson regression with applications to learning and signal recovery
Distributed least squares prediction for functional linear regression
Thresholded spectral algorithms for sparse approximations
Regularization: From Inverse Problems to Large-Scale Machine Learning