Pages that link to "Item:Q1968640"
From MaRDI portal
The following pages link to Regularization networks and support vector machines (Q1968640):
Displayed 50 items.
- Incorporating prior knowledge in support vector regression (Q1009230) (← links)
- Nonparallel plane proximal classifier (Q1010181) (← links)
- A note on application of integral operator in learning theory (Q1012558) (← links)
- Kernel logistic PLS: a tool for supervised nonlinear dimensionality reduction and binary classification (Q1020063) (← links)
- High-dimensional pseudo-logistic regression and classification with applications to gene expression data (Q1020833) (← links)
- Analysis of support vector machines regression (Q1022433) (← links)
- Scattered data reconstruction by regularization in B-spline and associated wavelet spaces (Q1034074) (← links)
- Learning rates of least-square regularized regression with polynomial kernels (Q1041518) (← links)
- High order Parzen windows and randomized sampling (Q1047130) (← links)
- Distribution-free consistency of empirical risk minimization and support vector regression (Q1047919) (← links)
- Learning a function from noisy samples at a finite sparse set of points (Q1048968) (← links)
- Robustness of reweighted least squares kernel based regression (Q1049548) (← links)
- Full-body person recognition system. (Q1400486) (← links)
- A note on different covering numbers in learning theory. (Q1426052) (← links)
- On the stability of reproducing kernel Hilbert spaces of discrete-time impulse responses (Q1626948) (← links)
- On kernel design for regularized LTI system identification (Q1640721) (← links)
- The generalized cross validation filter (Q1640724) (← links)
- Distributed regression learning with coefficient regularization (Q1645155) (← links)
- Optimal learning rates for kernel partial least squares (Q1645280) (← links)
- A computationally efficient scheme for feature extraction with kernel discriminant analysis (Q1669708) (← links)
- Calibration of \(\epsilon\)-insensitive loss in support vector machines regression (Q1730072) (← links)
- Mini-workshop: Deep learning and inverse problems. Abstracts from the mini-workshop held March 4--10, 2018 (Q1731979) (← links)
- Support vector machines regression with \(l^1\)-regularizer (Q1759352) (← links)
- Positive definite dot product kernels in learning theory (Q1767054) (← links)
- System identification using kernel-based regularization: new insights on stability and consistency issues (Q1797024) (← links)
- An approach for constructing complex discriminating surfaces based on Bayesian interference of the maximum entropy (Q1827331) (← links)
- The covering number in learning theory (Q1872632) (← links)
- Statistical properties of the method of regularization with periodic Gaussian reproducing kernel (Q1879972) (← links)
- Learning rates for least square regressions with coefficient regularization (Q1928153) (← links)
- A meta-learning approach to the regularized learning -- case study: blood glucose prediction (Q1941597) (← links)
- Laplacian twin support vector machine for semi-supervised classification (Q1942706) (← links)
- Explicit connections between longitudinal data analysis and kernel machines (Q1952003) (← links)
- Improving the solution of least squares support vector machines with application to a blast furnace system (Q1952998) (← links)
- Conditional quantiles with varying Gaussians (Q1955538) (← links)
- Knowledge-based Green's kernel for support vector regression (Q1958823) (← links)
- Adaptive kernel methods using the balancing principle (Q1959089) (← links)
- Generalized regularized least-squares approximation of noisy data with application to stochastic PDEs (Q2006304) (← links)
- SVM-boosting based on Markov resampling: theory and algorithm (Q2057733) (← links)
- On a regularization of unsupervised domain adaptation in RKHS (Q2075006) (← links)
- Robustness by reweighting for kernel estimators: an overview (Q2075710) (← links)
- Bayesian frequentist bounds for machine learning and system identification (Q2097759) (← links)
- Influence diagnostics in support vector machines (Q2131934) (← links)
- Efficient spatio-temporal Gaussian regression via Kalman filtering (Q2188275) (← links)
- Learning rates for the kernel regularized regression with a differentiable strongly convex loss (Q2191832) (← links)
- Approximation of Lyapunov functions from noisy data (Q2192453) (← links)
- Just interpolate: kernel ``ridgeless'' regression can generalize (Q2196223) (← links)
- A unifying representer theorem for inverse problems and machine learning (Q2231644) (← links)
- Learning performance of regularized regression with multiscale kernels based on Markov observations (Q2244161) (← links)
- Efficient regularized least-squares algorithms for conditional ranking on relational data (Q2251443) (← links)
- Convergence rates of learning algorithms by random projection (Q2252501) (← links)