Tikhonov, Ivanov and Morozov regularization for support vector machine learning
Publication: 285946
DOI: 10.1007/s10994-015-5540-x
zbMath: 1357.68179
OpenAlex: W2210716609
Wikidata: Q58778430 (Scholia: Q58778430)
MaRDI QID: Q285946
Sandro Ridella, Luca Oneto, Davide Anguita
Publication date: 19 May 2016
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1007/s10994-015-5540-x
Keywords: Tikhonov regularization; support vector machine; structural risk minimization; Ivanov regularization; Morozov regularization
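For context, the three regularization schemes named in the keywords can be sketched in their standard textbook forms, with a generic loss \(\ell\) and hyperparameters \(C\), \(\rho\), \(\delta\); the paper's exact notation may differ:

  Tikhonov (penalized):    \(\min_{w} \; C \sum_{i=1}^{n} \ell(w; x_i, y_i) + \tfrac{1}{2}\|w\|^2\)
  Ivanov (constrained):    \(\min_{w} \; \sum_{i=1}^{n} \ell(w; x_i, y_i) \quad \text{s.t.} \quad \|w\|^2 \le \rho^2\)
  Morozov (discrepancy):   \(\min_{w} \; \tfrac{1}{2}\|w\|^2 \quad \text{s.t.} \quad \sum_{i=1}^{n} \ell(w; x_i, y_i) \le \delta\)

With the hinge loss \(\ell(w; x, y) = \max(0,\, 1 - y\langle w, x\rangle)\), the Tikhonov form is the usual soft-margin SVM primal; under convexity the three problems yield the same solutions for suitably matched values of \(C\), \(\rho\) and \(\delta\).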
Related Items
- Ivanov-Regularised Least-Squares Estimators over Large RKHSs and Their Interpolation Spaces
- On iteration complexity of a first-order primal-dual method for nonlinear convex cone programming
- Grouped Transformations and Regularization in High-Dimensional Explainable ANOVA Approximation
- Support vector machine with Dirichlet feature mapping
- New method for solving Ivanov regularization-based support vector machine learning
- Binary classification SVM-based algorithms with interval-valued training data using triangular and Epanechnikov kernels
- Riemannian optimization on unit sphere with \(p\)-norm and its applications
- Improving SVM classification on imbalanced datasets by introducing a new bias
- The Goldenshluger-Lepski method for constrained least-squares estimators over RKHSs
- Regularisation of neural networks by enforcing Lipschitz continuity
- Relaxation algorithms for matrix completion, with applications to seismic travel-time data interpolation
- On a regularization of unsupervised domain adaptation in RKHS
- Data-Driven Optimization: A Reproducing Kernel Hilbert Space Approach
Cites Work
- An improved analysis of the Rademacher data-dependent bound using its self bounding property
- Leave one out error, stability, and generalization of voting combinations of classifiers
- Support-vector networks
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Local Rademacher complexities
- Discrete Mathematics of Neural Networks
- The best constants in the Khintchine inequality
- Optimization of Lipschitz continuous functions
- Von Neumann stability analysis of Biot's general two-dimensional theory of consolidation
- The sample complexity of pattern classification with neural networks: the size of the weights is more important than the size of the network
- The Concave-Convex Procedure
- Chaos control using least-squares support vector machines
- Rademacher penalties and structural risk minimization
- Structural risk minimization over data-dependent hierarchies
- The importance of convexity in learning with squared loss
- On solving a linear program with one quadratic constraint
- doi:10.1162/153244302760200704
- Improvements to Platt's SMO Algorithm for SVM Classifier Design
- Asymptotic Behaviors of Support Vector Machines with Gaussian Kernel
- doi:10.1162/1532443041424337
- doi:10.1162/153244303321897690
- Agnostic Learning of Monomials by Halfspaces Is Hard
- L1-Regularization Path Algorithm for Generalized Linear Models
- Advanced Lectures on Machine Learning
- Branch-and-Bound Methods: A Survey
- Convexity, Classification, and Risk Bounds
- Theory of Reproducing Kernels
- Convex analysis and monotone operator theory in Hilbert spaces
- Logistic regression, AdaBoost and Bregman distances
- Convergence of a generalized SMO algorithm for SVM classifier design