Robustness of learning algorithms using hinge loss with outlier indicators
From MaRDI portal
Publication: 2292231
DOI: 10.1016/j.neunet.2017.07.005
zbMath: 1429.68220
OpenAlex: W2739299529
Wikidata: Q38629980 · Scholia: Q38629980
MaRDI QID: Q2292231
Shuhei Fujiwara, Takafumi Kanamori, Akiko Takeda
Publication date: 3 February 2020
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2017.07.005
Cites Work
- On consistency and robustness properties of support vector machines for heavy-tailed distributions
- Convex analysis and nonlinear optimization: theory and examples
- A decision-theoretic generalization of on-line learning and an application to boosting
- Convex analysis approach to d. c. programming: Theory, algorithms and applications
- The DC (Difference of convex functions) programming and DCA revisited with DC models of real world nonconvex optimization problems
- Support vector machines are universally consistent
- Statistical behavior and consistency of classification methods based on convex risk minimization
- Optimal aggregation of classifiers in statistical learning
- Support-vector networks
- Random classification noise defeats all convex potential boosters
- An evolutionary algorithm for robust regression
- Estimating the Support of a High-Dimensional Distribution
- A Proof of Convergence of the Concave-Convex Procedure Using Zangwill's Theory
- Support Vector Machines
- Consistency of Support Vector Machines and Other Regularized Kernel Classifiers
- Robust Truncated Hinge Loss Support Vector Machines
- On ψ-Learning
- Robust Statistics
- Robust Estimation of a Location Parameter
- Convexity, Classification, and Risk Bounds
- Multicategory ψ-Learning
- Robust Statistics