A new convex objective function for the supervised learning of single-layer neural networks
Publication: 962791
DOI: 10.1016/j.patcog.2009.11.024
zbMath: 1191.68499
OpenAlex: W2046251965
Wikidata: Q58036772 (Scholia: Q58036772)
MaRDI QID: Q962791
Beatriz Pérez-Sánchez, Amparo Alonso-Betanzos, Oscar Fontenla-Romero, Bertha Guijarro-Berdiñas
Publication date: 7 April 2010
Published in: Pattern Recognition
Full work available at URL: https://doi.org/10.1016/j.patcog.2009.11.024
Keywords: convex optimization; least squares; incremental learning; global optimum; single-layer neural networks; supervised learning method
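This record lists only metadata and does not reproduce the paper's method. As a rough, hypothetical sketch of the general idea suggested by the keywords (a convex least-squares objective with a global optimum for single-layer networks), the snippet below poses single-layer training as ordinary linear least squares by measuring the error before the output nonlinearity, which yields a convex problem with a unique minimizer. The function names, the logistic activation, and the target transformation are illustrative assumptions, not details taken from the publication.

```python
import numpy as np

# Illustrative sketch only (not the paper's exact formulation): training a
# single-layer network y = f(W x + b) with an invertible activation f (here a
# logistic sigmoid). Measuring the error on f^{-1}(d) instead of d turns
# training into an ordinary linear least-squares problem, which is convex and
# has a unique global optimum.

def train_single_layer(X, D, eps=1e-6):
    """X: (n_samples, n_inputs); D: (n_samples, n_outputs) targets in (0, 1)."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])        # append bias column
    Dc = np.clip(D, eps, 1 - eps)
    Z = np.log(Dc / (1 - Dc))                            # logit of the targets
    Wb, *_ = np.linalg.lstsq(Xb, Z, rcond=None)          # convex least-squares solve
    return Wb                                            # shape (n_inputs + 1, n_outputs)

def predict(Wb, X):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return 1.0 / (1.0 + np.exp(-(Xb @ Wb)))              # sigmoid output

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    true_W = np.array([[1.0], [-2.0], [0.5]])
    D = 1.0 / (1.0 + np.exp(-(X @ true_W + 0.3)))        # synthetic targets
    Wb = train_single_layer(X, D)
    print("max abs prediction error:", np.abs(predict(Wb, X) - D).max())
```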
Related Items (6)
- Ensemble decision forest of RBF networks via hybrid feature clustering approach for high-dimensional data classification
- Generalization bounds for non-stationary mixing processes
- Nonlinear single layer neural network training algorithm for incremental, nonstationary and distributed learning scenarios
- A joint optimization of momentum item and Levenberg-Marquardt algorithm to level up the BPNN's generalization ability
- Asymptotic sequential Rademacher complexity of a finite function class
- Sequential complexities and uniform martingale laws of large numbers
Uses Software
Cites Work
- Complexity of Solving Linear Systems in Different Models of Computation
- Fast recursive algorithms for a class of linear equations
- A Global Optimum Approach for One-Layer Neural Networks
- Multicategory proximal support vector machine classifiers
- Relative loss bounds for multidimensional regression problems