A new convex objective function for the supervised learning of single-layer neural networks
From MaRDI portal
Publication:962791
DOI: 10.1016/j.patcog.2009.11.024 · zbMath: 1191.68499 · Wikidata: Q58036772 · Scholia: Q58036772 · MaRDI QID: Q962791
Beatriz Pérez-Sánchez, Amparo Alonso-Betanzos, Oscar Fontenla-Romero, Bertha Guijarro-Berdiñas
Publication date: 7 April 2010
Published in: Pattern Recognition
Full work available at URL: https://doi.org/10.1016/j.patcog.2009.11.024
Keywords: convex optimization; least squares; incremental learning; global optimum; single-layer neural networks; supervised learning method
68T05: Learning and adaptive systems in artificial intelligence
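The keywords above describe the general technique the paper builds on: with a linear output layer, supervised training of a single-layer network reduces to a least-squares problem, which is convex and therefore has a closed-form global optimum. The following sketch illustrates that general idea only; it is not the paper's specific objective function, and all names in it are illustrative.

```python
import numpy as np

def train_single_layer(X, Y):
    """Fit weights minimizing ||[X, 1] W - Y||^2.

    The objective is convex in W, so np.linalg.lstsq returns
    its global optimum in closed form (no iterative descent).
    """
    X_b = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias column
    W, *_ = np.linalg.lstsq(X_b, Y, rcond=None)
    return W

def predict(W, X):
    X_b = np.hstack([X, np.ones((X.shape[0], 1))])
    return X_b @ W

# Synthetic data with an exactly linear target, so least squares
# recovers the generating weights and bias exactly.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_W = np.array([[1.0], [-2.0], [0.5]])
Y = X @ true_W + 3.0
W = train_single_layer(X, Y)
print(np.allclose(predict(W, X), Y))  # True
```

Because the problem is convex, the solution is unique (for full-rank inputs) and independent of initialization, which is the property the paper's title emphasizes in contrast to gradient-based training of nonlinear networks.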
Related Items
- Nonlinear single layer neural network training algorithm for incremental, nonstationary and distributed learning scenarios
- Asymptotic sequential Rademacher complexity of a finite function class
- Ensemble decision forest of RBF networks via hybrid feature clustering approach for high-dimensional data classification
- A joint optimization of momentum item and Levenberg-Marquardt algorithm to level up the BPNN's generalization ability
- Sequential complexities and uniform martingale laws of large numbers
- Generalization bounds for non-stationary mixing processes
Cites Work
- Complexity of Solving Linear Systems in Different Models of Computation
- Fast recursive algorithms for a class of linear equations
- A Global Optimum Approach for One-Layer Neural Networks
- Multicategory proximal support vector machine classifiers
- Relative loss bounds for multidimensional regression problems