Neurodynamical classifiers with low model complexity

From MaRDI portal
Publication:2057774

DOI: 10.1016/J.NEUNET.2020.08.013
zbMATH Open: 1475.68290
arXiv: 1503.03148
OpenAlex: W1859904220
Wikidata: Q100431610
Scholia: Q100431610
MaRDI QID: Q2057774
FDO: Q2057774


Authors: Himanshu Pant, Sumit Soman, Amit Bhaya, Basabi Bhaumik, Jayadeva


Publication date: 7 December 2021

Published in: Neural Networks

Abstract: The recently proposed Minimal Complexity Machine (MCM) finds a hyperplane classifier by minimizing an exact bound on the Vapnik-Chervonenkis (VC) dimension. The VC dimension measures the capacity of a learning machine, and a smaller VC dimension leads to improved generalization. On many benchmark datasets, the MCM generalizes better than SVMs and uses far fewer support vectors than SVMs do. In this paper, we describe a neural network, based on a linear dynamical system, that converges to the MCM solution. The proposed MCM dynamical system is conducive to an analogue circuit implementation on a chip, or to simulation using Ordinary Differential Equation (ODE) solvers. Numerical experiments on benchmark datasets from the UCI repository show that the proposed approach is scalable and accurate, as we obtain improved accuracies and fewer support vectors (up to a 74.3% reduction) with the MCM dynamical system.
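The abstract notes that the classifier is trained by letting a dynamical system evolve to a fixed point, which can be simulated with a standard ODE solver. As a minimal sketch of that general idea only, and not the authors' actual MCM system, the example below trains a linear classifier on toy data by integrating a gradient-flow ODE dw/dt = -∇L(w) with SciPy's `solve_ivp`; the squared-hinge surrogate loss and all data are illustrative assumptions.

```python
# Hedged sketch, NOT the paper's exact MCM dynamical system: a generic
# illustration of training a linear classifier by integrating a
# gradient-flow ODE, dw/dt = -grad L(w), with a standard ODE solver.
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)
# Toy 2-D two-class data with labels y in {-1, +1} (illustrative only)
X = np.vstack([rng.normal(+1.5, 1.0, (50, 2)),
               rng.normal(-1.5, 1.0, (50, 2))])
y = np.concatenate([np.ones(50), -np.ones(50)])
Xb = np.hstack([X, np.ones((100, 1))])   # append a bias column

def flow(t, w):
    # Negative gradient of a smooth squared-hinge surrogate loss:
    #   L(w) = 0.5 * mean(max(0, 1 - y * (Xb @ w))**2)
    viol = np.maximum(0.0, 1.0 - y * (Xb @ w))
    return Xb.T @ (y * viol) / len(y)

# Integrate the flow from w = 0 until it settles near a minimizer
sol = solve_ivp(flow, t_span=(0.0, 50.0), y0=np.zeros(3))
w = sol.y[:, -1]
accuracy = np.mean(np.sign(Xb @ w) == y)
print(f"training accuracy: {accuracy:.2f}")
```

The actual MCM solves a different (linear-programming) objective whose exact bound on the VC dimension is the point of the paper; this sketch only shows the ODE-solver simulation pattern the abstract refers to.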


Full work available at URL: https://arxiv.org/abs/1503.03148










Cited In (4)






This page was built for publication: Neurodynamical classifiers with low model complexity
