Fast linear adaptive skipping training algorithm for training artificial neural network (Q459793)
From MaRDI portal
scientific article
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Fast linear adaptive skipping training algorithm for training artificial neural network | scientific article | |
Statements
Fast linear adaptive skipping training algorithm for training artificial neural network (English)
13 October 2014
Summary: Artificial neural networks are widely used models for pattern recognition tasks. However, training a complex neural network on a very large data set requires excessive training time. In this paper, a new fast Linear Adaptive Skipping Training (LAST) algorithm for training artificial neural networks (ANNs) is introduced. The core idea is to improve the training speed of an ANN by presenting, in each epoch, only the input samples that were not classified correctly in the previous epoch, which dynamically reduces the number of input samples presented to the network per epoch without affecting the network's accuracy. Shrinking the effective training set in this way reduces the training time and thereby improves the training speed. The LAST algorithm also determines how many epochs a given input sample should be skipped, depending on how consistently that sample is classified correctly. LAST can be incorporated into any supervised training algorithm. Experimental results show that the training speed attained by the LAST algorithm is considerably higher than that of conventional training algorithms.
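The skipping scheme described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names (`last_train`, `predict`, `update`) and the linear skip schedule (each consecutive correct classification of a sample adds `skip_step` more skipped epochs) are assumptions made for the example.

```python
def last_train(X, y, predict, update, epochs=10, skip_step=1):
    """Sketch of adaptive skipping training (hypothetical interface).

    X, y     -- training samples and their target labels
    predict  -- callable returning the network's class for one sample
    update   -- callable applying one supervised weight update
    Returns the total number of sample presentations performed.
    """
    n = len(X)
    skip_until = [0] * n  # epoch before which sample i is not presented
    streak = [0] * n      # consecutive correct classifications of sample i
    presented = 0
    for epoch in range(epochs):
        for i in range(n):
            if epoch < skip_until[i]:
                continue  # sample is skipped this epoch
            presented += 1
            if predict(X[i]) == y[i]:
                # Assumed linear schedule: the longer the correct streak,
                # the more upcoming epochs this sample sits out.
                streak[i] += 1
                skip_until[i] = epoch + 1 + streak[i] * skip_step
            else:
                # Misclassified: reset the streak and train on it again.
                streak[i] = 0
                update(X[i], y[i])
    return presented
```

For a single sample that is always classified correctly over 10 epochs, this schedule presents it only in epochs 0, 2, 5, and 9, so 4 presentations replace 10; the per-epoch cost shrinks while every sample is still revisited periodically.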