Directed drift: A new linear threshold algorithm for learning binary weights on-line (Q2366277)

Full work available at URL: https://doi.org/10.1016/0022-0000(93)90003-f
OpenAlex ID: W2051792956


scientific article (English)
Directed drift: A new linear threshold algorithm for learning binary weights on-line

    Statements

    Directed drift: A new linear threshold algorithm for learning binary weights on-line (English)
    29 June 1993
    Learning real weights for a McCulloch-Pitts neuron is equivalent to linear programming and can hence be done in polynomial time. Efficient local learning algorithms such as Perceptron Learning further guarantee convergence in finite time. The problem becomes considerably harder, however, when it is sought to learn binary weights; this is equivalent to integer programming, which is known to be NP-complete. A family of probabilistic algorithms which learn binary weights for a McCulloch-Pitts neuron with inputs constrained to be binary is proposed here, the target functions being majority functions of a set of literals. These algorithms have low computational demands and are essentially local in character. Rapid (average-case) quadratic rates of convergence for the algorithm are predicted analytically and confirmed through computer simulations when the number of examples is within capacity. It is also shown that, for the functions under consideration, Perceptron Learning converges rapidly (but, in general, to a nonbinary solution weight vector).
    Keywords: McCulloch-Pitts neuron; linear programming; probabilistic algorithms; binary weights
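The abstract's idea of a local, probabilistic update on binary weights can be illustrated with a short sketch. This is a hedged reconstruction, not the paper's exact procedure: it assumes inputs and weights in {-1, +1} and, on each misclassified example, flips one randomly chosen weight whose flip drifts the weighted sum toward the correct side of the threshold.

```python
import random

def directed_drift_step(w, x, y):
    """One illustrative directed-drift-style update (assumed form, not the
    paper's exact rule) for a McCulloch-Pitts neuron with binary weights
    w in {-1,+1}^n, binary inputs x in {-1,+1}^n, and label y in {0,1}."""
    s = sum(wi * xi for wi, xi in zip(w, x))
    pred = 1 if s >= 0 else 0
    if pred == y:
        return w  # correctly classified: purely local rule makes no change
    # A weight with w_i * x_i < 0 increases the sum by 2 when flipped;
    # one with w_i * x_i > 0 decreases it by 2. Pick a random candidate
    # whose flip moves the sum toward the correct classification.
    want_increase = (y == 1)
    candidates = [i for i, (wi, xi) in enumerate(zip(w, x))
                  if (wi * xi < 0) == want_increase]
    if candidates:
        i = random.choice(candidates)
        w = list(w)
        w[i] = -w[i]  # flip a single binary weight
    return w
```

The update is local and cheap, as the abstract emphasizes: each step inspects only the sign of the weighted sum and flips at most one weight, keeping the weight vector binary throughout.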