A biological gradient descent for prediction through a combination of STDP and homeostatic plasticity
DOI: 10.1162/NECO_A_00512
zbMATH Open: 1415.92017
arXiv: 1206.4812
Wikidata: Q44739266 (Scholia: Q44739266)
MaRDI QID: Q5378278
FDO: Q5378278
Authors: Gilles Wainrib, Mathieu N. Galtier
Publication date: 12 June 2019
Published in: Neural Computation
Full work available at URL: https://arxiv.org/abs/1206.4812
Recommendations
- Evaluating the extent to which homeostatic plasticity learns to compute prediction errors in unstructured neuronal networks
- Adaptive Synchronization of Activities in a Recurrent Network
- Spike-timing-dependent Hebbian plasticity as temporal difference learning
- Predictive learning in rate-coded neuronal networks: a theoretical approach towards classical conditioning
- Fading memory and time series prediction in recurrent networks with different forms of plasticity
Classification
- Learning and adaptive systems in artificial intelligence (68T05)
- Neural networks for/in biological studies, artificial life and related topics (92B20)
Cites Work
- Nonlinear systems
- Averaging methods in nonlinear dynamical systems
- Spiking Neuron Models
- Bayesian Spiking Neurons I: Inference
- Relating STDP to BCM
- A simplified neuron model as a principal component analyzer
- Spike-timing-dependent Hebbian plasticity as temporal difference learning
- Recognizing recurrent neural networks (rRNN): Bayesian inference for recurrent neural networks
- Inverse Problems in Neural Field Theory
- Reservoir computing approaches to recurrent neural network training
- Why spikes? Hebbian learning and retrieval of time-resolved excitation patterns
- Fading memory and time series prediction in recurrent networks with different forms of plasticity
- Multiscale analysis of slow-fast neuronal learning models with noise
Cited In (7)
- A local echo state property through the largest Lyapunov exponent
- Dynamic branching in a neural network model for probabilistic prediction of sequences
- Distributed synaptic weights in a LIF neural network and learning rules
- Relative entropy minimizing noisy non-linear neural network to approximate stochastic processes
- Evaluating the extent to which homeostatic plasticity learns to compute prediction errors in unstructured neuronal networks
- Slow feature analysis with spiking neurons and its application to audio stimuli
- Fading memory and time series prediction in recurrent networks with different forms of plasticity