Synchronization of spike-trains in a coupled system of digital spiking neurons

Author(s):  
Hiroaki Uchida ◽  
Toshimichi Saito
2015 ◽  
Vol 25 (07) ◽  
pp. 1540005


Author(s):
Ilya Prokin ◽  
Ivan Tyukin ◽  
Victor Kazantsev

The work investigates the influence of spike-timing-dependent plasticity (STDP) mechanisms on the dynamics of two synaptically coupled neurons driven by additive external noise. In this setting, the noise signal models synaptic inputs that the pair receives from other neurons in a larger network. We show that in the absence of STDP feedback the pair of neurons exhibits oscillations and intermittent synchronization. When the synapse connecting the neurons is supplied with a phase-selective feedback mechanism simulating STDP, the induced spike dynamics of the coupled system resemble a phase-locked mode, with time lags between spikes oscillating about a specific value. As we show by extensive numerical simulations, this value can be set arbitrarily within a broad interval by tuning the parameters of the STDP feedback.
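The qualitative set-up described above can be sketched in a few lines: two noise-driven leaky integrate-and-fire neurons coupled by synapses whose weights follow a pair-based STDP rule. This is a minimal illustration, not the authors' model; all parameter values (`tau`, `i_ext`, `a_plus`, etc.) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(steps=20000, dt=0.1, tau=10.0, v_th=1.0, i_ext=0.105,
             noise=0.02, w0=0.05, a_plus=0.002, a_minus=0.002,
             tau_stdp=5.0, w_max=0.5):
    v = np.zeros(2)                      # membrane potentials
    w = np.array([w0, w0])               # w[i]: weight of synapse i -> 1-i
    last_spike = np.array([-1e9, -1e9])  # last spike time of each neuron
    spikes = [[], []]
    for step in range(steps):
        t = step * dt
        drive = i_ext + noise * rng.standard_normal(2)  # additive noise input
        v += dt / tau * (-v + tau * drive)              # leaky integration
        for i in range(2):
            if v[i] >= v_th:
                j = 1 - i
                v[i] = 0.0                # reset after a spike
                v[j] += w[i]              # synaptic kick to the other neuron
                gap = t - last_spike[j]
                # pair-based STDP: the synapse onto the neuron that just
                # fired (j -> i) is potentiated, the reverse one depressed
                w[j] = min(w[j] + a_plus * np.exp(-gap / tau_stdp), w_max)
                w[i] = max(w[i] - a_minus * np.exp(-gap / tau_stdp), 0.0)
                last_spike[i] = t
                spikes[i].append(t)
    return spikes, w

spikes, w = simulate()
```

With these toy parameters both neurons fire repeatedly and the two weights drift in opposite directions, giving a crude analogue of the lag-selecting feedback discussed in the abstract.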


2005 ◽  
Vol 17 (11) ◽  
pp. 2337-2382 ◽  
Author(s):  
Robert Legenstein ◽  
Christian Naeger ◽  
Wolfgang Maass

Spiking neurons are very flexible computational modules, which, for different values of their adjustable synaptic parameters, can implement an enormous variety of different transformations F from input spike trains to output spike trains. We examine in this letter the question of to what extent a spiking neuron with biologically realistic models for dynamic synapses can be taught via spike-timing-dependent plasticity (STDP) to implement a given transformation F. We consider a supervised learning paradigm where during training, the output of the neuron is clamped to the target signal (teacher forcing). The well-known perceptron convergence theorem asserts the convergence of a simple supervised learning algorithm for drastically simplified neuron models (McCulloch-Pitts neurons). We show that in contrast to the perceptron convergence theorem, no theoretical guarantee can be given for the convergence of STDP with teacher forcing that holds for arbitrary input spike patterns. On the other hand, we prove that average-case versions of the perceptron convergence theorem hold for STDP in the case of uncorrelated and correlated Poisson input spike trains and simple models for spiking neurons. For a wide class of cross-correlation functions of the input spike trains, the resulting necessary and sufficient condition can be formulated in terms of linear separability, analogously to the well-known condition of learnability by perceptrons. However, the linear separability criterion has to be applied here to the columns of the correlation matrix of the Poisson input. We demonstrate through extensive computer simulations that the theoretically predicted convergence of STDP with teacher forcing also holds for more realistic models for neurons, dynamic synapses, and more general input distributions.
In addition, we show through computer simulations that these positive learning results hold not only for the common interpretation of STDP, where STDP changes the weights of synapses, but also for a more realistic interpretation suggested by experimental data where STDP modulates the initial release probability of dynamic synapses.
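The separability criterion above can be made concrete with a standard perceptron test applied, as the abstract states, to the columns of the input correlation matrix rather than to raw patterns. The matrix `C` and the labels below are hypothetical toy data, not taken from the paper.

```python
import numpy as np

def linearly_separable(points, labels, epochs=1000, lr=0.1):
    """Perceptron test: True iff a separating hyperplane is found."""
    X = np.hstack([points, np.ones((len(points), 1))])  # append bias column
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        errors = 0
        for x, y in zip(X, labels):          # labels y are in {-1, +1}
            if y * (w @ x) <= 0:             # misclassified -> update
                w += lr * y * x
                errors += 1
        if errors == 0:
            return True
    return False

# Hypothetical correlation matrix of four Poisson input channels
C = np.array([[1.0, 0.6, 0.1, 0.0],
              [0.6, 1.0, 0.1, 0.0],
              [0.1, 0.1, 1.0, 0.5],
              [0.0, 0.0, 0.5, 1.0]])
labels = np.array([1, 1, -1, -1])   # which inputs should drive output spikes

separable = linearly_separable(C.T, labels)  # test the COLUMNS of C
```

For this toy matrix the columns are separable, so the criterion would predict learnability; a non-separable labeling (e.g. an XOR-style one) makes the test return False.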


1993 ◽  
Vol 5 (1) ◽  
pp. 21-31 ◽  
Author(s):  
Leonid Kruglyak ◽  
William Bialek

We show that a simple statistical mechanics model can capture the collective behavior of large networks of spiking neurons. Qualitative arguments suggest that regularly firing neurons should be described by a planar "spin" of unit length. We extract these spins from spike trains and then measure the interaction Hamiltonian using simulations of small clusters of cells. Correlations among spike trains obtained from simulations of large arrays of cells are in quantitative agreement with the predictions from these Hamiltonians. We comment on the novel computational abilities of these "XY networks."
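The "planar spin of unit length" extracted from a regularly firing neuron can be illustrated directly: between consecutive spikes the phase advances linearly from 0 to 2π, and the spin is the unit vector (cos φ, sin φ). The function name and sampling scheme here are illustrative, not taken from the paper.

```python
import numpy as np

def spin_from_spikes(spike_times, t):
    """Unit-length planar spin of a spike train, evaluated at times t.

    The phase interpolates linearly between the bracketing spikes, so the
    spin completes one full rotation per interspike interval."""
    spike_times = np.asarray(spike_times)
    k = np.searchsorted(spike_times, t, side="right") - 1  # last spike <= t
    phase = 2 * np.pi * (t - spike_times[k]) / (spike_times[k + 1] - spike_times[k])
    return np.cos(phase), np.sin(phase)

spikes = np.arange(0.0, 10.0, 1.0)   # perfectly regular train, period 1
x, y = spin_from_spikes(spikes, np.array([2.25, 2.5]))
```

A quarter of the way through an interval the spin points along +y; halfway through it points along -x, and its length is 1 everywhere by construction.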


2009 ◽  
Vol 21 (11) ◽  
pp. 2991-3009 ◽  
Author(s):  
Lucas C. Parra ◽  
Jeffrey M. Beck ◽  
Anthony J. Bell

A feedforward spiking network represents a nonlinear transformation that maps a set of input spikes to a set of output spikes. This mapping transforms the joint probability distribution of incoming spikes into a joint distribution of output spikes. We present an algorithm for synaptic adaptation that aims to maximize the entropy of this output distribution, thereby creating a model for the joint distribution of the incoming point processes. The learning rule that is derived depends on the precise pre- and postsynaptic spike timings. When trained on correlated spike trains, the network learns to extract independent spike trains, thereby uncovering the underlying statistical structure and creating a more efficient representation of the incoming spike trains.
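The paper's learning rule depends on precise spike timings, which a short sketch cannot reproduce; as a rough rate-based analogue, the classic infomax/natural-gradient ICA update (Bell & Sejnowski) likewise maximizes output entropy to unmix correlated inputs. The sources, mixing matrix, and learning-rate schedule below are hypothetical toy choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n, T = 2, 5000
S = rng.laplace(size=(n, T))                 # independent super-Gaussian "rates"
A = np.array([[1.0, 0.6], [0.4, 1.0]])       # hypothetical mixing matrix
X = A @ S                                    # correlated input signals

W = np.eye(n)
lr = 0.05
for _ in range(300):
    Y = W @ X
    g = np.tanh(Y)                                 # score for super-Gaussian sources
    W += lr * (np.eye(n) - (g @ Y.T) / T) @ W      # natural-gradient infomax step

Y = W @ X   # unmixed outputs: approximately independent
```

After training, the outputs are far less correlated than the mixtures, mirroring the abstract's claim that entropy maximization extracts independent trains from correlated inputs.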


10.29007/2k64 ◽  
2018 ◽  
Author(s):  
Pat Prodanovic ◽  
Cedric Goeury ◽  
Fabrice Zaoui ◽  
Riadh Ata ◽  
Jacques Fontaine ◽  
...  

This paper presents a practical methodology developed for shape optimization studies of hydraulic structures using environmental numerical modelling codes. The methodology starts by defining the optimization problem and identifying relevant problem constraints. The design variables in shape optimization studies are the configurations of structures (such as the length or spacing of groins, or the orientation and layout of breakwaters) whose optimal configuration is not known a priori. The optimization problem is solved numerically by coupling an optimization algorithm to a numerical model. The coupled system is able to define, test and evaluate a multitude of new shapes, which are internally generated and then simulated using the numerical model. The developed methodology is tested using an example of an optimum design of a fish passage, where the design variables are the length and the position of slots. In this paper an objective function is defined by specifying a target and asking the numerical optimizer to retrieve the target solution. Such a definition of the objective function is used to validate the developed tool chain. This work uses the numerical model TELEMAC-2D from the TELEMAC-MASCARET suite of numerical solvers for the solution of the shallow water equations, coupled with various numerical optimization algorithms available in the literature.
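The target-retrieval validation described above can be sketched with a stand-in "model": a known design produces a target output, and a derivative-free optimizer must recover it by repeatedly calling the model. The response surface, design variables, and pattern-search routine are all illustrative placeholders (the paper couples real optimizers to TELEMAC-2D).

```python
import numpy as np

def model(design):
    """Stand-in for a hydraulic simulation: maps the design variables
    (slot length, slot position) to a vector of flow observables.
    This affine response is purely illustrative."""
    length, position = design
    return np.array([1.8 * length + 0.5 * position,
                     0.3 * length + 1.2 * position])

target_design = np.array([1.2, 0.8])      # known design used to set the target
target_output = model(target_design)

def objective(design):
    """Squared distance between the model output and the target."""
    return float(np.sum((model(design) - target_output) ** 2))

# Minimal derivative-free pattern search standing in for the coupled optimizer:
# try +/- step along each design variable, halve the step when nothing improves.
x = np.array([0.5, 0.5])
step = 0.5
for _ in range(10000):
    if step < 1e-7:
        break
    improved = False
    for d in range(2):
        for s in (step, -step):
            trial = x.copy()
            trial[d] += s
            if objective(trial) < objective(x):
                x, improved = trial, True
    if not improved:
        step *= 0.5
```

Because the target was generated from a known design, success is easy to check: the optimizer should drive the objective to (numerically) zero and recover the original design, which is exactly how such a tool chain is validated.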

