Exact Event-Driven Implementation for Recurrent Networks of Stochastic Perfect Integrate-and-Fire Neurons

2012 ◽  
Vol 24 (12) ◽  
pp. 3145-3180 ◽  
Author(s):  
Thibaud Taillefumier ◽  
Jonathan Touboul ◽  
Marcelo Magnasco

In vivo cortical recording reveals that indirectly driven neural assemblies can produce reliable and temporally precise spiking patterns in response to stereotyped stimulation. This suggests that despite being fundamentally noisy, the collective activity of neurons conveys information through temporal coding. Stochastic integrate-and-fire models delineate a natural theoretical framework to study the interplay of intrinsic neural noise and spike timing precision. However, there are inherent difficulties in simulating their networks' dynamics in silico with standard numerical discretization schemes. Indeed, the well-posedness of the evolution of such networks requires temporally ordering every neuronal interaction, whereas the order of interactions is highly sensitive to the random variability of spiking times. Here, we address these issues for perfect stochastic integrate-and-fire neurons by designing an exact event-driven algorithm for the simulation of recurrent networks with delayed Dirac-like interactions. In addition to being exact from the mathematical standpoint, our proposed method is highly efficient numerically. We envision that our algorithm is especially well suited to studying the emergence of polychronized motifs in networks evolving under spike-timing-dependent plasticity with intrinsic noise.
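As a rough illustration of the bookkeeping such an event-driven scheme requires, the sketch below simulates perfect (non-leaky) stochastic integrate-and-fire neurons with delayed Dirac-like synapses using a priority queue of provisional spike times. First-passage times of the drifted Wiener process are inverse-Gaussian distributed and are simply redrawn after every synaptic jump from a drift-only estimate of the potential; the exact conditional resampling that makes the published algorithm mathematically exact is not reproduced here, and all parameters (mu, sigma, theta, delay, weights) are illustrative.

```python
import heapq
import numpy as np

rng = np.random.default_rng(0)

def sample_fpt(gap, mu, sigma):
    # First passage of a Wiener process with drift mu (> 0) and noise sigma to a
    # barrier `gap` above the current potential is inverse-Gaussian (Wald)
    # distributed with mean gap/mu and shape (gap/sigma)**2.
    return rng.wald(gap / mu, (gap / sigma) ** 2)

def run_network(weights, delay, mu, sigma, theta, t_end):
    """Event-driven loop for perfect stochastic IF neurons with delayed
    Dirac-like synapses. Hypothetical sketch: provisional spike times are
    redrawn after every synaptic jump from a drift-only potential estimate,
    NOT the exact conditional resampling developed in the paper."""
    n = len(weights)
    v = np.zeros(n)                    # drift-only estimate of each potential
    last = np.zeros(n)                 # time of each neuron's last update
    version = np.zeros(n, dtype=int)   # stamps that invalidate stale events
    heap = [(sample_fpt(theta, mu, sigma), 'spike', i, 0) for i in range(n)]
    heapq.heapify(heap)
    spikes = []
    while heap and heap[0][0] <= t_end:
        t, kind, i, tag = heapq.heappop(heap)
        if kind == 'spike':
            if tag != version[i]:
                continue               # provisional spike was invalidated
            spikes.append((t, i))
            v[i], last[i] = 0.0, t     # reset, then reschedule the spiker
            version[i] += 1
            heapq.heappush(heap, (t + sample_fpt(theta, mu, sigma),
                                  'spike', i, version[i]))
            for j, w in enumerate(weights[i]):
                if w:                  # queue the delayed Dirac-like interaction
                    heapq.heappush(heap, (t + delay, 'syn', j, w))
        else:                          # synaptic jump of size `tag` on neuron i
            v[i] += mu * (t - last[i]) + tag
            last[i] = t
            version[i] += 1            # old provisional spike is now stale
            gap = max(theta - v[i], 1e-9)
            heapq.heappush(heap, (t + sample_fpt(gap, mu, sigma),
                                  'spike', i, version[i]))
    return spikes
```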

2013 ◽  
Vol 110 (7) ◽  
pp. 1672-1688 ◽  
Author(s):  
Bertrand Fontaine ◽  
Victor Benichoux ◽  
Philip X. Joris ◽  
Romain Brette

A challenge for sensory systems is to encode natural signals that vary in amplitude by orders of magnitude. The spike trains of neurons in the auditory system must represent the fine temporal structure of sounds despite a tremendous variation in sound level in natural environments. It has been shown in vitro that the transformation from dynamic signals into precise spike trains can be accurately captured by simple integrate-and-fire models. In this work, we show that the in vivo responses of cochlear nucleus bushy cells to sounds across a wide range of levels can be precisely predicted by deterministic integrate-and-fire models with adaptive spike threshold. Our model can predict both the spike timings and the firing rate in response to novel sounds, across a large input level range. A noisy version of the model accounts for the statistical structure of spike trains, including the reliability and temporal precision of responses. Spike threshold adaptation was critical to ensure that predictions remain accurate at different levels. These results confirm that simple integrate-and-fire models provide an accurate phenomenological account of spike train statistics and emphasize the functional relevance of spike threshold adaptation.
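For orientation, a minimal deterministic integrate-and-fire neuron with an adaptive spike threshold of the general kind described here can be sketched as below; the time constants, threshold jump, and reset rule are illustrative assumptions, not the parameters fitted to the bushy-cell recordings.

```python
import numpy as np

def adaptive_threshold_if(i_syn, dt=1e-4, tau_m=5e-3, tau_th=10e-3,
                          v_rest=0.0, theta0=1.0, alpha=0.5):
    """Deterministic integrate-and-fire neuron with an adaptive spike
    threshold. All parameter names and values are illustrative assumptions,
    not the values fitted in the paper."""
    v, theta = v_rest, theta0
    spike_times = []
    for k, i_k in enumerate(np.asarray(i_syn)):
        v += dt / tau_m * (v_rest - v + i_k)     # leaky integration of the input
        theta += dt / tau_th * (theta0 - theta)  # threshold relaxes to baseline
        if v >= theta:
            spike_times.append(k * dt)
            v = v_rest                           # reset the membrane potential
            theta += alpha                       # threshold jumps after a spike
    return spike_times
```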


2020 ◽  
Author(s):  
Quinton M. Skilling ◽  
Brittany C. Clawson ◽  
Bolaji Eniwaye ◽  
James Shaver ◽  
Nicolette Ognjanovski ◽  
...  

Summary: Sleep plays a critical role in memory consolidation, although the exact mechanisms mediating this process are unknown. Combining computational and in vivo experimental approaches, we test the hypothesis that reduced cholinergic input to the hippocampus during non-rapid eye movement (NREM) sleep generates stable spike timing relationships between neurons. We find that the order of firing among neurons during a period of NREM sleep reflects their relative firing rates during prior wake and changes as a function of prior learning. We show that learning-dependent pattern formation (e.g., "replay") in the hippocampus during NREM, together with spike timing-dependent plasticity (STDP), restructures network activity in a manner similar to that observed in brain circuits across periods of sleep. This suggests that sleep actively promotes memory consolidation by switching the network from rate-based to firing phase-based information encoding.
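For reference, a minimal pair-based STDP weight update of the generic kind invoked here looks like the sketch below; the learning rates and trace time constant are illustrative, and the study's full network model is not reproduced.

```python
def pair_stdp(pre, post, dt=1e-3, tau=20e-3,
              a_plus=0.01, a_minus=0.012, w0=0.5):
    """Pair-based STDP weight update driven by two binary spike trains
    (0/1 per time bin). Learning rates and the trace time constant are
    illustrative, not the study's network parameters."""
    x_pre = x_post = 0.0           # exponentially decaying spike traces
    w = w0
    for p, q in zip(pre, post):
        x_pre -= dt / tau * x_pre
        x_post -= dt / tau * x_post
        if p:                      # pre spike paired with recent post -> LTD
            w -= a_minus * x_post
            x_pre += 1.0
        if q:                      # post spike paired with recent pre -> LTP
            w += a_plus * x_pre
            x_post += 1.0
        w = min(max(w, 0.0), 1.0)  # keep the weight in [0, 1]
    return w
```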


2018 ◽  
Author(s):  
Safura Rashid Shomali ◽  
Majid Nili Ahmadabadi ◽  
Seyyed Nader Rasuli ◽  
Hideaki Shimazaki

Summary: An appealing challenge in neuroscience is to identify network architecture from neural activity. A key requirement is knowledge of the statistical input-output relation of single neurons in vivo. Using a recent exact solution of spike timing for leaky integrate-and-fire neurons under noisy inputs balanced near threshold, we construct a unified framework that links synaptic inputs, spiking nonlinearity, and network architecture with the statistics of population activity. The framework predicts structured higher-order interactions of neurons receiving common inputs under different architectures: it unveils two network motifs behind the sparse activity reported in visual neurons. Comparing the model's predictions with monkey V1 neurons, we found that excitatory inputs to pairs explain the sparse activity characterized by negative triple-wise interactions, ruling out shared inhibition. While the model predicts variation in the structured activity according to local circuitry, we show that strong negative interactions are, in general, a signature of excitatory inputs to neuron pairs whenever background activity is sparse.
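The "negative triple-wise interactions" referred to here are the third-order terms of a log-linear model of binary population activity. A small sketch of how that quantity can be estimated from three binarized spike trains is given below; the regularization constant is an illustrative assumption.

```python
import numpy as np

def triple_wise_interaction(x1, x2, x3, eps=1e-12):
    """Empirical third-order (triple-wise) interaction of three binarized
    spike trains (integer 0/1 arrays, one entry per time bin) in the
    log-linear sense. A negative value corresponds to the 'sparser than
    pairwise' structure discussed in the abstract; `eps` is an illustrative
    regularizer for empty pattern counts."""
    words = np.stack([x1, x2, x3], axis=1).astype(int) @ np.array([4, 2, 1])
    p = np.bincount(words, minlength=8) / len(words) + eps
    p000, p001, p010, p011, p100, p101, p110, p111 = p
    return float(np.log((p111 * p100 * p010 * p001) /
                        (p110 * p101 * p011 * p000)))
```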


2021 ◽  
Author(s):  
Yuri Elias Rodrigues ◽  
Cezar M. Tigaret ◽  
Hélène Marie ◽  
Cian O'Donnell ◽  
Romain Veltz

Synaptic plasticity rules used in current computational models of learning are generally insensitive to physiological factors such as spine voltage, animal age, extracellular fluid composition, and body temperature, limiting their predictive power. Here, we built a biophysically detailed synapse model that includes electrical dynamics and calcium-dependent signaling via CaMKII and calcineurin (CaN) activity. The model combined variables spanning multiple timescales, from milliseconds to minutes, with intrinsic noise from stochastic ion channel gating. Analysis of the trajectories of joint CaMKII and CaN activities yielded an interpretable geometrical readout that fitted the synaptic plasticity outcomes of nine published ex vivo experiments covering various spike-timing and frequency-dependent plasticity induction protocols, animal ages, and experimental conditions. Using this new approach, we then generated maps predicting plasticity outcomes across the space of these stimulation conditions. Finally, we tested the model's robustness to in vivo-like spike time irregularity, showing that it significantly alters plasticity outcomes.
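As a toy illustration of a region-based ("geometrical") readout of joint CaMKII and CaN trajectories, the sketch below accumulates dwell times in hypothetical LTP and LTD regions of the plane; the region boundaries and the linear mapping to a weight change are assumptions for illustration, not the published rule.

```python
import numpy as np

def geometric_readout(camkii, can, dt, ltp_region, ltd_region, rate=0.1):
    """Toy region-based readout: accumulate the time the joint (CaMKII, CaN)
    trajectory spends in a hypothetical LTP region (high CaMKII, moderate CaN)
    and LTD region (low CaMKII, high CaN), and let the sign of the weight
    change follow the difference. Region boundaries and the linear mapping are
    illustrative assumptions, not the published rule."""
    camkii, can = np.asarray(camkii), np.asarray(can)
    t_ltp = dt * np.sum((camkii > ltp_region[0]) & (can < ltp_region[1]))
    t_ltd = dt * np.sum((camkii < ltd_region[0]) & (can > ltd_region[1]))
    return rate * (t_ltp - t_ltd)   # > 0 suggests potentiation, < 0 depression
```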


2004 ◽  
Vol 14 (06) ◽  
pp. 2061-2068 ◽  
Author(s):  
J. M. CASADO ◽  
J. P. BALTANÁS

Recently, it has been shown experimentally [Mainen & Sejnowski, 1995] that, in contrast to the lack of precision in spike timing associated with flat (dc) stimuli, neocortical neurons of rats respond reliably to weak input fluctuations resembling synaptic activity. This has led the authors to suggest that, in spite of the high variability of interspike intervals found in cortical activity, the mechanism of spike generation in neocortical neurons has a low level of intrinsic noise. In this work, we approach the problem of spike timing by using the well-known FitzHugh–Nagumo (FHN) model of neuronal dynamics and find that here also, fluctuating stimuli allow more reliable temporal coding than constant suprathreshold signals. This result is associated with the characteristics of a phenomenological stochastic bifurcation taking place in the noisy FHN model.
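A minimal stochastic FitzHugh–Nagumo simulation of the kind used to probe spike-timing reliability might look like the sketch below, where the same frozen fluctuating stimulus can be replayed across trials while the intrinsic noise realization varies; the parameters and the threshold-crossing spike criterion are illustrative assumptions.

```python
import numpy as np

def fhn_trial(i_ext, noise, dt=0.01, a=0.7, b=0.8, tau=12.5, seed=0):
    """Euler–Maruyama integration of the FitzHugh–Nagumo model with intrinsic
    noise on the voltage variable (illustrative parameters). Replaying the
    same frozen fluctuating `i_ext` across trials while varying `seed` probes
    spike-timing reliability in the spirit of Mainen & Sejnowski (1995)."""
    rng = np.random.default_rng(seed)
    v, w = -1.0, -0.5
    spike_times, above = [], False
    for k, i_k in enumerate(np.asarray(i_ext)):
        dv = v - v ** 3 / 3 - w + i_k
        dw = (v + a - b * w) / tau
        v += dt * dv + noise * np.sqrt(dt) * rng.standard_normal()
        w += dt * dw
        if v > 1.0 and not above:          # upward threshold crossing = spike
            spike_times.append(k * dt)
        above = v > 1.0
    return spike_times
```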


2013 ◽  
Vol 110 (7) ◽  
pp. 1631-1645 ◽  
Author(s):  
R. C. Evans ◽  
Y. M. Maniar ◽  
K. T. Blackwell

The striatum of the basal ganglia demonstrates distinctive upstate and downstate membrane potential oscillations during slow-wave sleep and under anesthetic. The upstates generate calcium transients in the dendrites, and the amplitude of these calcium transients depends strongly on the timing of the action potential (AP) within the upstate. Calcium is essential for synaptic plasticity in the striatum, and these large calcium transients during the upstates may control which synapses undergo plastic changes. To investigate the mechanisms that underlie the relationship between calcium and AP timing, we have developed a realistic biophysical model of a medium spiny neuron (MSN). We have implemented sophisticated calcium dynamics including calcium diffusion, buffering, and pump extrusion, which accurately replicate published data. Using this model, we found that either the slow inactivation of dendritic sodium channels (NaSI) or the calcium inactivation of voltage-gated calcium channels (CDI) can cause high calcium corresponding to early APs and lower calcium corresponding to later APs. We found that only CDI can account for the experimental observation that sensitivity to AP timing is dependent on NMDA receptors. Additional simulations demonstrated a mechanism by which MSNs can dynamically modulate their sensitivity to AP timing and show that sensitivity to specifically timed pre- and postsynaptic pairings (as in spike timing-dependent plasticity protocols) is altered by the timing of the pairing within the upstate. These findings have implications for synaptic plasticity in vivo during sleep when the upstate-downstate pattern is prominent in the striatum.
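As a toy illustration of why CDI favors early action potentials, the sketch below drives a calcium-dependent inactivation gate with a calcium trace: the later the AP arrives within the upstate, the more calcium has accumulated and the less channel availability remains. The parameters and gating form are illustrative assumptions, not the published MSN model.

```python
import numpy as np

def cdi_availability(ca_trace, dt=1e-4, k_half=1e-3, tau_cdi=0.05):
    """Toy calcium-dependent inactivation (CDI) gate: the available fraction h
    of a voltage-gated calcium channel relaxes toward a steady state that
    falls as local calcium rises, so an AP early in the upstate sees more
    available conductance than a late one. Parameters and the gating form are
    illustrative, not the published MSN model."""
    h = 1.0
    out = np.empty(len(ca_trace))
    for k, ca in enumerate(ca_trace):
        h_inf = k_half / (k_half + ca)       # more calcium -> more inactivation
        h += dt / tau_cdi * (h_inf - h)
        out[k] = h
    return out
```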


2007 ◽  
Vol 19 (12) ◽  
pp. 3226-3238 ◽  
Author(s):  
Arnaud Tonnelier ◽  
Hana Belmabrouk ◽  
Dominique Martinez

Event-driven strategies have been used to simulate spiking neural networks exactly. Previous work is limited to linear integrate-and-fire neurons. In this note, we extend event-driven schemes to a class of nonlinear integrate-and-fire models. Results are presented for the quadratic integrate-and-fire model with instantaneous or exponential synaptic currents. Extensions to conductance-based currents and exponential integrate-and-fire neurons are discussed.
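The key ingredient of such schemes is an analytic expression for the next spike time between events; for the quadratic integrate-and-fire model with a constant input this expression is available in closed form, as in the hedged sketch below. The note's full algorithm, including exponential synaptic currents, is not reproduced here.

```python
import math

def qif_next_spike(v0, i_const, v_th=math.inf):
    """Closed-form time for a quadratic integrate-and-fire neuron
    dv/dt = I + v**2 (constant I > 0 between events) to reach threshold v_th
    from v0; with v_th = inf this is the blow-up time. Illustrative of the
    analytic spike-time expressions event-driven schemes rely on, not the
    note's full algorithm (which also treats exponential synaptic currents)."""
    s = math.sqrt(i_const)
    upper = math.pi / 2 if math.isinf(v_th) else math.atan(v_th / s)
    return (upper - math.atan(v0 / s)) / s
```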


2006 ◽  
Vol 18 (12) ◽  
pp. 2959-2993 ◽  
Author(s):  
Eduardo Ros ◽  
Richard Carrillo ◽  
Eva M. Ortigosa ◽  
Boris Barbour ◽  
Rodrigo Agís

Nearly all neuronal information processing and interneuronal communication in the brain involves action potentials, or spikes, which drive the short-term synaptic dynamics of neurons, but also their long-term dynamics, via synaptic plasticity. In many brain structures, action potential activity is considered to be sparse. This sparseness of activity has been exploited to reduce the computational cost of large-scale network simulations, through the development of event-driven simulation schemes. However, existing event-driven simulation schemes use extremely simplified neuronal models. Here, we implement and critically evaluate an event-driven algorithm (ED-LUT) that uses precalculated look-up tables to characterize synaptic and neuronal dynamics. This approach enables the use of more complex (and realistic) neuronal models or data in representing the neurons, while retaining the advantage of high-speed simulation. We demonstrate the method's application for neurons containing exponential synaptic conductances, thereby implementing shunting inhibition, a phenomenon that is critical to cellular computation. We also introduce an improved two-stage event-queue algorithm, which allows the simulations to scale efficiently to highly connected networks with arbitrary propagation delays. Finally, the scheme readily accommodates implementation of synaptic plasticity mechanisms that depend on spike timing, enabling future simulations to explore issues of long-term learning and adaptation in large-scale networks.
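The core idea can be caricatured as follows: tabulate the neuron's state after each possible silent interval on a precomputed grid and read it back at event times instead of integrating online. The toy below uses a single exponential synaptic conductance, nearest-neighbour lookup, and illustrative grids; the actual ED-LUT tables, interpolation scheme, and firing-time prediction are not reproduced.

```python
import numpy as np

# Toy version of the look-up-table idea: precompute the state of a neuron with
# one exponential synaptic conductance after each silent interval in `dt_grid`
# (sorted ascending), then read it back at event times instead of integrating
# online. Grids, the toy dynamics, and nearest-neighbour lookup are assumptions.
TAU_M, TAU_G, E_LEAK = 10e-3, 5e-3, -70e-3

def build_table(v_grid, g_grid, dt_grid, dt_int=1e-5):
    table = np.empty((len(v_grid), len(g_grid), len(dt_grid), 2))
    for i, v0 in enumerate(v_grid):
        for j, g0 in enumerate(g_grid):
            v, g, t = v0, g0, 0.0
            for k, dt_target in enumerate(dt_grid):
                while t < dt_target:            # naive inner Euler integration
                    v += dt_int * ((E_LEAK - v) / TAU_M - g * v)
                    g += dt_int * (-g / TAU_G)
                    t += dt_int
                table[i, j, k] = (v, g)
    return table

def lookup(table, v_grid, g_grid, dt_grid, v, g, elapsed):
    i = np.abs(v_grid - v).argmin()
    j = np.abs(g_grid - g).argmin()
    k = np.abs(dt_grid - elapsed).argmin()
    return table[i, j, k]                       # (v, g) after the silent interval
```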


2004 ◽  
Vol 92 (4) ◽  
pp. 2615-2621 ◽  
Author(s):  
Antonio G. Paolini ◽  
Janine C. Clarey ◽  
Karina Needham ◽  
Graeme M. Clark

Within the first processing site of the central auditory pathway, inhibitory neurons (D stellate cells) broadly tuned to tonal frequency project onto narrowly tuned, excitatory output neurons (T stellate cells). The latter are thought to provide a topographic representation of the sound spectrum, whereas the former are thought to provide lateral inhibition that improves spectral contrast, particularly in noise. In response to pure tones, the overall discharge rate of T stellate cells is unlikely to be suppressed dramatically by D stellate cells, because D stellate cells respond primarily to stimulus onset and provide fast, short-duration inhibition. In vivo intracellular recordings from the ventral cochlear nucleus (VCN) showed that, when tones were presented above or below the characteristic frequency (CF) of a T stellate neuron, the neuron was inhibited during depolarization. This resulted in a delay in the initial action potential produced by T stellate cells. This ability of fast inhibition to alter the first spike timing of a T stellate neuron was confirmed by electrically activating the D stellate cell pathway that arises in the contralateral cochlear nucleus. Delay was also induced when two tones were presented: one at CF and one outside the frequency response area of the T stellate neuron. These findings suggest that the traditional view of lateral inhibition within the VCN should incorporate delay as one of its principal outcomes.


Sensors ◽  
2020 ◽  
Vol 20 (2) ◽  
pp. 500 ◽  
Author(s):  
Sergey A. Lobov ◽  
Andrey V. Chernyshov ◽  
Nadia P. Krilova ◽  
Maxim O. Shamshin ◽  
Victor B. Kazantsev

One of the modern trends in the design of human–machine interfaces (HMI) is to involve so-called spiking neural networks (SNNs) in signal processing. SNNs can be trained by simple and efficient biologically inspired algorithms. In particular, we have shown that sensory neurons in the input layer of an SNN can simultaneously encode the input signal both in the spiking frequency (rate) and in the latency of spike generation. In the case of such mixed temporal-rate coding, the SNN should implement learning that works properly for both types of coding. Based on this, we investigate how a single neuron can be trained with pure rate and temporal patterns, and then build a universal SNN that is trained using mixed coding. In particular, we study Hebbian and competitive learning in SNNs in the context of temporal and rate coding problems. We show that Hebbian learning through pair-based and triplet-based spike timing-dependent plasticity (STDP) rules is feasible for temporal coding, but not for rate coding. Synaptic competition inducing depression of poorly used synapses is required to ensure neuronal selectivity in rate coding. This kind of competition can be implemented by a so-called forgetting function that depends on neuron activity. We show that coherent use of triplet-based STDP and synaptic competition with the forgetting function is sufficient for rate coding. Next, we propose an SNN capable of classifying electromyographic (EMG) patterns using an unsupervised learning procedure. Neuron competition achieved via lateral inhibition ensures the "winner takes all" principle among classifier neurons. The SNN also provides a graded output response dependent on muscle contraction strength. Furthermore, we modify the SNN to implement a supervised learning method based on stimulation of the target classifier neuron synchronously with the network input. In a problem of discriminating three EMG patterns, the SNN with supervised learning shows a median accuracy of 99.5%, close to the result demonstrated by a multilayer perceptron trained with the error backpropagation algorithm.
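As a sketch of how triplet-based STDP can be combined with an activity-dependent "forgetting" decay, the function below updates one weight from binned pre- and postsynaptic spike trains; the trace time constants, amplitudes, and in particular the form of the forgetting term are assumptions for illustration, not the rule calibrated in the paper.

```python
def triplet_stdp_with_forgetting(pre, post, dt=1e-3,
                                 tau_plus=17e-3, tau_minus=34e-3,
                                 tau_x=0.1, tau_y=0.125, tau_rate=1.0,
                                 a2p=5e-3, a2m=7e-3, a3p=6e-3, a3m=2e-4,
                                 forget=1e-3, w0=0.5):
    """Triplet STDP (Pfister & Gerstner-style pre/post traces) combined with
    an activity-dependent 'forgetting' decay of the weight. All constants and
    the exact form of the forgetting term are illustrative assumptions, not
    the rule calibrated in the paper. `pre`/`post` are 0/1 sequences per bin."""
    r1 = r2 = o1 = o2 = 0.0        # presynaptic (r) and postsynaptic (o) traces
    rate, w = 0.0, w0              # postsynaptic activity estimate and weight
    for x, y in zip(pre, post):
        r1 -= dt / tau_plus * r1
        r2 -= dt / tau_x * r2
        o1 -= dt / tau_minus * o1
        o2 -= dt / tau_y * o2
        rate += (y - dt * rate) / tau_rate   # low-pass estimate of activity
        if x:                      # presynaptic spike: pair + triplet depression
            w -= o1 * (a2m + a3m * r2)
            r1 += 1.0
            r2 += 1.0
        if y:                      # postsynaptic spike: pair + triplet potentiation
            w += r1 * (a2p + a3p * o2)
            o1 += 1.0
            o2 += 1.0
        # 'Forgetting': the weight decays in proportion to postsynaptic activity,
        # so poorly driven synapses of active neurons are depressed (assumed form).
        w -= dt * forget * rate * w
        w = min(max(w, 0.0), 1.0)
    return w
```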

