SPIKING NEURAL NETWORKS

2009 ◽ Vol 19 (04) ◽ pp. 295-308
Author(s): SAMANWOY GHOSH-DASTIDAR, HOJJAT ADELI

Most current Artificial Neural Network (ANN) models are based on highly simplified brain dynamics. They have been used as powerful computational tools to solve complex pattern recognition, function estimation, and classification problems. ANNs have been evolving towards more powerful and more biologically realistic models. In the past decade, Spiking Neural Networks (SNNs), which are composed of spiking neurons, have been developed. Information transfer in these neurons mimics that in biological neurons, i.e., via the precise timing of a spike or a sequence of spikes. To facilitate learning in such networks, new learning algorithms with varying degrees of biological plausibility have also been developed recently. The addition of a temporal dimension for information encoding in SNNs yields new insight into the dynamics of the human brain and could result in compact representations of large neural networks. As such, SNNs have great potential for solving complicated time-dependent pattern recognition problems because of their inherent dynamic representation. This article presents a state-of-the-art review of the development of spiking neurons and SNNs, and provides insight into their evolution as the third generation of neural networks.
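The spike-timing information transfer described above is commonly modeled with simple spiking neuron models such as the leaky integrate-and-fire (LIF) neuron. As a minimal sketch (not a model taken from the reviewed article; all parameter names and values are illustrative assumptions), a LIF neuron integrates its input current, leaks toward a resting potential, and emits a spike when the membrane potential crosses a threshold:

```python
def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0):
    """Simulate a leaky integrate-and-fire neuron; return its spike times.

    input_current: sequence of input values, one per time step of size dt.
    """
    v = v_rest
    spike_times = []
    for step, i_in in enumerate(input_current):
        # Forward-Euler step of the leaky dynamics: dv/dt = (v_rest - v)/tau + I(t)
        v += dt * ((v_rest - v) / tau + i_in)
        if v >= v_thresh:              # threshold crossing emits a spike...
            spike_times.append(step * dt)
            v = v_reset                # ...and the membrane potential resets
    return spike_times
```

The information carried by such a neuron is exactly the list of spike times it returns, which is what distinguishes SNNs from rate-based ANN units.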

2013 ◽ Vol 25 (2) ◽ pp. 450-472
Author(s): Jun Hu, Huajin Tang, K. C. Tan, Haizhou Li, Luping Shi

During the past few decades, remarkable progress has been made in solving pattern recognition problems with networks of spiking neurons. However, pattern recognition as a complete computational process, from sensory encoding to synaptic learning, remains underexplored, as most existing models and algorithms target only part of that process. Furthermore, many learning algorithms proposed in the literature neglect or pay little attention to sensory information encoding, which makes them incompatible with neurally realistic sensory signals encoded from real-world stimuli. Treating sensory coding and learning as a single systematic process, we attempt to build an integrated model based on spiking neural networks (SNNs) that performs sensory neural encoding and supervised learning with precisely timed sequences of spikes. With emerging evidence of precise spike-timing neural activity, the view that information is represented by the explicit firing times of action potentials, rather than by mean firing rates, has been receiving increasing attention. External sensory stimulation is first converted into spatiotemporal spike patterns using a latency-phase encoding method and then transmitted to the downstream network for learning. Spiking neurons are trained to reproduce target signals encoded as precisely timed spikes. We show that when a supervised spike-timing-based learning rule is used, different spatiotemporal patterns are recognized by different spike patterns with millisecond timing precision.
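The latency-phase encoding mentioned above maps stimulus intensity to spike timing. As a simplified illustration of the idea (a pure latency code, omitting the phase component; `t_max` and the linear mapping are assumptions, not the paper's exact scheme), stronger stimuli can simply be mapped to earlier spike times:

```python
import numpy as np

def latency_encode(intensities, t_max=10.0):
    """Latency code: map intensities in [0, 1] to spike times so that
    stronger inputs fire earlier; zero-intensity inputs stay silent."""
    intensities = np.clip(np.asarray(intensities, dtype=float), 0.0, 1.0)
    # Linear latency code: latency = t_max * (1 - intensity).
    # Silent inputs are marked with an infinite latency (no spike).
    return np.where(intensities > 0.0,
                    t_max * (1.0 - intensities),
                    np.inf)
```

The resulting spike-time vector is a spatiotemporal pattern that a spike-timing-based learning rule can operate on directly.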


SIMULATION ◽ 2011 ◽ Vol 88 (3) ◽ pp. 299-313
Author(s): Guillermo L Grinblat, Hernán Ahumada, Ernesto Kofman

In this work, we explore the use of quantized state system (QSS) methods in the simulation of networks of spiking neurons. We compare the simulation results obtained by these discrete-event algorithms with the results of the discrete-time methods in use by the neuroscience community. We found that the computational cost of the QSS methods grows almost linearly with the size of the network, while it grows at least quadratically for the discrete-time algorithms. We show that this advantage is mainly due to the fact that QSS methods only perform calculations in the components of the system that experience activity.
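The efficiency argument, that computation happens only in components experiencing activity, can be illustrated with a toy event-driven simulation. This is not the QSS algorithm itself (QSS quantizes the state variables and integrates them asynchronously); it is only a sketch of the discrete-event principle, with all parameters assumed: a neuron's state is touched only when an input spike arrives, and the membrane decay between events is computed analytically rather than step by step.

```python
import math

def event_driven_lif(spike_events, n_neurons, weights,
                     tau=20.0, v_thresh=1.0):
    """Toy event-driven simulation of LIF neurons.

    spike_events: list of (time, target_neuron) input spikes.
    weights: per-neuron input weight applied when a spike arrives.
    """
    v = [0.0] * n_neurons
    last_update = [0.0] * n_neurons
    out_spikes = []
    for t, j in sorted(spike_events):
        # Work is done only for neuron j, only at event time t:
        # apply the exact exponential decay since its last event...
        v[j] *= math.exp(-(t - last_update[j]) / tau)
        last_update[j] = t
        v[j] += weights[j]             # ...then add the spike's weight
        if v[j] >= v_thresh:
            out_spikes.append((t, j))
            v[j] = 0.0                 # reset after firing
    return out_spikes
```

A discrete-time scheme would instead update every neuron at every step, which is where the quadratic growth reported above comes from.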


2020
Author(s): Friedemann Zenke, Tim P. Vogels

Brains process information in spiking neural networks. Their intricate connections shape the diverse functions these networks perform. In comparison, the functional capabilities of models of spiking networks are still rudimentary. This shortcoming is mainly due to the lack of insight and practical algorithms to construct the necessary connectivity. Any such algorithm typically attempts to build networks by iteratively reducing the error compared to a desired output. But assigning credit to hidden units in multi-layered spiking networks has remained challenging due to the non-differentiable nonlinearity of spikes. To avoid this issue, one can employ surrogate gradients to discover the required connectivity in spiking network models. However, the choice of a surrogate is not unique, raising the question of how its implementation influences the effectiveness of the method. Here, we use numerical simulations to systematically study how essential design parameters of surrogate gradients impact learning performance on a range of classification problems. We show that surrogate gradient learning is robust to different shapes of underlying surrogate derivatives, but the choice of the derivative’s scale can substantially affect learning performance. When we combine surrogate gradients with a suitable activity regularization technique, robust information processing can be achieved in spiking networks even at the sparse activity limit. Our study provides a systematic account of the remarkable robustness of surrogate gradient learning and serves as a practical guide to model functional spiking neural networks.
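The core idea can be sketched in a few lines: the forward pass keeps the hard, non-differentiable spike threshold, while the backward pass substitutes a smooth surrogate derivative. The fast-sigmoid surrogate below follows the SuperSpike form; the scale parameter `beta` is exactly the kind of design parameter whose choice the study finds can substantially affect learning (the default value here is an illustrative assumption):

```python
import numpy as np

def spike(v):
    """Forward pass: hard threshold on the membrane potential
    (a Heaviside step, which has no useful derivative)."""
    return (np.asarray(v) >= 0.0).astype(float)

def surrogate_grad(v, beta=10.0):
    """Backward pass: replace the Heaviside's derivative with the
    derivative of a fast sigmoid (SuperSpike-style surrogate).
    beta sets the surrogate's scale/steepness."""
    return 1.0 / (beta * np.abs(v) + 1.0) ** 2
```

In practice this pair would be registered as a custom autograd function in a framework such as PyTorch or JAX, so that `spike` runs in the forward pass and `surrogate_grad` replaces its derivative during backpropagation.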


2009 ◽ Vol 19 (06) ◽ pp. 465-471
Author(s): JOSEP L. ROSSELLO, VINCENT CANALS, ANTONI MORRO, JAUME VERD

A new design of Spiking Neural Networks is proposed and fabricated in a 0.35 μm CMOS technology. The architecture is based on the use of both digital and analog circuitry. The digital circuitry is dedicated to inter-neuron communication, while the analog part implements the internal non-linear behavior associated with spiking neurons. The main advantages of the proposed system are its small integration area compared to fully digital solutions, its implementation using a standard CMOS process only, and the reliability of the inter-neuron communication.

