Energy-Efficient Models for High-Dimensional Spike Train Classification using Sparse Spiking Neural Networks

Author(s): Hang Yin, John Boaz Lee, Xiangnan Kong, Thomas Hartvigsen, Sihong Xie
2021

Author(s): Ceca Kraišniković, Wolfgang Maass, Robert Legenstein

The brain uses recurrent spiking neural networks for higher cognitive functions such as symbolic computations, in particular mathematical computations. We review the current state of research on spike-based symbolic computations of this type. In addition, we present new results that show that surprisingly small spiking neural networks can perform symbolic computations on bit sequences and numbers, and can even learn such computations using a biologically plausible learning rule. The resulting networks operate in a rather low firing-rate regime, where they cannot simply emulate artificial neural networks by encoding continuous values through firing rates. We therefore propose a new paradigm for symbolic computation in neural networks that provides concrete hypotheses about how symbolic computations are organized in the brain. The spike-based network models employed here are also the basis for drastically more energy-efficient neuromorphic computer hardware. Hence, our results can be seen as building a bridge from symbolic artificial intelligence to energy-efficient implementation in spike-based neuromorphic hardware.
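The abstract does not specify the neuron model used in these networks, but spike-based network models of this kind are commonly built from leaky integrate-and-fire (LIF) neurons, which naturally produce the sparse, low-rate spike trains described above. The following is a minimal sketch of a single LIF neuron under constant input drive; all parameter values are illustrative assumptions, not taken from the reviewed papers.

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch.
# All parameters (tau, threshold, input magnitude) are hypothetical
# illustration values, not those of the reviewed models.

def simulate_lif(input_current, v_rest=0.0, v_thresh=1.0, tau=20.0, dt=1.0):
    """Simulate one LIF neuron; return the list of spike times (time steps).

    input_current: sequence of input values, one per time step.
    """
    v = v_rest
    spikes = []
    for t, i_in in enumerate(input_current):
        # Membrane potential leaks toward rest and integrates the input.
        v += dt * (-(v - v_rest) / tau + i_in)
        if v >= v_thresh:      # threshold crossing -> emit a spike
            spikes.append(t)
            v = v_rest         # reset after spiking
    return spikes

# A weak constant drive yields a sparse, regular spike train
# (one spike every 20 steps here), i.e. a low firing-rate regime.
spike_times = simulate_lif([0.08] * 200)
print(spike_times[:3])  # -> [19, 39, 59]
```

Information in such a regime is carried by the timing of individual spikes rather than by a rate code, which is what makes a direct rate-based emulation of artificial neural networks implausible at these firing rates.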


2021, pp. 182-194
Author(s): Yihao Luo, Min Xu, Caihong Yuan, Xiang Cao, Liangqi Zhang, ...
