Abstract
Spiking Neural Networks (SNNs) contain neurons with sequential dynamics, synapses that balance plasticity and stability, and circuits with specific cognitive functions. SNNs are biologically plausible and can be tuned by integrating local-scale unsupervised learning rules (e.g., Spike Timing-Dependent Plasticity, Short-Term Plasticity, and local equilibrium adjustment of the membrane potential) with global-scale weakly-supervised learning rules (e.g., dopamine-based reward learning and energy-based learning). Hence, they are powerful for spatio-temporal information representation, asynchronous processing of event-based information, and self-organized learning with dynamic topologies. SNN research lies at the intersection of brain science and computer science and can be divided into two main types. One type aims to better understand biological systems, using detailed, biologically realistic neural models without further consideration of computational efficiency. The other type pursues superior computational performance, retaining only a limited set of SNN features and still relying on efficient but biologically implausible tuning methods, such as different versions of backpropagation. We give a detailed analysis of the research advances and model characteristics of these two lines of work, covering the following aspects: first, multi-type information encoding at the neuronal scale, with event-based signal-processing characteristics; second, the multi-scale sparseness of network structures, defined through different subtypes of network motifs; third, self-organized computation across multiple time scales, from the micro scale of neurons (or synapses) to the macro scale of circuits; fourth, vital functional characteristics of SNNs, including energy-efficient computation (with spikes and learning rules) and robust computation (e.g., resistance to environmental noise); and fifth, the integration of SNNs with neuromorphic hardware for efficient non-von-Neumann computation. We then introduce a biologically plausible strategy for tuning SNNs well by integrating multi-scale, multi-type plasticity rules inspired by natural neural networks and by the fine-tuning processes of state-of-the-art ANNs. This strategy offers an alternative route to efficient credit assignment in SNNs, covering all neurons in the network, from readout and locally-hidden to input neurons. It also provides hints for answering a critical question: how biological neural networks handle global network-tuning problems by integrating different types of local plasticity rules. These integrative principles may point SNNs in a promising tuning direction toward efficient cognitive computation. In turn, the success of SNNs may inspire the discovery of new plasticity rules in natural neural networks. We argue that the goal of SNN research is not merely to serve as a biological counterpart of ANNs, but to build a new generation of effective artificial-intelligence models with biologically plausible cognitive computation by integrating theoretical breakthroughs in biology-inspired multi-scale plasticity principles, toward faster learning convergence, lower energy cost, stronger adaptability, more robust computation, and better interpretability.
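To make two of the mechanisms named above concrete, the sketch below combines a discrete-time leaky integrate-and-fire (LIF) neuron (sequential membrane dynamics driven by event-based input spikes) with a pairwise STDP weight update. It is a minimal illustration only, not a model from this survey; all parameter values (time constants, threshold, learning rates, input rate) are assumptions chosen for readability.

```python
# Minimal sketch: LIF neuron dynamics + pairwise STDP (illustrative parameters).
import numpy as np

rng = np.random.default_rng(0)

# LIF neuron: discrete-time membrane-potential dynamics
dt, tau_m, v_rest, v_thresh, v_reset = 1.0, 20.0, 0.0, 1.0, 0.0
T, n_in = 200, 50                      # simulation steps, input neurons
w = rng.uniform(0.0, 0.05, n_in)       # plastic synaptic weights
pre_rate = 0.05                        # Poisson-like rate coding of the input

# STDP traces and learning rates (assumed values)
tau_pre, tau_post = 20.0, 20.0
a_plus, a_minus = 0.01, 0.012
x_pre = np.zeros(n_in)                 # presynaptic spike traces
x_post = 0.0                           # postsynaptic spike trace
v = v_rest

for t in range(T):
    pre_spikes = rng.random(n_in) < pre_rate        # event-based input spikes
    i_syn = w @ pre_spikes                          # total synaptic input
    v += dt / tau_m * (v_rest - v) + i_syn          # leaky integration
    post_spike = v >= v_thresh
    if post_spike:
        v = v_reset                                 # reset after firing

    # exponential decay of spike traces, incremented by new spikes
    x_pre += -dt / tau_pre * x_pre + pre_spikes
    x_post += -dt / tau_post * x_post + post_spike

    # pairwise STDP: potentiate on a post spike (pre-before-post pairs),
    # depress on pre spikes (post-before-pre pairs)
    if post_spike:
        w += a_plus * x_pre
    w -= a_minus * x_post * pre_spikes
    np.clip(w, 0.0, 0.1, out=w)                     # keep weights bounded

print("final mean weight:", w.mean())
```

Because both the neuron update and the STDP rule depend only on locally available quantities (membrane potential, spike traces), this kind of learning is local and unsupervised, in contrast to the global, gradient-based tuning used in conventional ANNs.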