RESEARCHERS
CAIAFA Cesar Federico
articles
Title:
Brain Simulation and Spiking Neural Networks
Author(s):
SUN, ZHE; CUTSURIDIS, VASSILIS; CAIAFA, CESAR F.; SOLÉ-CASALS, JORDI
Journal:
Cognitive Computation
Publisher:
Springer
References:
Year: 2023
ISSN:
1866-9956
Abstract:
Over the past decade, numerous experiments have been conducted to understand the structure and functioning of the brain, and neuroscience simulation has emerged as a crucial strategy for investigating brain function. The US Human Connectome Project, which began in 2009, aims to provide a large dataset mapping the human brain, and this dataset has already been used to test various brain models. Similarly, the European Union's Human Brain Project is building a large-scale virtual simulation of the rodent brain to reveal various brain activities. In Japan, a human-scale brain simulation has been created on the Fugaku supercomputer to investigate how neural networks are involved in thought processes.

Simulating brain circuits and neural dynamics can reveal the fundamentals of cognitive computation and control mechanisms. This understanding is crucial not only for advancing our knowledge of the brain but also for improving existing artificial intelligence systems and building algorithms that approach the level of brain intelligence. Exploring the latest advances and potential of brain simulation-related technologies will ease the way towards exciting discoveries and a deeper understanding of the complexity of the brain. By expanding our knowledge in this field, we can open up new possibilities for diagnosing and treating neurological disorders, creating brain-inspired algorithms, and developing advanced cognitive systems. Integrating diverse brain simulation research efforts will foster collaboration and innovation, ultimately bringing us closer to unlocking the mysteries of the human brain. As we delve deeper into simulation, we can gain unprecedented insight into the intricate workings of our minds and unlock the potential for revolutionary breakthroughs in neuroscience and artificial intelligence.

For this special issue, we invited researchers to present their cutting-edge approaches to brain simulation. The issue covers a variety of topics, such as data analysis methods for brain connectivity, the development of brain simulation platforms, spiking neural networks (SNNs) for modeling brain circuits, and applications of SNNs in real-world scenarios. In this editorial, we provide a succinct summary of each of the included articles, encouraging researchers to delve deeper into these manuscripts and contribute to the advancement of the methods and theories they present. Each article offers a unique perspective and valuable insights, providing an opportunity for further exploration and collaboration. We encourage scientists to take an interest in these manuscripts, build on the work presented, and contribute to continued progress in understanding and simulating the complexities of the brain. Together, we can push the boundaries of neuroscience and artificial intelligence and ultimately advance our understanding of the human mind.

The connectome plays a crucial role in brain simulation, as it provides a detailed map of neuronal connections in the brain. This information is essential for developing accurate models of brain function and effective treatments for brain disorders. Zhang et al. [1] propose a data-driven approach to identify the most informative features for predicting autism spectrum disorder (ASD) from functional connectivity. They use the F-score method to calculate the discriminative power of each feature and rank features by their importance in predicting ASD. An autoencoder model is then employed to learn the underlying patterns in the selected features and classify individuals as ASD or non-ASD. The approach achieves an accuracy of 70.9% on the ABIDE dataset, making a valuable contribution to ASD classification based on functional connectivity.
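To illustrate the kind of feature-ranking step that Zhang et al. describe, the sketch below computes per-feature F-scores for a two-class problem and keeps the highest-ranked connectivity features. It is a minimal, hypothetical example: the toy data, the number of retained features, and the downstream use are placeholders, not the authors' exact pipeline.

```python
import numpy as np

def f_score(X, y):
    """F-score of each column of X for binary labels y (0/1).

    F(i) = ((mean_pos_i - mean_i)^2 + (mean_neg_i - mean_i)^2)
           / (var_pos_i + var_neg_i)
    Higher values indicate more discriminative features.
    """
    pos, neg = X[y == 1], X[y == 0]
    mean_all = X.mean(axis=0)
    num = (pos.mean(axis=0) - mean_all) ** 2 + (neg.mean(axis=0) - mean_all) ** 2
    den = pos.var(axis=0, ddof=1) + neg.var(axis=0, ddof=1)
    return num / (den + 1e-12)  # guard against zero variance

# Toy stand-in for vectorized functional-connectivity matrices:
# 100 subjects x 4950 connectivity features, with binary ASD labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4950))
y = rng.integers(0, 2, size=100)

scores = f_score(X, y)
top_idx = np.argsort(scores)[::-1][:200]  # keep the 200 most discriminative features
X_selected = X[:, top_idx]                # input to a subsequent autoencoder/classifier
```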
To investigate the potential of their simulation platform, Kobayashi et al. simulated a Purkinje neuron model with a complex dendritic morphology and analyzed its electrical behavior [2]. The results demonstrate that the platform can accurately simulate the intricate electrical behavior of neurons with complex morphological structures. The authors suggest that the platform could be employed in various applications, including the investigation of neural coding, synaptic plasticity, and neural network computation. This study contributes to the field of neuromorphic computing by presenting a platform for simulating morphologically detailed neuron models.

SNN-based brain simulation platforms are used extensively in research, for example to investigate the neural underpinnings of cognition and behavior, to understand the mechanisms behind neurological disorders such as epilepsy and Parkinson's disease, and to develop technologies for brain-machine interfaces and neuroprosthetics. In this special issue, Cakan et al. present a notable contribution in the form of neurolib, an open-source brain simulation framework [3]. neurolib enables the construction of whole-brain models from structural and functional connectome data and incorporates an optimization module that minimizes the discrepancy between the oscillation frequency of the simulated activity and a target frequency.
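For readers who want to try this kind of whole-brain modeling workflow, the snippet below is a minimal usage sketch based on neurolib's publicly documented examples. The model choice, parameter names, and outputs shown are assumptions drawn from that documentation rather than details taken from Cakan et al.'s article, and they may differ across neurolib versions.

```python
# Minimal neurolib sketch (pip install neurolib), assuming the documented
# ALN mean-field model; parameter names follow neurolib's examples.
from neurolib.models.aln import ALNModel

model = ALNModel()                     # single-node ALN model
model.params["duration"] = 5.0 * 1000  # simulation length in ms
model.params["sigma_ou"] = 0.1         # noise strength of the Ornstein-Uhlenbeck input
model.run()

print(model.t.shape, model.output.shape)  # time axis and simulated activity

# A whole-brain model is built the same way by passing a structural coupling
# matrix and a delay (fiber-length) matrix to the constructor, e.g.
# ALNModel(Cmat=..., Dmat=...), typically derived from connectome data.
```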
Cheng et al. introduce a spiking reinforcement learning algorithm called Spiking Memory TD3 (SM-TD3), designed for high-dimensional control problems formulated as partially observable Markov decision processes [4]. SM-TD3 combines a spiking long short-term memory (spiking-LSTM) policy network with a deep critic network. Inputs are population-encoded, the spiking-LSTM provides an effective memory function, and the parameters are trained with spatio-temporal backpropagation. In benchmark tests on PyBullet, SM-TD3 demonstrates robustness and adaptability across various scenarios and, compared with a deep LSTM-TD3, reduces energy consumption by 20–50%, making it suitable for both high-performance and energy-efficient practical applications.

Xue et al. proposed a biologically inspired excitatory-inhibitory spiking neural network to study rule-dependent working memory tasks [5]. This computational platform allows the neural representations and dynamics of cortical circuits to be investigated during complex cognitive tasks. The network's architecture mimics the behavior of cortical circuits, providing insights into how information is encoded and maintained in working memory. The work contributes to our understanding of the neural basis of cognitive functions and enhances our ability to study cortical dynamics at a fine timescale.

The hippocampus plays a crucial role in cognitive functions such as memory formation, consolidation, and retrieval, spatial navigation, and perception. Kopsick and colleagues developed a robust and biologically realistic model of area CA3 of the mouse hippocampus [6]. The model incorporates essential experimental data, including detailed information about neuron types, neuron densities, and connectivity patterns. To compute the network dynamics efficiently and accurately, the simulation was run on a multi-GPU system using the CARLsim4 simulator. This work provides a valuable tool for investigating the neural mechanisms underlying hippocampal function and has potential implications for understanding and treating memory-related disorders.

Luboeinski and Tetzlaff [7] developed a recurrent network of spiking neurons that integrates biological spiking dynamics, synaptic plasticity, and synaptic consolidation. The model simulates the learning and consolidation processes involved in long-term memory representations and, by incorporating these mechanisms, offers a mechanistic account of cognitive effects such as recency and priming. This research contributes to our comprehension of how neural networks function and provides insights into the processes underlying memory formation and organization.

Taking a different approach, Salustri and Micheletto [8] constructed a toroidal spiking neural network using the NEST simulator and the Izhikevich neuron model. Their analysis of the network activity revealed that a noisy and inhomogeneous structural fabric facilitates signal transmission. This finding provides new insight into the role of noise in biological systems, extending the concept of stochastic enhancement to the physical structure of neural networks, and sheds light on the intricate dynamics of neural networks and their relationship to signal processing.
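To give a concrete flavor of the kind of SNN construction that simulators such as NEST support, here is a minimal, hypothetical PyNEST sketch (assuming NEST 3.x) of a small Izhikevich network driven by Poisson noise. It uses plain random connectivity rather than the toroidal, spatially wrapped layout of Salustri and Micheletto, and all sizes, weights, and rates are illustrative placeholders.

```python
import nest

nest.ResetKernel()

# 400 Izhikevich neurons with regular-spiking parameters (illustrative values)
neurons = nest.Create("izhikevich", 400,
                      params={"a": 0.02, "b": 0.2, "c": -65.0, "d": 8.0})

# Noisy external drive, standing in for the stochastic input studied in the paper
noise = nest.Create("poisson_generator", params={"rate": 8000.0})
nest.Connect(noise, neurons, syn_spec={"weight": 4.0, "delay": 1.0})

# Sparse random recurrent connectivity; the original study instead arranges
# neurons on a torus so that local connections wrap around the edges
nest.Connect(neurons, neurons,
             conn_spec={"rule": "fixed_indegree", "indegree": 20},
             syn_spec={"weight": 2.0, "delay": 1.0})

# Record spikes ("spike_recorder" in NEST 3.x; older releases use "spike_detector")
recorder = nest.Create("spike_recorder")
nest.Connect(neurons, recorder)

nest.Simulate(1000.0)  # simulate 1 second (ms)

events = nest.GetStatus(recorder, "events")[0]
print(f"{len(events['times'])} spikes recorded")
```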
Turning to applications of SNNs in real-world scenarios, Shaw et al. introduced a feature extraction method for brain-wave signals called the 1D Multi-Point Local Ternary Pattern (1D-MPLTP) [9]. To assess the method's efficacy, the authors converted the ternary patterns obtained from the proposed method into binary patterns.

The hand is a crucial part of the human body, and hand amputation significantly impacts an individual's quality of life (QoL). In recent years, research on bionic prosthetic hands has grown, aiming to improve the QoL of people with hand amputations. Yang et al. [10] investigated an SNN-based gesture recognition system driven by surface electromyography (sEMG). To encode the sEMG signals, they proposed a smoothed frequency-domain decomposition encoder, which converts the sEMG into stable spike trains; to decode the SNN's output, they introduced a network efferent energy decoder, which translates membrane potentials into recognition results. In a study involving eleven subjects, the system achieved a gesture recognition accuracy of 91.21%, demonstrating the potential of these methods for actual prostheses.

Finally, Crook-Rumsey and colleagues used event-related potentials as inputs to a spiking neural network for the detection of mild cognitive impairment [11]. They devised a method to convert spatiotemporal electrophysiological data into spike sequences, which were then used to train a three-dimensional spiking neural network for signal classification. The findings demonstrate the effectiveness of their methods and provide interpretable results.

The guest editors sincerely thank the anonymous reviewers for their invaluable contributions in evaluating the manuscripts for this special issue. Their insightful evaluations have played a pivotal role in upholding the high standards of the publication, and their meticulous feedback and constructive criticism have significantly enhanced the quality of the articles and the overall value of the issue. The guest editors also extend their deepest appreciation to the Editor-in-Chief for invaluable guidance and support throughout the editorial process.