ARNEODO Ezequiel Matias
Conference and scientific meeting presentations
Title:
Low-dimensional neuronal population dynamics tied to complex vocal behavior
Author(s):
EZEQUIEL M. ARNEODO
Location:
San Diego
Meeting:
Symposium; 17th Kavli Institute for the Brain and Mind Innovative Research Symposium; 2022
Organizing institution:
Kavli Institute for the Brain and Mind
Abstract:
Complex volitional motor actions require the coordinated orchestration of neuronal population activity patterns, distributed across multiple brain regions and converging onto the effector organs that produce behavior. The nature of these neuronal interactions, particularly in the service of vocal communication signals such as speech, has historically been difficult to address. We will overcome these difficulties through an interdisciplinary approach that brings together state-of-the-art high-density electrophysiological recording technology and sophisticated computational methods in a model system ideally suited to studying learned sequential vocal behavior. We propose to record electrophysiological activity simultaneously from large populations of single neurons across multiple premotor and motor brain regions in singing birds, and to model quantitatively the temporal dynamics of key nodes in the mesoscale network that underlies this behavior. Drawing inspiration from limb motor control and from our own work on the sequential structure of birdsong, we will determine how activity in neural populations in the song motor pathway can be represented by distinct low-dimensional state-space trajectories within regions, and how trajectories in different regions can be coupled. We hypothesize that the temporal dynamics of these neural trajectories, and their coupling, are governed by the same sequential statistical models that describe the short- and long-range temporal dependencies found between vocal elements in song and phonemes in human speech. Ultimately, our results will help establish direct quantitative connections between the dynamics of neural populations, neural networks, and the output of the peripheral effectors (i.e., behavior).
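As a minimal sketch of the kind of analysis described above (the simulated data, dimensionality, and variable names are illustrative assumptions, not details from the abstract), population spike-rate activity can be projected onto a low-dimensional state-space trajectory with PCA computed via the SVD:

```python
import numpy as np

# Hypothetical example: recover a low-dimensional neural trajectory
# from simulated population activity. Three smooth latent signals
# stand in for song-locked dynamics; each "neuron" mixes them linearly.
rng = np.random.default_rng(0)

n_neurons, n_bins = 120, 500                 # population size, time bins
t = np.linspace(0, 2 * np.pi, n_bins)
latent = np.stack([np.sin(t), np.cos(2 * t), np.sin(3 * t)])  # (3, n_bins)

mixing = rng.normal(size=(n_neurons, 3))     # random neuron loadings
rates = mixing @ latent + 0.1 * rng.normal(size=(n_neurons, n_bins))

# PCA: mean-center each neuron across time, then take the SVD.
# Rows of Vt are the principal temporal modes of the population.
centered = rates - rates.mean(axis=1, keepdims=True)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

k = 3
trajectory = S[:k, None] * Vt[:k]            # (k, n_bins) state-space trajectory
var_explained = (S[:k] ** 2).sum() / (S ** 2).sum()
print(f"variance explained by {k} components: {var_explained:.2f}")
```

Because the simulated activity is genuinely three-dimensional plus weak noise, the first three components capture nearly all of the variance; with real recordings, the dimensionality and variance captured are empirical questions of exactly the kind the proposal raises.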
This framework will enable a range of unique experiments in this model system aimed, for example, at understanding how the basal ganglia shape these temporal population dynamics during vocal learning, and how complex auditory feedback is integrated into these networks in real time to guide ongoing behavior. Both learning and real-time maintenance are vital to the future success of model-based brain-machine interfaces. We note, in addition, that our methods directly address emerging goals in explainable AI (XAI), which seeks a deeper understanding of applied machine-learning algorithms and can advance through coupling to intuitive neurobiological mechanisms and behaviors.