Title:
Stimulus salience, spatial and temporal correspondence determine enhancement or depression of multisensory integration in fish
Author(s):
MARTORELL, NICOLÁS; PERARA, MILENA; MEDAN, VIOLETA
Place:
Chicago
Meeting:
Conference; Neuroscience 2019; 2019
Organizing institution:
Society for Neuroscience
Abstract:
Most animals combine multiple sources of information to form a coherent percept of the world and guide adaptive behavioral decisions. The mechanisms underlying these interactions remain far from clear. Here we ask: (i) which parameters of a stimulus combination determine whether integration occurs, and (ii) how is response latency modified by multisensory integration? We address these questions by analyzing the C-start escape responses of goldfish (Carassius auratus) to auditory pips and visual looming stimuli.

We first characterized response probability and response latency for six levels of unimodal auditory or visual stimuli of increasing strength. Varying the volume of the auditory stimuli and the contrast of the visual looms allowed us to obtain unimodal stimuli ranging from low to high salience (70% response probability). To determine which combinations produced multisensory integration, we created a multimodal stimulus matrix combining these six auditory and six visual stimuli. Experiments in 60 animals showed that the strongest multisensory enhancement occurs when both stimuli have minimum salience, and that it disappears as salience increases.

We next described how integration changes when the spatial correspondence between the auditory and visual components is altered. For low-salience combinations, we observed multisensory enhancement when both components came from the same direction, multisensory depression when they were 90° apart, and no integration at 180° separation. For higher-salience combinations, integration was non-significant regardless of spatial separation.

Finally, we analyzed C-start response latency, measured from the onset of the auditory pip for auditory stimuli and from the end of the looming expansion for visual stimuli. As expected, brief (5 ms) auditory pips evoked responses with a narrow latency distribution (median = 12.5 ms, 25th-75th percentile = 8 to 17 ms, N = 56) compared to the longer-lasting (4000 ms) visual looms (median = -29 ms, 25th-75th percentile = -166 to 30 ms, N = 67). Interestingly, when we presented an auditory pip 116 ms before the end of the loom expansion, the multimodal responses were centered on the auditory latency distribution and spanned the same range (median = 12.5 ms, 25th-75th percentile = 8 to 17 ms, N = 215).

Overall, we found multisensory enhancement for weak but spatially congruent audio-visual stimuli, with response latency determined by the auditory stimulus. Ongoing experiments are testing the temporal dependence of these audio-visual stimuli to determine to what extent the auditory stimulus dictates response latency.
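The abstract does not state how enhancement and depression were quantified. The short Python sketch below illustrates one common way to do so, using the conventional multisensory enhancement index (multimodal response probability compared against the most effective unimodal probability); the choice of metric and all probability values are assumptions for illustration, not figures from this study.

def enhancement_index(p_auditory, p_visual, p_multimodal):
    """Percent change of multimodal response probability relative to the most
    effective unimodal stimulus: positive values indicate enhancement,
    negative values indicate depression. (Illustrative assumption; the
    abstract does not specify the metric actually used.)"""
    best_unimodal = max(p_auditory, p_visual)
    if best_unimodal == 0:
        raise ValueError("at least one unimodal response probability must be > 0")
    return 100.0 * (p_multimodal - best_unimodal) / best_unimodal

# Hypothetical low-salience, spatially congruent pair: the combination exceeds
# the best unimodal response (enhancement).
print(enhancement_index(0.15, 0.10, 0.35))   # ~ +133

# Hypothetical 90-degree-separated pair: the combination falls below the best
# unimodal response (depression).
print(enhancement_index(0.15, 0.10, 0.08))   # ~ -47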