ICC   25427
INSTITUTO DE INVESTIGACION EN CIENCIAS DE LA COMPUTACION
Executing Unit (UE)
Conferences and scientific meetings
Title:
Modeling human visual search: A combined Bayesian searcher and saliency map approach for eye movement guidance in natural scenes
Author(s):
GASTÓN BUJÍA; JUAN E. KAMIENKOWSKI; SEBASTIAN VITA; MELANIE SCLAR; GUILLERMO SOLOVEY
Location:
Virtual
Meeting:
Workshop; SVRHM 2020 - NeurIPS workshop; 2020
Organizing institution:
NeurIPS
Abstract:
Finding objects is essential for almost any daily-life visual task. Saliency models have been useful for predicting fixation locations in natural images, but they provide no information about the time sequence of fixations. Nowadays, one of the biggest challenges in the field is to go beyond saliency maps and predict a sequence of fixations related to a visual task, such as searching for a given target. Bayesian observer models have been proposed for this task, as they represent visual search as an active sampling process. Nevertheless, they have mostly been evaluated on artificial images, and how they adapt to natural images remains largely unexplored. Here, we propose a unified Bayesian model for visual search guided by saliency maps as prior information. We validated our model with a visual search experiment in natural scenes in which we recorded eye movements. We show that, although state-of-the-art saliency maps are good at modeling bottom-up first impressions in a visual search task, their performance degrades to chance after the first few fixations, when top-down task information is critical. Thus, we propose to use them as priors for Bayesian searchers. This approach leads to behavior very similar to that of humans over the whole scanpath, both in the percentage of targets found as a function of fixation rank and in scanpath similarity, reproducing the entire sequence of eye movements.
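The core idea of combining a saliency map with a Bayesian searcher can be illustrated with a minimal sketch. The code below is not the authors' implementation; it is a toy ideal-observer simulation under simplifying assumptions: the normalized saliency map serves as the prior over target location, each fixation rules out nearby locations with a reliability (visibility) that decays with eccentricity as a Gaussian, and the next fixation is placed at the posterior maximum. The function name, the Gaussian visibility model, and the noiseless-detection assumption are all illustrative choices.

```python
import numpy as np

def bayesian_search(saliency, target_pos, n_fixations=10, sigma=2.0):
    """Toy Bayesian searcher with a saliency map as prior.

    saliency   : 2D array of non-negative values (prior over target location)
    target_pos : (row, col) of the true target
    sigma      : width of the Gaussian visibility fall-off (assumed model)
    Returns the simulated scanpath as a list of (row, col) fixations.
    """
    h, w = saliency.shape
    posterior = saliency / saliency.sum()          # prior from saliency map
    ys, xs = np.mgrid[0:h, 0:w]
    scanpath = []
    for _ in range(n_fixations):
        # Fixate the current maximum a posteriori location
        fy, fx = np.unravel_index(posterior.argmax(), posterior.shape)
        scanpath.append((fy, fx))
        if (fy, fx) == target_pos:                 # target fixated: stop
            break
        # Visibility: how reliably a location can be inspected from the
        # current fixation; decays with squared distance (eccentricity)
        dist2 = (ys - fy) ** 2 + (xs - fx) ** 2
        visibility = np.exp(-dist2 / (2 * sigma ** 2))
        # Noiseless "target absent" observation at inspected locations:
        # non-target locations lose posterior mass in proportion to how
        # well they were seen; the true target is never ruled out
        likelihood = 1.0 - visibility
        likelihood[target_pos] = 1.0
        posterior = posterior * likelihood
        posterior /= posterior.sum()
    return scanpath
```

In this sketch a strong saliency peak attracts the first fixation regardless of the target, mimicking the bottom-up first impression described above, while the posterior update supplies the top-down guidance that takes over on later fixations.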