RESEARCHERS
ECHEVESTE Rodrigo Sebastián
conferences and scientific meetings
Title:
Learning in Neural Models driven by Objective Functions
Author(s):
ECHEVESTE, RODRIGO; GROS, CLAUDIUS
Place:
Osnabrück
Meeting:
Meeting; Osnabrück Computational Cognition Alliance Meeting (Occam) 2013; 2013
Abstract:
A large number of neuronal models accounting for intrinsic plasticity and associative synaptic learning mechanisms have been proposed in the past, performing successfully in tasks such as principal component analysis or the processing of natural images. These models, however, usually require either the addition of a weight decay term to the Hebbian learning rule or an extra weight-vector normalization step to avoid unbounded weight growth.

In the present work, learning rules for a neuronal model are derived polyhomeostatically from two objective functions. Interestingly, by minimizing the flux of the probability distribution in synaptic space during learning, we obtain rules that both account for Hebbian/anti-Hebbian learning and stabilize the system, avoiding unbounded weight growth.

As a first application of these rules, the single-neuron case is studied in the context of principal component analysis and linear discrimination. We observe that the neuron operates in different regimes in these two cases, performing successfully in both. Robustness to large input sizes (~5000 inputs) is also studied; under these conditions the neuron is still able to find the principal component of the input distribution.
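For context on the classical approach referred to above (a Hebbian rule stabilized by an implicit weight decay), the sketch below illustrates Oja's rule extracting the principal component of a toy Gaussian input. This is only a generic illustration of the standard stabilization mechanism the abstract contrasts against; it does not reproduce the objective-function-derived, polyhomeostatic rules presented in this work, and the data, learning rate, and dimensions are arbitrary choices for the example.

import numpy as np

rng = np.random.default_rng(0)

# Correlated 2D Gaussian input; the principal component lies along (1, 1)/sqrt(2).
cov = np.array([[3.0, 2.0],
                [2.0, 3.0]])
x = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=10_000)

# Oja's rule: dw = eta * y * (x - y * w). The -eta * y^2 * w decay term
# keeps the weight norm bounded, so no explicit normalization step is needed.
w = rng.normal(size=2)
eta = 0.01
for xi in x:
    y = w @ xi
    w += eta * y * (xi - y * w)

# Compare the learned direction with the leading eigenvector of the covariance.
eigvals, eigvecs = np.linalg.eigh(cov)
print("learned w (normalized):", w / np.linalg.norm(w))
print("principal component:   ", eigvecs[:, -1])

Up to an overall sign, the learned weight vector converges to the leading eigenvector; the point of the present work is to obtain comparable stability directly from objective functions rather than from an ad hoc decay or normalization term.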