RESEARCHERS
ECHEVESTE Rodrigo Sebastián
Congresses and scientific meetings
Title:
Information-Theoretical Foundations of Hebbian Learning
Author(s):
GROS, CLAUDIUS; ECHEVESTE, RODRIGO
Place:
Barcelona
Meeting:
Conference; 25th International Conference on Artificial Neural Networks; 2016
Abstract:
Neural information processing includes the extraction of information present in the statistics of afferent signals. For this, the afferent synaptic weights wj are continuously adapted, changing in turn the distribution pθ(y) of the post-synaptic neural activity y, where θ denotes the relevant neural parameters. The functional form of pθ(y) hence continues to evolve as long as learning is ongoing, becoming stationary only when learning is completed. This stationarity principle can be captured by the Fisher information Fθ of the neural activity with respect to the afferent synaptic weights wj. It then follows that Hebbian learning rules may be derived by minimizing Fθ. The precise functional form of the learning rules then depends on the shape of the transfer function y = g(x) relating the membrane potential x to the activity y. The learning rules derived from the stationarity principle are self-limiting (runaway synaptic growth does not occur), performing a standard principal component analysis whenever a direction with large variance is present in the space of input activities. Generically, directions of input activities having a negative excess kurtosis are preferred, making the rules suitable for ICA (see figure). Moreover, when only the exponential foot of g is considered (the low-activity regime), the standard Hebbian learning rule, without reversal, is recovered.
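The self-limiting, PCA-performing behavior described above can be illustrated with a minimal sketch. Note this uses Oja's rule, a classic self-limiting Hebbian rule, purely as a stand-in: the rules actually derived in the paper come from minimizing the Fisher information Fθ and depend on the transfer function g, whose form the abstract does not give. All names and parameters below (learning rate, covariance, step count) are illustrative choices, not from the source.

```python
# Sketch: a self-limiting Hebbian rule (Oja's rule) extracting the
# principal component of its input stream. Illustrative analog only;
# not the Fisher-information-derived rule from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 2-D input stream with one clearly dominant variance direction.
n_steps = 20000
cov = np.array([[3.0, 1.0],
                [1.0, 1.5]])          # eigenvalues 3.5 and 1.0
L = np.linalg.cholesky(cov)
x_stream = (L @ rng.standard_normal((2, n_steps))).T

w = rng.standard_normal(2) * 0.1      # afferent synaptic weights w_j
eta = 0.005                           # learning rate

for x in x_stream:
    y = w @ x                         # linear post-synaptic activity
    # Hebbian term y*x plus a decay term -y^2*w that bounds |w|,
    # so runaway synaptic growth does not occur.
    w += eta * y * (x - y * w)

# After learning, w should align with the leading eigenvector of the
# input covariance and have approximately unit norm.
eigvals, eigvecs = np.linalg.eigh(cov)
pc1 = eigvecs[:, np.argmax(eigvals)]
alignment = abs(w @ pc1) / np.linalg.norm(w)
print(np.linalg.norm(w), alignment)
```

The decay term is what makes the rule self-limiting: without it, the plain Hebbian update Δw = η·y·x would grow |w| without bound, which is the "standard Hebbian learning rule, without reversal" recovered in the low-activity limit discussed above.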