IFEG   20353
INSTITUTO DE FISICA ENRIQUE GAVIOLA
Executing Unit (Unidad Ejecutora - UE)
Articles
Title:
A Tsallis' statistics based neural network model for novel word learning
Author(s):
TARIK HADZIBEGANOVIC; SERGIO A. CANNAS
Journal:
PHYSICA A - STATISTICAL AND THEORETICAL PHYSICS
Publisher:
Elsevier Science B.V.
References:
Year: 2009, vol. 388, pp. 732-746
ISSN:
0378-4371
Abstract:
We invoke the Tsallis entropy formalism, a nonextensive entropy measure, to include some degree of non-locality in a neural network that is used for simulation of novel word learning in adults. A generalization of the gradient descent dynamics, realized via nonextensive cost functions, is used as a learning rule in a simple perceptron. The model is first investigated for general properties, and then tested against the empirical data, gathered from simple memorization experiments involving two populations of linguistically different subjects. Numerical solutions of the model equations corresponded to the measured performance states of human learners. In particular, we found that the memorization tasks were executed with rather small but population-specific amounts of nonextensivity, quantified by the entropic index q. Our findings raise the possibility of using entropic nonextensivity as a means of characterizing the degree of complexity of learning in both natural and artificial systems.