Improved storage capacity of Hebbian learning attractor neural network with bump formations
Entity: UAM. Departamento de Ingeniería Informática
Publisher: Springer Berlin Heidelberg
Date: 2006
Citation: Artificial Neural Networks – ICANN 2006: 16th International Conference, Athens, Greece, September 10-14, 2006. Proceedings, Part I. Lecture Notes in Computer Science, Volume 4131. Springer, 2006. 234-243.
ISSN: 0302-9743 (print); 1611-3349 (online)
ISBN: 978-3-540-38625-4 (print); 978-3-540-38627-8 (online)
DOI: 10.1007/11840817_25
Funded by: The authors acknowledge the financial support from the Spanish Grants DGI.M. CyT. FIS2005-1729, Plan de Promoción de la Investigación UNED and TIN 2004-07676-G01-01. We also thank David Dominguez for the fruitful discussion of the manuscript.
Editor's Version: http://dx.doi.org/10.1007/11840817_25
Subjects: Computation by Abstract Devices; Pattern Recognition; Information Systems Applications; Database Management; Informática
Note: The final publication is available at Springer via http://dx.doi.org/10.1007/11840817_25. Proceedings of the 16th International Conference on Artificial Neural Networks, Athens, Greece, September 10-14, 2006, Part I.
Rights: © Springer-Verlag Berlin Heidelberg 2006
Abstract
Recently, bump formations in attractor neural networks with distance-dependent connectivity have become of increasing interest in biological and computational neuroscience. Although distance-dependent connectivity is common in biological networks, a common fault of these networks is the sharp drop in the number of patterns p that can be remembered when the activity changes from global to bump-like, which effectively renders these networks inefficient.
In this paper we present a bump-based recursive network specially designed to increase its capacity, which is comparable to that of a randomly connected sparse network. To this aim, we tested a selection of 700 natural images on a network with N = 64K neurons and connectivity C per neuron. We show that the capacity of the network is of order C, in accordance with the capacity of a highly diluted network. Preserving the number of connections per neuron, a non-trivial behavior as a function of the connectivity radius is observed. Our results show that the decrease in the capacity of the bumpy network can be avoided.
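The reference point for the reported capacity is the highly diluted Hebbian attractor network, whose capacity scales with the connectivity C per neuron. The following is a minimal sketch of that baseline only, not of the paper's bump-based network or its natural-image experiments: a diluted Hopfield-style network with random connectivity masks and Hebbian couplings, retrieving a stored pattern from a noisy cue. All sizes (N, C, P) are illustrative assumptions far smaller than the paper's N = 64K.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 400  # neurons (toy size; the paper uses N = 64K)
C = 40   # incoming connections per neuron (dilution)
P = 5    # stored patterns, well below the O(C) capacity regime

# Random binary patterns xi^mu in {-1, +1}
xi = rng.choice([-1, 1], size=(P, N))

# Diluted connectivity: each neuron receives exactly C random inputs (no self-coupling)
mask = np.zeros((N, N), dtype=bool)
for i in range(N):
    others = np.delete(np.arange(N), i)
    mask[i, rng.choice(others, size=C, replace=False)] = True

# Hebbian couplings J_ij = (1/C) * sum_mu xi_i^mu xi_j^mu, restricted to the mask
J = (xi.T @ xi) / C
J *= mask

# Synchronous retrieval dynamics starting from pattern 0 with 10% of bits flipped
s = xi[0] * rng.choice([1, -1], size=N, p=[0.9, 0.1])
for _ in range(10):
    s = np.sign(J @ s)
    s[s == 0] = 1  # break ties deterministically

# Overlap m = (1/N) * sum_i s_i xi_i^0; m close to 1 means successful retrieval
overlap = (s @ xi[0]) / N
print(f"overlap with stored pattern: {overlap:.2f}")
```

With the load P/C well below the diluted-network capacity, the dynamics clean up the noisy cue and the overlap approaches 1; pushing P toward O(C) is where retrieval degrades, which is the regime the paper's capacity measurements probe.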
Authors: Nedeltchev Koroutchev, Kostadin; Korutcheva, Elka