Show simple item record

dc.contributor.author: Nedeltchev Koroutchev, Kostadin
dc.contributor.author: Korutcheva, Elka
dc.contributor.other: UAM. Departamento de Ingeniería Informática [es_ES]
dc.date.accessioned: 2015-05-13T15:44:47Z
dc.date.available: 2015-05-13T15:44:47Z
dc.date.issued: 2006
dc.identifier.citation: Artificial Neural Networks – ICANN 2006: 16th International Conference, Athens, Greece, September 10-14, 2006. Proceedings, Part I. Lecture Notes in Computer Science, Volume 4131. Springer, 2006. 234-243. [en_US]
dc.identifier.isbn: 978-3-540-38625-4 (print) [en_US]
dc.identifier.isbn: 978-3-540-38627-8 (online) [en_US]
dc.identifier.issn: 0302-9743 (print) [en_US]
dc.identifier.issn: 1611-3349 (online) [en_US]
dc.identifier.uri: http://hdl.handle.net/10486/666193
dc.description: The final publication is available at Springer via http://dx.doi.org/10.1007/11840817_25 [en_US]
dc.description: Proceedings of the 16th International Conference on Artificial Neural Networks, Athens, Greece, September 10-14, 2006, Part I [en_US]
dc.description.abstract: Recently, bump formations in attractor neural networks with distance-dependent connectivities have become of increasing interest in biological and computational neuroscience. Although distance-dependent connectivity is common in biological networks, a common fault of these networks is the sharp drop in the number of patterns p that can be remembered when the activity changes from global to bump-like, which makes these networks rather ineffective. In this paper we present a bump-based recursive network specially designed to increase its capacity, which is comparable to that of a randomly connected sparse network. To this aim, we have tested a selection of 700 natural images on a network of N = 64K neurons with connectivity C per neuron. We have shown that the capacity of the network is of order C, in accordance with the capacity of a highly diluted network. Preserving the number of connections per neuron, a non-trivial behavior as a function of the connectivity radius has been observed. Our results show that the drop in capacity of the bumpy network can be avoided. [en_US]
dc.description.sponsorship: The authors acknowledge the financial support from the Spanish Grants DGI.M. CyT. FIS2005-1729, Plan de Promoción de la Investigación UNED and TIN 2004-07676-G01-01. We also thank David Dominguez for the fruitful discussion of the manuscript. [en_US]
dc.format.extent: 11 pág. [es_ES]
dc.format.mimetype: application/pdf [en]
dc.language.iso: eng [en]
dc.publisher: Springer Berlin Heidelberg
dc.relation.ispartof: Lecture Notes in Computer Science [en_US]
dc.rights: © Springer-Verlag Berlin Heidelberg 2006
dc.subject.other: Computation by Abstract Devices [en_US]
dc.subject.other: Pattern Recognition [en_US]
dc.subject.other: Information Systems Applications [en_US]
dc.subject.other: Database Management [en_US]
dc.title: Improved storage capacity of Hebbian learning attractor neural network with bump formations [en_US]
dc.type: conferenceObject [en]
dc.type: bookPart [en]
dc.subject.eciencia: Informática [es_ES]
dc.relation.publisherversion: http://dx.doi.org/10.1007/11840817_25
dc.identifier.doi: 10.1007/11840817_25
dc.identifier.publicationfirstpage: 234
dc.identifier.publicationlastpage: 243
dc.identifier.publicationvolume: 4131
dc.relation.eventdate: September 10-14, 2006 [en_US]
dc.relation.eventnumber: 16
dc.relation.eventplace: Athens (Greece) [en_US]
dc.relation.eventtitle: 16th International Conference on Artificial Neural Networks, ICANN 2006 [en_US]
dc.type.version: info:eu-repo/semantics/acceptedVersion [en]
dc.contributor.group: Aprendizaje Automático (ING EPS-001) [es_ES]
dc.rights.accessRights: openAccess [en]
dc.facultadUAM: Escuela Politécnica Superior
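The abstract describes a Hebbian attractor network whose storage capacity is of order C, the number of connections per neuron, as in a highly diluted network. The following is a minimal sketch of that baseline setup only — the standard diluted Hopfield model with Hebbian storage and random (not distance-dependent, not bump-forming) connectivity. All parameter values (N, C, P, the noise level) are illustrative and much smaller than the paper's N = 64K; none of the names or choices below come from the paper itself.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 500   # neurons (illustrative; the paper uses N = 64K)
C = 50    # incoming connections per neuron
P = 5     # stored patterns; for diluted networks capacity is of order C

# Random binary (+/-1) patterns to store.
patterns = rng.choice([-1, 1], size=(P, N))

# Diluted connectivity: each neuron receives C random inputs (no self-loops).
mask = np.zeros((N, N), dtype=bool)
for i in range(N):
    inputs = rng.choice([j for j in range(N) if j != i], size=C, replace=False)
    mask[i, inputs] = True

# Hebbian weights, kept only on the existing connections.
W = (patterns.T @ patterns).astype(float) / C
np.fill_diagonal(W, 0.0)
W[~mask] = 0.0

def retrieve(state, steps=20):
    """Synchronous dynamics: s_i <- sign(sum_j W_ij s_j)."""
    for _ in range(steps):
        state = np.where(W @ state >= 0, 1, -1)
    return state

# Start from pattern 0 with 10% of its bits flipped.
noisy = patterns[0] * np.where(rng.random(N) < 0.1, -1, 1)
recovered = retrieve(noisy)
overlap = (recovered @ patterns[0]) / N  # overlap m = 1 means perfect recall
print(f"overlap with stored pattern: {overlap:.2f}")
```

At load P/C well below the critical value, the crosstalk noise per neuron is small (standard deviation about sqrt((P-1)/C)), so the dynamics clean up the flipped bits and the overlap approaches 1. The paper's contribution, by contrast, is preserving this order-C capacity when the connectivity is distance-dependent and the activity forms bumps.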

