Mutual information and topology 1: Asymmetric neural network
Entity: UAM. Departamento de Ingeniería Informática
Publisher: Springer Berlin Heidelberg
Date: 2004
Citation: Advances in Neural Networks – ISNN 2004: International Symposium on Neural Networks, Dalian, China, August 2004, Proceedings, Part I. Lecture Notes in Computer Science, Volume 3173. Springer, 2004. 14-19
ISSN: 0302-9743 (print); 1611-3349 (online)
ISBN: 978-3-540-22841-7 (print); 978-3-540-28647-9 (online)
DOI: 10.1007/978-3-540-28647-9_3
Funded by: Supported by MCyT-Spain BFI-2003-07276 and TIC 2002-572-C02
Editor's Version: http://dx.doi.org/10.1007/978-3-540-28647-9_3
Subjects: Computation by Abstract Devices; Programming Techniques; Algorithm Analysis and Problem Complexity; Artificial Intelligence; Computer Communication Networks; Discrete Mathematics in Computer Science; Computer Science
Note: Proceedings of the International Symposium on Neural Networks, Dalian, China, August 2004. The final publication is available at Springer via http://dx.doi.org/10.1007/978-3-540-28647-9_3
Rights: © Springer-Verlag Berlin Heidelberg 2004
Abstract
An infinite-range neural network works as an associative memory device if both its learning storage and attractor abilities are large enough. This work deals with the search for an optimal topology, varying the (small-world) parameters: the average connectivity γ ranges from a fully linked to an extremely diluted network, and the randomness ω ranges from purely nearest-neighbor links to a completely random network. The network capacity is measured by the mutual information, MI, between patterns and retrieval states. It is found that MI is optimized at a certain value γ_o for a given 0 < ω < 1 if the network is asymmetric.
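The two quantities in the abstract can be illustrated with a short sketch: a directed (asymmetric) small-world connectivity matrix controlled by a rewiring probability ω and an in-degree K (so γ = K/N), and the mutual information between a stored binary pattern and a retrieval state, estimated from their joint empirical distribution. This is a minimal illustration under assumed conventions (ring-based rewiring, ±1 units); the paper's exact dilution procedure and MI definition may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def small_world_adjacency(N, K, omega):
    """Directed (asymmetric) small-world connectivity: each neuron receives
    K inputs; each link is rewired to a random neuron with probability omega,
    otherwise it comes from a nearest neighbor on a ring. (Illustrative
    convention; duplicate rewired links may slightly reduce the in-degree.)"""
    A = np.zeros((N, N), dtype=bool)
    for i in range(N):
        # K nearest neighbors on the ring, split between the two sides
        neigh = [(i + d) % N for d in range(1, K // 2 + 1)]
        neigh += [(i - d) % N for d in range(1, K - len(neigh) + 1)]
        for j in neigh:
            if rng.random() < omega:          # rewire with probability omega
                j = rng.integers(N)
            A[i, j] = True
    np.fill_diagonal(A, False)                # no self-connections
    return A

def mutual_information(xi, s):
    """MI (in bits) between pattern bits xi and retrieval-state bits s,
    both arrays over {-1, +1}, from the joint empirical distribution."""
    mi = 0.0
    for a in (-1, 1):
        for b in (-1, 1):
            p_ab = np.mean((xi == a) & (s == b))
            p_a, p_b = np.mean(xi == a), np.mean(s == b)
            if p_ab > 0:
                mi += p_ab * np.log2(p_ab / (p_a * p_b))
    return mi

# Usage: a retrieval state that matches the pattern except for 10% flipped
# bits keeps most of the ~1 bit of information per unit.
N, K, omega = 200, 20, 0.3
A = small_world_adjacency(N, K, omega)        # gamma = K / N = 0.1
xi = rng.choice([-1, 1], size=N)              # stored pattern
flips = rng.choice([-1, 1], size=N, p=[0.1, 0.9])
s = xi * flips                                # noisy retrieval state
print(mutual_information(xi, s))
```

Scanning γ (via K) and ω with such a construction, and measuring MI after retrieval dynamics, is the kind of parameter sweep the abstract describes; the optimum γ_o reported in the paper refers to the full network dynamics, not this simplified estimate.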
Authors: Domínguez Carreta, David Renato; Nedeltchev Koroutchev, Kostadin; Serrano Jerez, Eduardo; Rodríguez Ortiz, Francisco Borja