Mutual information and topology 1: Asymmetric neural network
Entity: UAM. Departamento de Ingeniería Informática
Publisher: Springer Berlin Heidelberg
DOI: 10.1007/978-3-540-28647-9_3
Citation: Advances in Neural Networks – ISNN 2004: International Symposium on Neural Networks, Dalian, China, August 2004, Proceedings, Part I. Lecture Notes in Computer Science, Volume 3173. Springer, 2004. 14-19
ISSN: 0302-9743 (print); 1611-3349 (online)
ISBN: 978-3-540-22841-7 (print); 978-3-540-28647-9 (online)
Funded by: MCyT-Spain BFI-2003-07276 and TIC 2002-572-C02
Subjects: Computation by Abstract Devices; Programming Techniques; Algorithm Analysis and Problem Complexity; Artificial Intelligence; Computer Communication Networks; Discrete Mathematics in Computer Science; Computer Science
Note: Proceedings of the International Symposium on Neural Networks, Dalian, China, August 2004
The final publication is available at Springer via http://dx.doi.org/10.1007/978-3-540-28647-9_3
Rights: © Springer-Verlag Berlin Heidelberg 2004
An infinite-range neural network works as an associative-memory device if both its storage capacity and its attractor abilities are large enough. This work deals with the search for an optimal topology by varying the small-world parameters: the average connectivity γ ranges from a fully connected to an extremely diluted network, and the randomness ω ranges from purely neighbor links to a completely random network. The network capacity is measured by the mutual information, MI, between the stored patterns and the retrieval states. It is found that, if the network is asymmetric, MI is optimized at a certain connectivity γ_o for any given 0 < ω < 1.
Google Scholar: Domínguez Carreta, David Renato - Nedeltchev Koroutchev, Kostadin - Serrano Jerez, Eduardo - Rodríguez Ortiz, Francisco Borja