Importance Weighted Adversarial Variational Bayes
Entity: UAM. Departamento de Ingeniería Informática
Publisher: Springer Nature
Date: 2020-11-04
Citation: 10.1007/978-3-030-61705-9_31. Hybrid Artificial Intelligent Systems, HAIS. Lecture Notes in Computer Science, Volume 12344. Springer, 2020. 374-386
ISBN: 978-3-030-61705-9
DOI: 10.1007/978-3-030-61705-9_31
Funded by: We acknowledge the use of the facilities of Centro de Computación Científica at UAM and support from the Spanish Plan Nacional I+D+i (grants TIN2016-76406-P, TEC2016-81900-REDT and PID2019-106827GB-I00) and from Comunidad de Madrid (grant PEJ-2017-AI/TIC-6464)
Project: Gobierno de España. TIN2016-76406-P; Gobierno de España. TEC2016-81900-REDT; Gobierno de España. PID2019-106827GB-I00; Comunidad de Madrid. PEJ-2017-AI/TIC-6464
Editor's Version: https://doi.org/10.1007/978-3-030-61705-9_31
Subjects: Adversarial variational Bayes; Generative models; Importance weighted autoencoder; Variational autoencoder; Informática
Note: This version of the article has been accepted for publication, after peer review (when applicable) and is subject to Springer Nature's AM terms of use, but is not the Version of Record and does not reflect post-acceptance improvements, or any corrections. The Version of Record is available online at DOI: 10.1007/978-3-030-61705-9_31
Rights: © 2020 Springer Nature Switzerland AG
Abstract:
Adversarial variational Bayes (AVB) can infer the parameters of a generative model from the data using approximate maximum likelihood. The likelihood of deep generative models is intractable. However, it can be approximated by a lower bound obtained in terms of an approximate posterior distribution q over the latent variables of the data. The closer q is to the actual posterior, the tighter the lower bound is. Therefore, by maximizing the lower bound one should expect to also maximize the likelihood. Traditionally, the approximate distribution q is Gaussian. AVB relaxes this limitation and allows for flexible distributions that may lack a closed-form probability density function. Implicit distributions obtained by letting a source of Gaussian noise go through a deep neural network are examples of these distributions. Here, we combine AVB with the importance weighted autoencoder, a technique that has been shown to provide a tighter lower bound on the marginal likelihood. This is expected to lead to a more accurate parameter estimation of the generative model via approximate maximum likelihood. We have evaluated the proposed method on three datasets: MNIST, Fashion MNIST, and Omniglot. The experiments show that the proposed method improves the test log-likelihood of a generative model trained using AVB.
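The importance weighted bound the abstract refers to can be illustrated on a toy model. Below is a minimal sketch (not the authors' code) of the K-sample bound L_K = E[log (1/K) Σ_k p(x, z_k)/q(z_k|x)] from the importance weighted autoencoder, using a conjugate Gaussian model where the exact marginal log-likelihood is known: p(z) = N(0, 1), p(x|z) = N(z, 1), hence p(x) = N(0, 2). Taking the prior as the proposal q(z|x) = p(z), the importance weights reduce to the likelihood p(x|z). The model, sample sizes, and `iwae_bound` helper are all illustrative assumptions.

```python
import numpy as np
from scipy.special import logsumexp
from scipy.stats import norm

rng = np.random.default_rng(0)

def iwae_bound(x, K, n_batches=2000):
    """Monte Carlo estimate of L_K = E[log (1/K) sum_k p(x, z_k) / q(z_k | x)].

    Here q(z | x) = p(z) = N(0, 1), so each log weight is log p(x | z_k).
    """
    z = rng.standard_normal((n_batches, K))   # z_k ~ q(z | x)
    log_w = norm.logpdf(x, loc=z)             # log importance weights
    # Average the stably computed log-mean-weight over independent batches.
    return float(np.mean(logsumexp(log_w, axis=1) - np.log(K)))

x = 0.5
true_ll = float(norm.logpdf(x, scale=np.sqrt(2.0)))  # exact log p(x)
l1 = iwae_bound(x, K=1)    # K = 1 recovers the standard ELBO
l50 = iwae_bound(x, K=50)  # larger K gives a tighter lower bound
print(l1, l50, true_ll)
```

With K = 1 the estimate is the usual variational lower bound; as K grows, L_K increases toward log p(x) (Burda et al.), which is the property the paper exploits to sharpen AVB's maximum-likelihood objective.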
Authors: Gómez-Sancho, Marta; Hernández Lobato, Daniel
Related items
Leukocyte profile variation in Dupont's Lark (Chersophilus duponti) in Spain and Morocco
Bustillo de la Rosa, Daniel; Calero-Riestra, María; Pérez Granados, Cristian; Mereu, Silvia; Morales Prieto, Manuel Borja; Traba Díaz, Juan; López-Iborra, Germán M.; Barrero Diego, Adrián; Gómez Catasus, Julia; Reverter Cid, Margarita; Viñuela, Javier; Oñate Rubalcaba, Juan José; Hervás Bengoechea, Israel; Hernández Justribó, Jorge; García, Jesús T.
2021-12-26