Alpha-divergence minimization for deep Gaussian processes
Entity: UAM. Departamento de Ingeniería Informática
DOI: 10.1016/j.ijar.2022.08.003. International Journal of Approximate Reasoning 150 (2022): 139-171
Funded by: The authors gratefully acknowledge the use of the facilities of Centro de Computación Científica (CCC) at Universidad Autónoma de Madrid. The authors also acknowledge financial support from Spanish Plan Nacional I+D+i, Ministerio de Ciencia e Innovación, grant PID2019-106827GB-I00 / AEI / 10.13039/501100011033
Project: Gobierno de España. PID2019-106827GB-I00
Subjects: Deep Gaussian processes; Expectation propagation; α-divergences; Approximate inference; Variational inference; Computer Science
Rights: © 2022 The Author(s)
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International license.
This paper proposes the minimization of α-divergences for approximate inference in the context of deep Gaussian processes (DGPs). The proposed method can be considered a generalization of variational inference (VI) and expectation propagation (EP), two methods previously used for approximate inference in DGPs, both of which are based on the minimization of a Kullback-Leibler divergence. The proposed method is based on a scalable version of power expectation propagation, which introduces an extra parameter α specifying the targeted α-divergence to be optimized. In particular, the method recovers the VI solution when α → 0 and the EP solution when α → 1. An exhaustive experimental evaluation shows that the minimization of α-divergences via the proposed method is feasible in DGPs and that choosing intermediate values of α between 0 and 1 can give better results in some problems. This means that one can improve on the results of VI and EP when training DGPs. Importantly, the proposed method allows for stochastic optimization techniques, making it able to address datasets with several million instances.
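The interpolation described in the abstract can be illustrated concretely. The sketch below is a hypothetical NumPy illustration (not the paper's code): it evaluates the Amari α-divergence, D_α(p‖q) = (1 − ∫ pᵅ q^(1−α) dx) / (α(1−α)), between two univariate Gaussians by numerical quadrature, and checks that α close to 1 approaches KL(p‖q), the divergence targeted by EP, while α close to 0 approaches KL(q‖p), the divergence targeted by VI.

```python
import numpy as np

def gauss(x, m, s):
    """Univariate Gaussian density N(m, s^2) evaluated on the grid x."""
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

def alpha_divergence(p, q, x, alpha):
    """Amari alpha-divergence D_alpha(p||q), approximated on a uniform grid."""
    dx = x[1] - x[0]
    z = np.sum(p ** alpha * q ** (1.0 - alpha)) * dx  # ∫ p^a q^(1-a) dx
    return (1.0 - z) / (alpha * (1.0 - alpha))

def kl(m1, s1, m2, s2):
    """Closed-form KL(N(m1, s1^2) || N(m2, s2^2)) used as a reference."""
    return np.log(s2 / s1) + (s1 ** 2 + (m1 - m2) ** 2) / (2.0 * s2 ** 2) - 0.5

x = np.linspace(-20.0, 20.0, 200001)
p = gauss(x, 0.0, 1.0)  # stand-in for an exact posterior factor
q = gauss(x, 1.0, 2.0)  # stand-in for the approximation

# alpha near 1 approaches EP's objective KL(p||q) ...
print(alpha_divergence(p, q, x, 0.999), kl(0.0, 1.0, 1.0, 2.0))
# ... while alpha near 0 approaches VI's objective KL(q||p).
print(alpha_divergence(p, q, x, 0.001), kl(1.0, 2.0, 0.0, 1.0))
```

Intermediate values such as α = 0.5 (the Hellinger-distance case) yield objectives that trade off the mass-covering behavior of EP against the mode-seeking behavior of VI, which is the regime the paper's experiments explore.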