Learning feature selection dependencies in multi-task learning
Entity: UAM. Departamento de Ingeniería Informática
Citation: 27th Annual Conference on Neural Information Processing Systems, held in Lake Tahoe in 2013, pp. 1-9
Note: This is an electronic version of the paper presented at the 27th Annual Conference on Neural Information Processing Systems, held in Lake Tahoe in 2013
Rights: © The authors
A probabilistic model based on the horseshoe prior is proposed for learning dependencies in the process of identifying relevant features for prediction. Exact inference is intractable in this model, but expectation propagation offers an approximate alternative. Because estimating feature selection dependencies with the proposed model may suffer from over-fitting, additional data from a multi-task learning scenario are used for induction. The same model applies in this setting with few modifications. Furthermore, the assumptions made are less restrictive than in other multi-task methods: the different tasks must share feature selection dependencies, but can have different relevant features and model coefficients. Experiments with real and synthetic data show that this model performs better than other multi-task alternatives from the literature. The experiments also show that the model is able to induce suitable feature selection dependencies for the problems considered from the training data alone.
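To illustrate the sparsity-inducing behavior that motivates the horseshoe prior (this is a generic sketch of the prior's standard sampling form, not the paper's expectation propagation algorithm; the function and parameter names are illustrative assumptions):

```python
import numpy as np

def sample_horseshoe_weights(d, tau=1.0, rng=None):
    """Draw one vector of coefficients from a horseshoe prior.

    Each coefficient w_i ~ N(0, (tau * lam_i)^2), where lam_i is a local
    half-Cauchy scale, lam_i ~ C+(0, 1), and tau is the global shrinkage
    scale. Small tau pulls most coefficients toward zero, while the
    heavy-tailed local scales let a few coefficients escape shrinkage.
    """
    rng = np.random.default_rng(rng)
    lam = np.abs(rng.standard_cauchy(d))   # local half-Cauchy scales
    w = rng.normal(0.0, tau * lam)         # conditionally Gaussian weights
    return w, lam

# With a small global scale, most sampled coefficients are near zero
# (irrelevant features) while a few are large (relevant features).
w, lam = sample_horseshoe_weights(1000, tau=0.1, rng=0)
frac_small = np.mean(np.abs(w) < 0.05)
```

The two-level scale structure is what makes the prior attractive for feature selection: relevance is encoded per feature, which is also why dependencies between the local scales can be shared across tasks even when the relevant features themselves differ.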