dc.contributor.author | Martínez Muñoz, Gonzalo | |
dc.contributor.author | Suárez González, Alberto | |
dc.contributor.other | UAM. Departamento de Ingeniería Informática | es_ES |
dc.date.accessioned | 2015-02-26T18:31:16Z | |
dc.date.available | 2015-02-26T18:31:16Z | |
dc.date.issued | 2007-01-01 | |
dc.identifier.citation | Pattern Recognition Letters 28.1 (2007): 156-165 | en_US |
dc.identifier.issn | 0167-8655 (print) | en_US |
dc.identifier.issn | 1872-7344 (online) | en_US |
dc.identifier.uri | http://hdl.handle.net/10486/664134 | |
dc.description | This is the author’s version of a work that was accepted for publication in Pattern Recognition Letters. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Pattern Recognition Letters 28.1 (2007): 156-165, DOI: 10.1016/j.patrec.2006.06.018 | en_US |
dc.description.abstract | Boosting is used to determine the order in which classifiers are aggregated in a bagging ensemble. Early stopping in the aggregation of the classifiers in the ordered bagging ensemble identifies subensembles that require less memory for storage, classify faster, and can improve the generalization accuracy of the original bagging ensemble. In all the classification problems investigated, pruned ensembles with 20% of the original classifiers show statistically significant improvements over bagging. In problems where boosting is superior to bagging, these improvements are not sufficient to reach the accuracy of the corresponding boosting ensembles. However, ensemble pruning preserves the performance of bagging in noisy classification tasks, where boosting often has larger generalization errors. Therefore, pruned bagging should generally be preferred to complete bagging and, if no information about the level of noise is available, it is a robust alternative to AdaBoost. | en_US |
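The procedure the abstract describes, ordering the members of a bagging ensemble with a boosting-style criterion and then stopping the aggregation early, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the greedy weighted-error ordering and the fixed 20% cutoff are simplified stand-ins for the boosting-based ordering studied in the paper, and the use of scikit-learn decision trees is an assumption.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

rng = np.random.RandomState(0)
X, y = make_classification(n_samples=300, random_state=0)
y = 2 * y - 1  # labels in {-1, +1}

# 1. Build a standard bagging ensemble of unpruned decision trees.
n_trees = 50
trees = []
for _ in range(n_trees):
    idx = rng.randint(0, len(X), len(X))  # bootstrap sample
    trees.append(DecisionTreeClassifier(random_state=0).fit(X[idx], y[idx]))

# 2. Order the trees with an AdaBoost-style greedy pass: at each step
#    pick the remaining tree with the lowest weighted training error,
#    then increase the weight of the examples it misclassifies.
preds = np.array([t.predict(X) for t in trees])  # shape (n_trees, n_samples)
w = np.ones(len(X)) / len(X)
order, remaining = [], list(range(n_trees))
for _ in range(n_trees):
    errs = [(w * (preds[i] != y)).sum() for i in remaining]
    best = remaining[int(np.argmin(errs))]
    order.append(best)
    remaining.remove(best)
    miss = preds[best] != y
    eps = min(max(w[miss].sum(), 1e-10), 1 - 1e-10)
    w[miss] *= (1 - eps) / eps  # boost the misclassified examples
    w /= w.sum()

# 3. Early stopping: keep only the first 20% of the ordered ensemble
#    and classify by unweighted majority vote, as in bagging.
pruned = order[: max(1, n_trees // 5)]
vote = np.sign(preds[pruned].sum(axis=0))
accuracy = (vote == y).mean()
```

The pruned subensemble stores and evaluates one fifth of the trees, which is the source of the memory and speed gains the abstract reports.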
dc.description.sponsorship | The authors acknowledge financial support from the Spanish Dirección General de Investigación, project TIN2004-07676-C02-02. | en_US |
dc.format.extent | 20 pág. | es_ES |
dc.format.mimetype | application/pdf | en |
dc.language.iso | eng | en |
dc.publisher | Elsevier BV | |
dc.relation.ispartof | Pattern Recognition Letters | en_US |
dc.rights | © 2007 Elsevier B.V. All rights reserved | en_US |
dc.subject.other | Bagging | en_US |
dc.subject.other | Boosting | en_US |
dc.subject.other | Decision trees | en_US |
dc.subject.other | Ensemble pruning | en_US |
dc.subject.other | Ensembles | en_US |
dc.subject.other | Machine learning | en_US |
dc.title | Using boosting to prune bagging ensembles | en_US |
dc.type | article | en_US |
dc.subject.eciencia | Informática | es_ES |
dc.relation.publisherversion | http://dx.doi.org/10.1016/j.patrec.2006.06.018 | |
dc.identifier.doi | 10.1016/j.patrec.2006.06.018 | |
dc.identifier.publicationfirstpage | 156 | |
dc.identifier.publicationissue | 1 | |
dc.identifier.publicationlastpage | 165 | |
dc.identifier.publicationvolume | 28 | |
dc.type.version | info:eu-repo/semantics/acceptedVersion | en |
dc.contributor.group | Aprendizaje Automático (ING EPS-001) | es_ES |
dc.rights.cc | Reconocimiento – NoComercial – SinObraDerivada | es_ES |
dc.rights.accessRights | openAccess | en |
dc.facultadUAM | Escuela Politécnica Superior | |