Show simple item record

dc.contributor.author: Martínez Muñoz, Gonzalo
dc.contributor.author: Suárez González, Alberto
dc.contributor.other: UAM. Departamento de Ingeniería Informática [es_ES]
dc.date.accessioned: 2015-02-26T18:31:16Z
dc.date.available: 2015-02-26T18:31:16Z
dc.date.issued: 2007-01-01
dc.identifier.citation: Pattern Recognition Letters 28.1 (2007): 156-165 [en_US]
dc.identifier.issn: 0167-8655 (print) [en_US]
dc.identifier.issn: 1872-7344 (online) [en_US]
dc.identifier.uri: http://hdl.handle.net/10486/664134
dc.description: This is the author's version of a work that was accepted for publication in Pattern Recognition Letters. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Pattern Recognition Letters 28.1 (2007): 156-165, DOI: 10.1016/j.patrec.2006.06.018 [en_US]
dc.description.abstract: Boosting is used to determine the order in which classifiers are aggregated in a bagging ensemble. Early stopping in the aggregation of the classifiers in the ordered bagging ensemble allows the identification of subensembles that require less memory for storage, classify faster and can improve the generalization accuracy of the original bagging ensemble. In all the classification problems investigated, pruned ensembles with 20% of the original classifiers show statistically significant improvements over bagging. In problems where boosting is superior to bagging, these improvements are not sufficient to reach the accuracy of the corresponding boosting ensembles. However, ensemble pruning preserves the performance of bagging in noisy classification tasks, where boosting often has larger generalization errors. Therefore, pruned bagging should generally be preferred to complete bagging and, if no information about the level of noise is available, it is a robust alternative to AdaBoost. [en_US]
dc.description.sponsorship: The authors acknowledge financial support from the Spanish Dirección General de Investigación, project TIN2004-07676-C02-02. [en_US]
dc.format.extent: 20 pág. [es_ES]
dc.format.mimetype: application/pdf [en]
dc.language.iso: eng [en]
dc.publisher: Elsevier BV
dc.relation.ispartof: Pattern Recognition Letters [en_US]
dc.rights: © 2007 Elsevier B.V. All rights reserved [en_US]
dc.subject.other: Bagging [en_US]
dc.subject.other: Boosting [en_US]
dc.subject.other: Decision trees [en_US]
dc.subject.other: Ensemble pruning [en_US]
dc.subject.other: Ensembles [en_US]
dc.subject.other: Machine learning [en_US]
dc.title: Using boosting to prune bagging ensembles [en_US]
dc.type: article [en_US]
dc.subject.eciencia: Informática [es_ES]
dc.relation.publisherversion: http://dx.doi.org/10.1016/j.patrec.2006.06.018
dc.identifier.doi: 10.1016/j.patrec.2006.06.018
dc.identifier.publicationfirstpage: 156
dc.identifier.publicationissue: 1
dc.identifier.publicationlastpage: 165
dc.identifier.publicationvolume: 28
dc.type.version: info:eu-repo/semantics/acceptedVersion [en]
dc.contributor.group: Aprendizaje Automático (ING EPS-001) [es_ES]
dc.rights.cc: Reconocimiento – NoComercial – SinObraDerivada [es_ES]
dc.rights.accessRights: openAccess [en]
dc.facultadUAM: Escuela Politécnica Superior
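
The abstract above describes ordering the members of a bagging ensemble with a boosting-style criterion and then pruning it by early stopping. The short sketch below is only an illustration of that idea, not the authors' exact algorithm: it fits a standard bagging ensemble of decision trees, greedily reorders the fitted classifiers using an AdaBoost-like reweighting of the training examples, and keeps the first 20% of the ordered classifiers. The scikit-learn estimators, the synthetic dataset and the specific weight-update rule are assumptions made for the example.

```python
# Illustrative sketch only: boosting-based ordering of a bagging ensemble
# followed by early stopping (pruning). Not the authors' exact procedure.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)

# 1. Train an ordinary bagging ensemble of decision trees.
bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=100,
                            random_state=0).fit(X, y)

# 2. Reorder the fitted classifiers: repeatedly pick the remaining classifier
#    with the lowest weighted training error, then downweight the examples it
#    classifies correctly (AdaBoost-style reweighting).
weights = np.full(len(y), 1.0 / len(y))
remaining = list(bagging.estimators_)
ordered = []
while remaining:
    errors = [float(np.dot(weights, clf.predict(X) != y)) for clf in remaining]
    best = int(np.argmin(errors))
    clf = remaining.pop(best)
    err = min(max(errors[best], 1e-10), 0.4999)  # keep the update factor in (0, 1)
    beta = err / (1.0 - err)
    correct = clf.predict(X) == y
    weights = np.where(correct, weights * beta, weights)
    weights /= weights.sum()
    ordered.append(clf)

# 3. Early stopping: keep only the first 20% of the ordered classifiers.
pruned = ordered[: max(1, len(ordered) // 5)]

# Majority-vote prediction with the pruned subensemble.
votes = np.stack([clf.predict(X) for clf in pruned])
y_pred = np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
print("training accuracy of the pruned subensemble:", (y_pred == y).mean())
```

The 20% pruning level simply mirrors the figure quoted in the abstract; in practice the stopping point, like the ordering criterion itself, is a design choice.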

