Classification of functional data: a comparative study
Entity: UAM. Departamento de Ingeniería Informática; UAM. Departamento de Matemáticas
Publisher: IEEE
Date: 2023-03-23
Citation: 21st IEEE International Conference on Machine Learning and Applications (ICMLA), Nassau, Bahamas, 2022
ISBN: 978-1-6654-6283-9
DOI: 10.1109/ICMLA55696.2022.00143
Funded by: The authors acknowledge financial support from the Spanish Ministry of Education and Innovation, projects PID2019-106827GB-I00 / AEI / 10.13039/501100011033 and PID2019-109387GB-I00, and from grant FPU18/00047. We acknowledge the computer resources provided by CCC-UAM.
Project: Gobierno de España. PID2019-106827GB-I00; Gobierno de España. PID2019-109387GB-I00
Editor's Version: https://doi.org/10.1109/ICMLA55696.2022.00143
Subjects: classification; functional data analysis; functional k-NN; Mahalanobis distance; Informática
Rights: © 2022 IEEE
Abstract:
In functional classification problems, the data available for learning are characterized by functions rather than by vectors of attributes. Consequently, multivariate classifiers need to be adapted, and new types of classifiers designed, to take into account the special characteristics of these data. In this work, an empirical evaluation of different classification methods is carried out on a variety of functional classification problems from different areas of application. The classifiers considered include nearest centroids with functional means as class prototypes and functional distances, standard multivariate classifiers used in combination with a variable selection method, classifiers based on the notion of functional depth, a functional version of k-nearest neighbors (k-NN), and random forest. From the results of this comparative study, one concludes that random forest is among the best off-the-shelf classifiers not only for multivariate but also for functional classification problems. The variable selection method used in combination with a quadratic discriminant achieves fairly good overall accuracy using only a small set of impact points. This dimensionality reduction leads to improvements in both efficiency and interpretability. Finally, a functional version of k-NN that uses the α-Mahalanobis distance exhibits consistently good predictive performance in all the problems considered. This robustness makes k-NN a good benchmark for functional classification.
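The functional k-NN approach described in the abstract can be illustrated for curves sampled on a common grid. The sketch below is not the authors' implementation: it uses a regularized Mahalanobis-type distance that shrinks each principal-component term of the difference between curves by λ/(λ+α)², which is one standard way to regularize this distance for functional data; the function name, the default α, and the majority-vote tie-breaking are illustrative choices.

```python
import numpy as np

def knn_functional(X_train, y_train, X_test, k=5, alpha=1.0):
    """k-NN for curves discretized on a shared grid (one row per curve).

    Illustrative sketch: the distance weights each principal-component
    term of the difference by lam / (lam + alpha)**2, a common way to
    regularize a Mahalanobis-type distance for functional data
    (`alpha` is the regularization parameter).
    """
    Xc = X_train - X_train.mean(axis=0)
    cov = Xc.T @ Xc / (len(X_train) - 1)        # empirical covariance on the grid
    lam, psi = np.linalg.eigh(cov)              # eigenpairs of the covariance
    lam = np.clip(lam, 0.0, None)               # guard against round-off negatives
    w = lam / (lam + alpha) ** 2                # regularized inverse eigenvalues
    preds = []
    for x in X_test:
        scores = (X_train - x) @ psi            # PC scores of the differences
        d2 = (scores**2 * w).sum(axis=1)        # squared regularized distances
        nn = np.argsort(d2)[:k]                 # indices of the k nearest curves
        labels, counts = np.unique(y_train[nn], return_counts=True)
        preds.append(labels[np.argmax(counts)]) # majority vote among neighbors
    return np.array(preds)
```

As α grows, large-variance directions are progressively downweighted, which is what makes the distance well defined for densely discretized (near-singular) curves; α = 0 would recover the ordinary Mahalanobis distance, which is ill-conditioned in this setting.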
Authors: Ramos Carreño, Carlos; Torrecilla Noguerales, José Luis; Suárez González, Alberto