Modified Frank–Wolfe algorithm for enhanced sparsity in support vector machine classifiers
Entity: UAM, Departamento de Ingeniería Informática
DOI: 10.1016/j.neucom.2018.08.049. Neurocomputing 320 (2018): 47–59
Funded by: The authors would like to thank the following organizations. • EU: The research leading to these results has received funding from the European Research Council under the European Union's ERC grant A-DATADRIVE-B (290923). This paper reflects only the authors' views; the Union is not liable for any use that may be made of the information it contains. • Research Council KUL: GOA/10/09 MaNet, CoE PFV/10/002 (OPTEC), BIL12/11T
Subjects: Frank–Wolfe; Lasso; Sparsity; Support Vector Machines; Computer Science
Rights: © 2018 Elsevier B.V.
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International license.
This work proposes a new algorithm for training a re-weighted ℓ2 Support Vector Machine (SVM), inspired by the re-weighted Lasso algorithm of Candès et al. and by the equivalence between Lasso and SVM recently shown by Jaggi. In particular, the margin required for each training vector is set independently, defining a new weighted SVM model. These weights are chosen to be binary and are adapted automatically during training, yielding a variation of the Frank–Wolfe optimization algorithm with essentially the same computational complexity per iteration as the original algorithm. As shown experimentally, the algorithm is computationally cheaper to apply, since it requires fewer iterations to converge, and it produces models with a sparser representation in terms of support vectors that are more stable with respect to the selection of the regularization hyper-parameter.
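To make the setting concrete, the sketch below shows a plain Frank–Wolfe loop for the simplex-constrained quadratic problem to which, by Jaggi's equivalence, the SVM dual can be reduced. This is a hedged illustration, not the paper's modified algorithm: the function name `frank_wolfe_svm`, the matrix `A` (columns are label-signed training vectors), and the iteration budget are all illustrative assumptions; the paper's contribution of binary per-vector weights adapted during training is not implemented here.

```python
import numpy as np

def frank_wolfe_svm(A, n_iter=200):
    """Minimize ||A @ alpha||^2 over the probability simplex with
    standard Frank-Wolfe steps (illustrative sketch of the baseline
    algorithm, not the paper's re-weighted variant).

    A : (d, n) array whose columns are (label-signed) training vectors.
    """
    d, n = A.shape
    alpha = np.full(n, 1.0 / n)            # start at the simplex barycenter
    for t in range(n_iter):
        grad = 2.0 * A.T @ (A @ alpha)     # gradient of ||A alpha||^2
        i = int(np.argmin(grad))           # linear minimization over the
                                           # simplex picks a single vertex e_i
        gamma = 2.0 / (t + 2.0)            # classical step-size schedule
        alpha = (1.0 - gamma) * alpha      # move toward the chosen vertex
        alpha[i] += gamma
    return alpha
```

Each iteration touches a single vertex of the simplex, which is why Frank–Wolfe iterates tend to be sparse in the dual coefficients, i.e. in the number of support vectors; the paper's binary weights aim to strengthen exactly this effect.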