Please use this identifier to cite or link to this item:
https://repository.iimb.ac.in/handle/2074/11207
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Gaudioso, Manlio | |
dc.contributor.author | Gorgone, Enrico | |
dc.contributor.author | Labbé, Martine | |
dc.contributor.author | Rodríguez-Chía, Antonio M | |
dc.date.accessioned | 2020-03-31T13:08:13Z | - |
dc.date.available | 2020-03-31T13:08:13Z | - |
dc.date.issued | 2017 | |
dc.identifier.issn | 0305-0548 | |
dc.identifier.uri | https://repository.iimb.ac.in/handle/2074/11207 | - |
dc.description.abstract | We discuss a Lagrangian-relaxation-based heuristic for feature selection in the Support Vector Machine (SVM) framework for binary classification. In particular, we embed into our objective function a weighted combination of the L1 and L0 norms of the normal to the separating hyperplane. This yields a Mixed Binary Linear Programming problem which is well suited to a Lagrangian relaxation approach. Exploiting a property of the optimal multiplier setting, we apply a consolidated nonsmooth optimization ascent algorithm to solve the resulting Lagrangian dual. At every ascent step, the proposed approach provides, at low computational cost, both a lower bound on the optimal value and a feasible solution. We present the results of our numerical experiments on some benchmark datasets. | |
dc.publisher | Elsevier | |
dc.subject | Feature Selection | |
dc.subject | Lagrangian Relaxation | |
dc.subject | Nonsmooth Optimization | |
dc.subject | SVM Classification | |
dc.title | Lagrangian relaxation for SVM feature selection | |
dc.type | Journal Article | |
dc.identifier.doi | 10.1016/J.COR.2017.06.001 | |
dc.pages | 137-145 | |
dc.vol.no | Vol.87 | - |
dc.journal.name | Computers and Operations Research | |
Appears in Collections: | 2010-2019 |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
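The abstract's core idea of penalizing the norm of the separating hyperplane's normal so that uninformative features are driven to zero can be illustrated with a minimal sketch. This is not the paper's method (which solves a Mixed Binary Linear Program with an L1/L0 penalty via Lagrangian relaxation and nonsmooth dual ascent); it is a simplified L1-penalized soft-margin SVM trained by subgradient descent, where features whose weights shrink toward zero are treated as deselected. All names and parameter values below are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only: L1-penalized hinge-loss SVM via subgradient descent.
# Minimizes  (C/n) * sum_i max(0, 1 - y_i (w.x_i + b))  +  lam * ||w||_1.
# The L1 term drives weights of uninformative features toward zero, which is
# the feature-selection effect the abstract's L1/L0 penalty aims for.
def l1_svm_subgradient(X, y, C=1.0, lam=0.1, lr=0.01, iters=2000):
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(iters):
        margins = y * (X @ w + b)
        active = margins < 1  # samples currently violating the margin
        # Subgradient of the hinge loss plus the L1 penalty
        g_w = lam * np.sign(w) - (C / n) * (y[active, None] * X[active]).sum(axis=0)
        g_b = -(C / n) * y[active].sum()
        w -= lr * g_w
        b -= lr * g_b
    return w, b

# Toy data: only feature 0 carries the class signal; the rest are noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sign(X[:, 0] + 0.1 * rng.normal(size=200))

w, b = l1_svm_subgradient(X, y)
selected = np.flatnonzero(np.abs(w) > 0.1)  # surviving (selected) features
```

On this toy problem the informative feature keeps a large weight while the noise features are shrunk by the L1 penalty; thresholding |w_j| then plays the role that the binary indicator variables play in the paper's exact formulation.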