Please use this identifier to cite or link to this item:
https://repository.iimb.ac.in/handle/2074/10805
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Frangioni, Antonio | |
dc.contributor.author | Gendron, Bernard | |
dc.contributor.author | Gorgone, Enrico | |
dc.date.accessioned | 2020-03-12T11:55:17Z | - |
dc.date.available | 2020-03-12T11:55:17Z | - |
dc.date.issued | 2018 | |
dc.identifier.issn | 1862-4472 | |
dc.identifier.uri | https://repository.iimb.ac.in/handle/2074/10805 | - |
dc.description.abstract | We present and computationally evaluate a variant of Nesterov's fast gradient method that can exploit information, even approximate, about the optimal value of the problem. Such information is available in several applications, including the computation of bounds for hard integer programs. We show that dynamically adjusting the smoothness parameter of the algorithm using this information improves its convergence profile in practice (an illustrative sketch follows this table). | |
dc.publisher | Springer | |
dc.subject | Fast gradient method | |
dc.subject | Lagrangian relaxation | |
dc.subject | Convex optimization | |
dc.title | Dynamic smoothness parameter for fast gradient methods | |
dc.type | Journal Article | |
dc.identifier.doi | https://doi.org/10.1007/s11590-017-1168-z | |
dc.pages | pp. 43-53 | |
dc.vol.no | Vol.12 | - |
dc.issue.no | Iss.1 | - |
dc.journal.name | Optimization Letters | |
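The abstract describes tying Nesterov's smoothness parameter to (approximate) knowledge of the optimal value. The following is a minimal toy sketch of that general idea, not the paper's actual update rule: it runs an accelerated gradient method on a Huber-type smoothing of |x| and refreshes the smoothness parameter each iteration from the gap to an assumed target value. The names `f_target` and `beta`, the toy objective, and the specific rule `mu = beta * gap` are all illustrative assumptions.

```python
import numpy as np

def smoothed_abs_grad(x, mu):
    # Gradient of the Huber-type (Nesterov) smoothing of |x|:
    # x/mu on the quadratic region |x| <= mu, sign(x) outside it.
    return np.clip(x / mu, -1.0, 1.0)

def fast_gradient_dynamic_mu(x0, f_target, beta=0.5, iters=200):
    """Accelerated (Nesterov/FISTA-style) gradient on a smoothed |x|,
    with the smoothness parameter mu refreshed each iteration from the
    gap between f(x_k) and an (approximate) optimal value f_target."""
    x = y = float(x0)
    t = 1.0
    for _ in range(iters):
        gap = max(abs(x) - f_target, 1e-12)  # estimated gap f(x_k) - f*
        mu = beta * gap                      # dynamic smoothness parameter
        L = 1.0 / mu                         # Lipschitz constant of grad f_mu
        x_new = y - (1.0 / L) * smoothed_abs_grad(y, mu)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # momentum step
        x, t = x_new, t_new
    return x

# Toy run: minimize |x| starting from x0 = 5, assuming the optimal
# value 0 is known; iterates approach 0 as mu shrinks with the gap.
print(fast_gradient_dynamic_mu(x0=5.0, f_target=0.0))
```

As the gap shrinks, so does mu, so the smoothed model tracks the nonsmooth objective ever more tightly near the optimum; this is the intuition behind the dynamic choice evaluated in the article, where the target value comes from, e.g., bounds for hard integer programs.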
Appears in Collections: | 2010-2019 |
Files in This Item:
File | Size | Format | |
---|---|---|---|
Gorgone_OL_2018_Vol.12_Iss.1.pdf | 861.12 kB | Adobe PDF | View/Open |