Minimizing nonsmooth DC functions via successive DC piecewise-affine approximations
- Authors: Gaudioso, Manlio; Giallombardo, Giovanni; Miglionico, Giovanna; Bagirov, Adil
- Date: 2018
- Type: Text, Journal article
- Relation: Journal of Global Optimization Vol. 71, no. 1 (2018), p. 37-55
- Full Text: false
- Reviewed:
- Description: We introduce a proximal bundle method for the numerical minimization of a nonsmooth difference-of-convex (DC) function. Exploiting some classic ideas coming from cutting-plane approaches for the convex case, we iteratively build two separate piecewise-affine approximations of the component functions, grouping the corresponding information in two separate bundles. In the bundle of the first component, only information related to points close to the current iterate is maintained, while the second bundle only refers to a global model of the corresponding component function. We combine the two convex piecewise-affine approximations and generate a DC piecewise-affine model, which can also be seen as the pointwise maximum of several concave piecewise-affine functions. Such a nonconvex model is locally approximated by means of an auxiliary quadratic program, whose solution is used either to certify approximate criticality or to generate a descent search direction, along with a predicted reduction, that is next explored in a line-search setting. To improve the approximation properties at points that are far from the current iterate, a supplementary quadratic program is also introduced to generate an alternative, more promising search direction. We discuss the main convergence issues of the line-search-based proximal bundle method, and provide computational results on a set of academic benchmark test problems. © 2017, Springer Science+Business Media, LLC.
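The DC piecewise-affine model described in the abstract can be illustrated with a minimal sketch: each component function is approximated by a cutting-plane model (the pointwise maximum of affine pieces built from a bundle of point/value/subgradient triples), and the DC model is the difference of the two convex models. The function names, bundle layout, and example data below are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch of a one-dimensional DC piecewise-affine model: each bundle holds
# (x_i, f(x_i), g_i) triples, where g_i is a subgradient of f at x_i.

def cutting_plane_model(bundle, x):
    """Convex cutting-plane model: pointwise max of the affine pieces
    v_i + g_i * (x - x_i) over the bundle."""
    return max(v + g * (x - xi) for xi, v, g in bundle)

def dc_model(bundle1, bundle2, x):
    """DC piecewise-affine model: convex model of the first component
    minus convex model of the second component."""
    return cutting_plane_model(bundle1, x) - cutting_plane_model(bundle2, x)

# Illustrative data: f1(x) = |x| linearized at x = -1 and x = 1,
# f2(x) = 0.5 * x**2 linearized at x = 0 and x = 1.
bundle1 = [(-1.0, 1.0, -1.0), (1.0, 1.0, 1.0)]   # (x_i, f1(x_i), g_i)
bundle2 = [(0.0, 0.0, 0.0), (1.0, 0.5, 1.0)]     # (x_i, f2(x_i), g_i)
print(dc_model(bundle1, bundle2, 0.5))  # model value at x = 0.5
```

Because each component model is a max of affine functions, the difference is exactly the pointwise maximum of concave piecewise-affine functions mentioned in the abstract; the paper's auxiliary quadratic program operates on a local approximation of this model, which is beyond this sketch.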
Introduction
- Authors: Bagirov, Adil; Gaudioso, Manlio; Karmitsa, Napsu; Mäkelä, Marko; Taheri, Sona
- Date: 2020
- Type: Text, Book chapter
- Relation: Numerical Nonsmooth Optimization: State of the Art Algorithms p. 1-16
- Full Text: false
- Reviewed:
- Description: Nonsmooth optimization (NSO) is among the most challenging tasks in the field of mathematical programming. It addresses optimization problems whose objective and/or constraint functions have discontinuous gradients. NSO problems arise in many real-life applications. Moreover, some smooth optimization techniques, such as decomposition methods, dual formulations, and exact penalty methods, may lead us to solve NSO problems that are either smaller in dimension or simpler in structure. In addition, some optimization problems may be analytically smooth but numerically nonsmooth. This is the case, for instance, with noisy input data and so-called stiff problems, which are numerically unstable and behave like nonsmooth problems. © Springer Nature Switzerland AG 2020.
Final words
- Authors: Bagirov, Adil; Gaudioso, Manlio; Karmitsa, Napsu; Mäkelä, Marko; Taheri, Sona
- Date: 2020
- Type: Text, Book chapter
- Relation: Numerical Nonsmooth Optimization: State of the Art Algorithms p. 693-694
- Full Text: false
- Reviewed: