A note on primal-dual stability in infinite linear programming
- Authors: Goberna, Miguel; López, Marco; Ridolfi, Andrea; Vera de Serio, Virginia
- Date: 2020
- Type: Text, Journal article
- Relation: Optimization Letters Vol. 14, no. 8 (2020), p. 2247-2263
- Full Text: false
- Reviewed:
- Description: In this note we analyze the simultaneous preservation of the consistency (and of the inconsistency) of linear programming problems posed in infinite-dimensional Banach spaces, and of their corresponding dual problems, under sufficiently small perturbations of the data. We consider seven different scenarios associated with the different possibilities of perturbation of the data (the objective functional, the constraint functionals, and the right-hand-side function), i.e., which of them are known and remain fixed, and which can be perturbed because of their uncertainty. The results obtained allow us to give necessary and sufficient conditions for the coincidence of the optimal values of both problems and for the stability of the duality gap under the same type of perturbations. Substantial differences from the finite-dimensional case arise, owing to the distinct topological properties of cones in finite- and infinite-dimensional Banach spaces. © 2020, Springer-Verlag GmbH Germany, part of Springer Nature.
- Description: Funding details: Australian Research Council, ARC, DP180100602: http://purl.org/au-research/grants/arc/DP180100602
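The coincidence of primal and dual optimal values discussed in this abstract can be illustrated in the finite-dimensional setting the paper contrasts against. The sketch below is a toy example (not taken from the paper): it checks hand-computed optimal points of a small LP pair and verifies that the duality gap at those points is zero.

```python
# Toy finite-dimensional LP pair (illustrative only; not from the paper):
#   Primal: min  x1 + 2*x2   s.t.  x1 + x2 >= 1,  x1, x2 >= 0
#   Dual:   max  y           s.t.  y <= 1,  y <= 2,  y >= 0

def primal_feasible(x1, x2):
    return x1 + x2 >= 1 and x1 >= 0 and x2 >= 0

def dual_feasible(y):
    return 0 <= y <= min(1, 2)

def duality_gap(x, y):
    # primal value minus dual value; weak duality says this is >= 0
    return (x[0] + 2 * x[1]) - y

x_star = (1.0, 0.0)   # hand-computed primal optimum, value 1
y_star = 1.0          # hand-computed dual optimum, value 1

assert primal_feasible(*x_star) and dual_feasible(y_star)
print(duality_gap(x_star, y_star))  # 0.0: no duality gap here
```

In infinite-dimensional Banach spaces this coincidence can fail and, as the abstract notes, its stability under data perturbations is exactly what the paper characterizes.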
Aggregate subgradient method for nonsmooth DC optimization
- Authors: Bagirov, Adil; Taheri, Sona; Joki, Kaisa; Karmitsa, Napsu; Mäkelä, Marko
- Date: 2021
- Type: Text, Journal article
- Relation: Optimization Letters Vol. 15, no. 1 (2021), p. 83-96
- Relation: http://purl.org/au-research/grants/arc/DP190100580
- Full Text:
- Reviewed:
- Description: The aggregate subgradient method is developed for solving unconstrained nonsmooth difference-of-convex (DC) optimization problems. The proposed method shares some similarities with both the subgradient and the bundle methods. Aggregate subgradients are defined as a convex combination of subgradients computed at null steps between two serious steps. At each iteration, search directions are found using only two subgradients: the aggregate subgradient and a subgradient computed at the current null step. It is proved that the proposed method converges to a critical point of the DC optimization problem and that the number of null steps between two serious steps is finite. The new method is tested on some academic test problems and compared with several other nonsmooth DC optimization solvers. © 2020, Springer-Verlag GmbH Germany, part of Springer Nature.
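The null-step/serious-step and aggregation mechanics described in this abstract can be sketched on a one-dimensional toy problem. The code below is a heavily simplified illustration, not the authors' method: the objective f(x) = |x| − 0.5x, the fixed combination weight `lam`, and the naive step shrinking are all assumptions made here for brevity (the actual algorithm finds directions by solving a subproblem over the bundle).

```python
# Minimal sketch of the aggregate-subgradient idea for a DC function
# f = f1 - f2 with f1, f2 convex.  Toy instance (an assumption, not
# from the paper): f1(x) = |x|, f2(x) = 0.5*x, so f is minimised at x = 0.

def f1(x): return abs(x)
def f2(x): return 0.5 * x
def f(x):  return f1(x) - f2(x)

def sub_f1(x):  # a subgradient of f1 at x
    return 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)

def sub_f2(x):  # gradient of the linear part f2
    return 0.5

def aggregate_subgradient_sketch(x, max_iter=200, step=0.5, m=0.1, lam=0.5):
    agg = sub_f1(x) - sub_f2(x)             # aggregate (DC) subgradient
    for _ in range(max_iter):
        g = sub_f1(x) - sub_f2(x)           # subgradient at the current point
        d = -(lam * agg + (1.0 - lam) * g)  # direction from only two subgradients
        if f(x + step * d) <= f(x) - m * step * d * d:  # sufficient decrease
            x = x + step * d                # serious step: move and reset
            agg = sub_f1(x) - sub_f2(x)
        else:
            agg = lam * agg + (1.0 - lam) * g  # null step: update the aggregate
            step *= 0.7                        # shrink the trial step
    return x
```

Starting from, say, x = 2.0, the sketch descends to the critical point x = 0 via serious steps, then cycles through null steps once no sufficient decrease is possible, mirroring the finite-null-step behaviour the abstract proves for the real method.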