A sharp augmented Lagrangian-based method in constrained non-convex optimization
- Authors: Bagirov, Adil; Ozturk, Gurkan; Kasimbeyli, Refail
- Date: 2019
- Type: Text; Journal article
- Relation: Optimization Methods and Software Vol. 34, no. 3 (2019), p. 462-488
- Full Text: false
- Reviewed:
- Description: In this paper, a novel sharp augmented Lagrangian-based global optimization method is developed for solving constrained non-convex optimization problems. The algorithm consists of outer and inner loops. At each inner iteration, the discrete gradient method is applied to minimize the sharp augmented Lagrangian function. Depending on the solution found, the algorithm either stops, updates the dual variables in the inner loop, or updates the upper or lower bounds by passing to the outer loop. Convergence results for the proposed method are presented, and its performance is demonstrated on a wide range of smooth and nonsmooth constrained nonlinear optimization test problems from the literature.
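The inner-loop idea in the abstract can be illustrated with a minimal sketch. One common form of the sharp augmented Lagrangian, f(x) + c‖h(x)‖ + uᵀh(x), is minimized by a simple derivative-free coordinate search that stands in for the discrete gradient method; the penalty form, the toy problem, and all function names are illustrative assumptions, not the paper's exact algorithm.

```python
import math

def sharp_augmented_lagrangian(f, h, u, c):
    """One common form of the sharp augmented Lagrangian for h(x) = 0:
    L(x) = f(x) + c * ||h(x)|| + sum_i u_i * h_i(x).
    (The exact form used in the paper may differ.)"""
    def L(x):
        hx = h(x)
        return (f(x) + c * math.sqrt(sum(v * v for v in hx))
                + sum(ui * hi for ui, hi in zip(u, hx)))
    return L

def coordinate_search(L, x, step=0.5, tol=1e-6):
    """Derivative-free local minimizer standing in for the inner-loop
    discrete gradient method (the penalty term is nonsmooth)."""
    x = list(x)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                y = list(x)
                y[i] += d
                if L(y) < L(x):
                    x, improved = y, True
        if not improved:
            step *= 0.5
    return x

# Toy problem: minimize x0^2 + x1^2 subject to x0 + x1 - 1 = 0.
# The sharp (exact) penalty recovers the solution (0.5, 0.5) for a
# finite c; here c = 1.5, with the dual variable u left at zero.
f = lambda x: x[0] ** 2 + x[1] ** 2
h = lambda x: [x[0] + x[1] - 1.0]
L = sharp_augmented_lagrangian(f, h, u=[0.0], c=1.5)
x = coordinate_search(L, [0.0, 0.0])
```

In the paper's full scheme the dual variables and the upper/lower bounds would then be updated around this inner minimization; the sketch shows only a single inner solve.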
Nonsmooth optimization algorithm for solving clusterwise linear regression problems
- Authors: Bagirov, Adil; Ugon, Julien; Mirzayeva, Hijran
- Date: 2015
- Type: Text; Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 164, no. 3 (2015), p. 755-780
- Relation: http://purl.org/au-research/grants/arc/DP140103213
- Full Text: false
- Reviewed:
- Description: Clusterwise linear regression consists of finding a number of linear regression functions, each approximating a subset of the data. In this paper, the clusterwise linear regression problem is formulated as a nonsmooth nonconvex optimization problem, and an algorithm based on an incremental approach and on the discrete gradient method of nonsmooth optimization is designed to solve it. This algorithm incrementally divides the whole dataset into groups, each of which can be approximated well by one linear regression function. A special procedure is introduced to generate good starting points for the global optimization problems solved at each iteration of the incremental algorithm. The algorithm is compared with the multi-start Späth algorithm and with incremental algorithms on several publicly available datasets for regression analysis.
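As a rough illustration of the problem this abstract describes (not of the paper's incremental discrete-gradient algorithm), the sketch below alternates point-to-line assignment with per-cluster least squares; the function names, initialization, and toy data are all assumptions.

```python
def fit_line(pts):
    """Ordinary least squares for y ~ a*x + b over a list of (x, y) pairs."""
    n = len(pts)
    sx = sum(x for x, _ in pts)
    sy = sum(y for _, y in pts)
    sxx = sum(x * x for x, _ in pts)
    sxy = sum(x * y for x, y in pts)
    denom = n * sxx - sx * sx
    a = (n * sxy - sx * sy) / denom if denom else 0.0
    return a, (sy - a * sx) / n

def clusterwise_regression(data, k, iters=20):
    """Alternate (1) assigning each point to its best-fitting line and
    (2) refitting each line to its points -- a simple heuristic for the
    clusterwise objective  min sum_j min_k (a_k * x_j + b_k - y_j)^2."""
    data = sorted(data)
    size = len(data) // k
    # deterministic start: fit one line per contiguous chunk of the data
    lines = [fit_line(data[i * size:(i + 1) * size if i < k - 1 else len(data)])
             for i in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for x, y in data:
            j = min(range(k),
                    key=lambda c: (lines[c][0] * x + lines[c][1] - y) ** 2)
            groups[j].append((x, y))
        lines = [fit_line(g) if len(g) >= 2 else lines[j]
                 for j, g in enumerate(groups)]
    return lines

# Two noiseless linear pieces: y = 2x on x = 0..4 and y = -x + 12 on x = 5..9.
data = [(x, 2.0 * x) for x in range(5)] + [(x, -x + 12.0) for x in range(5, 10)]
lines = clusterwise_regression(data, k=2)
```

This alternating heuristic can stall in poor local solutions, which is exactly why the paper builds up clusters incrementally and generates good starting points for each global subproblem.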
Cutting angle method and a local search
- Authors: Bagirov, Adil; Rubinov, Alex
- Date: 2003
- Type: Text; Journal article
- Relation: Journal of Global Optimization Vol. 27, no. 2-3 (Nov 2003), p. 193-213
- Full Text: false
- Reviewed:
- Description: The paper deals with combinations of the cutting angle method in global optimization and a local search. We propose to use specially transformed objective functions for each intermediate run of the cutting angle method. We report results of numerical experiments demonstrating that the proposed approach is highly beneficial in the search for a global minimum.
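The global-phase-plus-local-search pattern the abstract describes can be sketched in one dimension. Here a Piyavskii-style saw-tooth (Lipschitz) underestimator stands in for the cutting angle method's abstract-convex underestimators, and a step-halving local search polishes the candidate; the objective, the Lipschitz constant, and all names are illustrative assumptions rather than the paper's method.

```python
import math

def sawtooth_global(f, a, b, lip, n=40):
    """Lipschitz lower-bound (saw-tooth) global phase on [a, b], a 1-D
    stand-in for the cutting angle method's underestimators."""
    pts = [(a, f(a)), (b, f(b))]
    for _ in range(n):
        pts.sort()
        best = None
        for (x1, f1), (x2, f2) in zip(pts, pts[1:]):
            # apex of the lower envelope between consecutive samples
            x = 0.5 * (x1 + x2) + (f1 - f2) / (2.0 * lip)
            lb = 0.5 * (f1 + f2) - 0.5 * lip * (x2 - x1)
            if best is None or lb < best[0]:
                best = (lb, x)
        pts.append((best[1], f(best[1])))   # refine where the bound is lowest
    return min(pts, key=lambda p: p[1])[0]  # best sampled point

def local_search(f, x, step=0.1, tol=1e-8):
    """Step-halving descent used to polish the global candidate."""
    while step > tol:
        if f(x + step) < f(x):
            x += step
        elif f(x - step) < f(x):
            x -= step
        else:
            step *= 0.5
    return x

# Multimodal test function with global minimum f(0) = -1 on [-2, 2];
# |f'(x)| = |2x + 4 sin(4x)| <= 8 there, so lip = 8 is a valid constant.
f = lambda x: x * x - math.cos(4.0 * x)
x = local_search(f, sawtooth_global(f, -2.0, 2.0, lip=8.0))
```

Alternating the two phases in this way mirrors the paper's idea: the global phase locates a promising region, and the local search sharpens the estimate cheaply.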