A sharp augmented Lagrangian-based method in constrained non-convex optimization
- Authors: Bagirov, Adil , Ozturk, Gurkan , Kasimbeyli, Refail
- Date: 2019
- Type: Text , Journal article
- Relation: Optimization Methods and Software Vol. 34, no. 3 (2019), p. 462-488
- Full Text: false
- Reviewed:
- Description: In this paper, a novel sharp augmented Lagrangian-based global optimization method is developed for solving constrained non-convex optimization problems. The algorithm consists of outer and inner loops. At each inner iteration, the discrete gradient method is applied to minimize the sharp augmented Lagrangian function. Depending on the solution found, the algorithm either stops, updates the dual variables in the inner loop, or updates the upper or lower bounds by returning to the outer loop. The convergence results for the proposed method are presented. The performance of the method is demonstrated using a wide range of nonlinear smooth and non-smooth constrained optimization test problems from the literature.
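The outer/inner structure described in the abstract can be sketched on a toy equality-constrained problem. This is an illustration only, not the authors' method: it replaces the discrete gradient method with a crude coordinate search, uses a classical multiplier update, and minimizes a hypothetical problem min x0^2 + x1^2 subject to x0 + x1 = 1. The "sharp" feature retained here is that the penalty term is |h(x)| rather than h(x)^2, making the augmented Lagrangian nonsmooth.

```python
def sharp_aug_lagrangian(f, h, x, u, c):
    # sharp form: the penalty is c*|h(x)|, not c*h(x)^2 (nonsmooth on the feasible set)
    hv = h(x)
    return f(x) + u * hv + c * abs(hv)

def coordinate_search(phi, x, step=0.5, tol=1e-6):
    # crude derivative-free minimizer, a stand-in for the discrete gradient method
    x = list(x)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = list(x)
                trial[i] += d
                if phi(trial) < phi(x):
                    x = trial
                    improved = True
        if not improved:
            step *= 0.5
    return x

def solve(f, h, x0, c=5.0, outer_iters=5):
    # inner loop: minimize the sharp augmented Lagrangian in x
    # outer loop: update the dual variable u (classical update, a simplification)
    x, u = list(x0), 0.0
    for _ in range(outer_iters):
        phi = lambda z, u=u: sharp_aug_lagrangian(f, h, z, u, c)
        x = coordinate_search(phi, x)
        u += c * h(x)
    return x

# toy problem: min x0^2 + x1^2  subject to  x0 + x1 = 1  (solution (0.5, 0.5))
f = lambda z: z[0] ** 2 + z[1] ** 2
h = lambda z: z[0] + z[1] - 1.0
x = solve(f, h, [0.0, 0.0])
```

Because the sharp penalty is exact for sufficiently large c, the constrained minimizer is recovered without driving c to infinity.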
An incremental piecewise linear classifier based on polyhedral conic separation
- Authors: Ozturk, Gurkan , Bagirov, Adil , Kasimbeyli, Refail
- Date: 2015
- Type: Text , Journal article
- Relation: Machine Learning Vol. 101, no. 1-3 (2015), p. 397-413
- Relation: http://purl.org/au-research/grants/arc/DP140103213
- Full Text: false
- Reviewed:
- Description: In this paper, a piecewise linear classifier based on polyhedral conic separation is developed. This classifier builds nonlinear boundaries between classes using polyhedral conic functions. Since the number of polyhedral conic functions separating classes is not known a priori, an incremental approach is proposed to build separating functions. These functions are found by minimizing an error function which is nonsmooth and nonconvex. A special procedure is proposed to generate starting points to minimize the error function and this procedure is based on the incremental approach. The discrete gradient method, which is a derivative-free method for nonsmooth optimization, is applied to minimize the error function starting from those points. The proposed classifier is applied to solve classification problems on 12 publicly available data sets and compared with some mainstream and piecewise linear classifiers.
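A polyhedral conic function has the form g(x) = <w, x - a> + xi*||x - a||_1 - gamma, and a point is assigned to the target class when the minimum over a collection of such functions is non-positive. The sketch below illustrates only this separation rule with made-up parameters; the paper's contribution (incrementally fitting the functions by nonsmooth optimization) is not reproduced here.

```python
def polyhedral_conic(x, a, w, xi, gamma):
    # g(x) = <w, x - a> + xi * ||x - a||_1 - gamma   (non-positive inside the region)
    diff = [xc - ac for xc, ac in zip(x, a)]
    return (sum(wc * dc for wc, dc in zip(w, diff))
            + xi * sum(abs(d) for d in diff)
            - gamma)

def classify(x, cones):
    # the class region is the union of the conic pieces: take the minimum value
    return min(polyhedral_conic(x, a, w, xi, g) for a, w, xi, g in cones) <= 0

# two hypothetical conic pieces (w = 0 makes each piece an l1-ball, a diamond)
cones = [((0.0, 0.0), (0.0, 0.0), 1.0, 1.0),
         ((3.0, 0.0), (0.0, 0.0), 1.0, 1.0)]

inside = classify((0.2, 0.3), cones)   # inside the first diamond
outside = classify((2.0, 2.0), cones)  # outside both pieces
```

With several pieces, the resulting decision boundary is piecewise linear, which is how the classifier builds nonlinear boundaries from linear building blocks.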
Nonsmooth optimization algorithm for solving clusterwise linear regression problems
- Authors: Bagirov, Adil , Ugon, Julien , Mirzayeva, Hijran
- Date: 2015
- Type: Text , Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 164, no. 3 (2015), p. 755-780
- Relation: http://purl.org/au-research/grants/arc/DP140103213
- Full Text: false
- Reviewed:
- Description: Clusterwise linear regression consists of finding a number of linear regression functions, each approximating a subset of the data. In this paper, the clusterwise linear regression problem is formulated as a nonsmooth nonconvex optimization problem and an algorithm based on an incremental approach and on the discrete gradient method of nonsmooth optimization is designed to solve it. This algorithm incrementally divides the whole dataset into groups which can be easily approximated by one linear regression function. A special procedure is introduced to generate good starting points for solving global optimization problems at each iteration of the incremental algorithm. The algorithm is compared with the multi-start Späth algorithm and the incremental algorithms on several publicly available datasets for regression analysis.
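The underlying model can be illustrated with a Späth-style alternating scheme on synthetic data: assign each point to the line that fits it best, then refit each line by least squares. This is a simplified baseline, not the authors' incremental discrete-gradient algorithm, and the data below are made up.

```python
def fit_line(points):
    # ordinary least squares for y = a*x + b on a list of (x, y) pairs
    n = len(points)
    sx = sum(p[0] for p in points)
    sy = sum(p[1] for p in points)
    sxx = sum(p[0] ** 2 for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def clusterwise_fit(points, lines, iters=20):
    # alternate: assign each point to its best-fitting line, then refit each line
    for _ in range(iters):
        groups = [[] for _ in lines]
        for x, y in points:
            errs = [(y - (a * x + b)) ** 2 for a, b in lines]
            groups[errs.index(min(errs))].append((x, y))
        lines = [fit_line(g) if len(g) >= 2 else l
                 for g, l in zip(groups, lines)]
    return lines

# synthetic data drawn from two lines: y = 2x and y = -x + 5
pts = [(float(x), 2.0 * x) for x in range(6)] + \
      [(float(x), -x + 5.0) for x in range(6)]
lines = clusterwise_fit(pts, [(1.0, 0.0), (0.0, 0.0)])
```

Like k-means, this alternating scheme only reaches a local optimum from a given start, which is why the paper's careful generation of starting points matters.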
Piecewise partially separable functions and a derivative-free algorithm for large scale nonsmooth optimization
- Authors: Bagirov, Adil , Ugon, Julien
- Date: 2006
- Type: Text , Journal article
- Relation: Journal of Global Optimization Vol. 35, no. 2 (Jun 2006), p. 163-195
- Full Text:
- Reviewed:
- Description: This paper introduces the notion of piecewise partially separable functions and studies their properties. We also consider some of the many applications of these functions. Finally, we consider the problem of minimizing piecewise partially separable functions and develop an algorithm for its solution. This algorithm exploits the structure of such functions. We present the results of preliminary numerical experiments.
- Description: 2003001532
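The structural idea that the algorithm exploits can be sketched for the partially separable case: the objective is a sum of terms, each depending on only a few variables, so changing one variable requires recomputing only the terms that involve it. The sketch below uses hypothetical terms; the paper's piecewise generalization additionally takes maxima and minima over such sums, which is not shown here.

```python
def make_evaluator(terms, x):
    # terms: list of (index_tuple, function) pairs; f(x) = sum of term values
    vals = [fn(*(x[i] for i in idx)) for idx, fn in terms]
    total = sum(vals)

    def update(j, new_value):
        # change coordinate j and recompute only the terms that involve it
        nonlocal total
        x[j] = new_value
        for t, (idx, fn) in enumerate(terms):
            if j in idx:
                total -= vals[t]
                vals[t] = fn(*(x[i] for i in idx))
                total += vals[t]
        return total

    return update, total

# f(x) = x0^2 + (x1 - x2)^2 + x3^2 : each term touches at most two variables
terms = [((0,), lambda a: a * a),
         ((1, 2), lambda a, b: (a - b) ** 2),
         ((3,), lambda a: a * a)]
update, total = make_evaluator(terms, [1.0, 2.0, 3.0, 4.0])
```

For a derivative-free method that probes one coordinate at a time, this cheap update is exactly what makes large-scale problems tractable: each probe costs only the few terms touched, not a full function evaluation.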
An algorithm for minimizing clustering functions
- Authors: Bagirov, Adil , Ugon, Julien
- Date: 2005
- Type: Text , Journal article
- Relation: Optimization Vol. 54, no. 4-5 (Aug-Oct 2005), p. 351-368
- Full Text:
- Reviewed:
- Description: The problem of cluster analysis is formulated as a problem of nonsmooth, nonconvex optimization. An algorithm for solving the latter optimization problem is developed which significantly reduces the computational effort. This algorithm is based on the so-called discrete gradient method. Results of numerical experiments are presented which demonstrate the effectiveness of the proposed algorithm.
- Description: C1
- Description: 2003001266
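The nonsmooth clustering objective behind this formulation is the average, over the data points, of the squared distance to the nearest cluster centre; the inner minimum is what makes it nonsmooth and nonconvex. A minimal evaluation sketch with made-up data follows (the paper minimizes this function with the discrete gradient method, which is not reproduced here):

```python
def cluster_function(centers, points):
    # f(c1..ck) = (1/m) * sum over points of squared distance to the nearest centre
    total = 0.0
    for p in points:
        total += min(sum((pc - cc) ** 2 for pc, cc in zip(p, c))
                     for c in centers)
    return total / len(points)

# two well-separated groups of points (hypothetical data)
pts = [(0.0, 0.0), (0.2, 0.0), (5.0, 5.0), (5.2, 5.0)]

good = [(0.1, 0.0), (5.1, 5.0)]  # one centre per group
bad = [(2.5, 2.5), (2.6, 2.5)]   # both centres between the groups

better = cluster_function(good, pts) < cluster_function(bad, pts)
```

Any minimizer of this function places the centres so that each point is close to at least one centre, which is the optimization view of clustering the abstract refers to.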
A hybrid neural learning algorithm combining evolutionary algorithm with discrete gradient method
- Authors: Ghosh, Ranadhir , Yearwood, John , Bagirov, Adil
- Date: 2004
- Type: Text , Conference paper
- Relation: Paper presented at the Second International Conference on Soft Computing and Intelligent Systems, Yokohama, Japan : 21st October, 2004
- Full Text: false
- Reviewed:
- Description: E1
- Description: 2003000860
Cutting angle method and a local search
- Authors: Bagirov, Adil , Rubinov, Alex
- Date: 2003
- Type: Text , Journal article
- Relation: Journal of Global Optimization Vol. 27, no. 2-3 (Nov 2003), p. 193-213
- Full Text: false
- Reviewed:
- Description: The paper deals with combining the cutting angle method of global optimization with a local search. We propose to use special transformed objective functions for each intermediate application of the cutting angle method. We report results of numerical experiments which demonstrate that the proposed approach is very beneficial in the search for a global minimum.
- Description: C1
- Description: 2003000438
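The two-phase pattern the abstract describes (a global phase proposing candidates, a local phase refining them) can be sketched generically. This is not the cutting angle method itself: coarse grid sampling stands in for the cutting angle iterations, step-halving descent stands in for the local search, and the multimodal test function is hypothetical.

```python
import math

def local_search(f, x, step=0.5, tol=1e-6):
    # simple step-halving descent, a stand-in for the local phase
    while step > tol:
        if f(x + step) < f(x):
            x += step
        elif f(x - step) < f(x):
            x -= step
        else:
            step *= 0.5
    return x

def hybrid_minimize(f, lo, hi, n_samples=30):
    # global phase: coarse sampling of [lo, hi] proposes a candidate
    xs = [lo + (hi - lo) * i / (n_samples - 1) for i in range(n_samples)]
    x0 = min(xs, key=f)
    # local phase: refine the best candidate
    return local_search(f, x0)

# multimodal test function with its global minimum near x = -0.51
f = lambda x: math.sin(3.0 * x) + 0.1 * x * x
xstar = hybrid_minimize(f, -3.0, 3.0)
```

The global phase keeps the local search from being trapped in the shallower basin near x = 1.5, which is the benefit of the combination the paper investigates.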