A novel piecewise linear classifier based on polyhedral conic and max-min separabilities
- Authors: Bagirov, Adil; Ugon, Julien; Webb, Dean; Ozturk, Gurkan; Kasimbeyli, Refail
- Date: 2011
- Type: Text, Journal article
- Relation: TOP Vol. 21, no. 1 (2011), p. 1-22
- Full Text: false
- Reviewed:
- Description: In this paper, an algorithm for finding piecewise linear boundaries between pattern classes is developed. This algorithm consists of two main stages. In the first stage, a polyhedral conic set is used to identify data points which lie inside their classes, and in the second stage we exclude those points to compute a piecewise linear boundary using the remaining data points. Piecewise linear boundaries are computed incrementally, starting with one hyperplane. Such an approach allows one to significantly reduce the computational effort in many large data sets. Results of numerical experiments are reported. These results demonstrate that the new algorithm consistently produces good test set accuracy on most data sets compared with a number of other mainstream classifiers. © 2011 Sociedad de Estadística e Investigación Operativa.
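The first stage described in the abstract relies on polyhedral conic functions. A minimal sketch of evaluating one such function, using the common form g(x) = w·(x−c) + ξ‖x−c‖₁ − γ (the specific parameters below are illustrative, not taken from the paper):

```python
# Polyhedral conic function (PCF): its sub-level set {x : g(x) <= 0}
# is a polyhedral cone with vertex c, used here to flag points that
# lie "inside" their class.

def pcf(x, w, c, xi, gamma):
    """Evaluate g(x) = w.(x - c) + xi * ||x - c||_1 - gamma."""
    dot = sum(wj * (xj - cj) for wj, xj, cj in zip(w, x, c))
    l1 = sum(abs(xj - cj) for xj, cj in zip(x, c))
    return dot + xi * l1 - gamma

# With w = 0, xi = 1, gamma = 1 the cone is the L1 ball |x| + |y| <= 1
# around the vertex c (e.g. a class centroid).
c = (0.0, 0.0)
w, xi, gamma = (0.0, 0.0), 1.0, 1.0

inside = pcf((0.2, 0.3), w, c, xi, gamma)   # g = 0.5 - 1.0 = -0.5 -> inside
outside = pcf((2.0, 0.0), w, c, xi, gamma)  # g = 2.0 - 1.0 =  1.0 -> outside
```

Points with g(x) ≤ 0 would be excluded in stage one, so the piecewise linear boundary in stage two is fitted only to the remaining (boundary-region) points.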
An incremental piecewise linear classifier based on polyhedral conic separation
- Authors: Ozturk, Gurkan; Bagirov, Adil; Kasimbeyli, Refail
- Date: 2015
- Type: Text, Journal article
- Relation: Machine Learning Vol. 101, no. 1-3 (2015), p. 397-413
- Relation: http://purl.org/au-research/grants/arc/DP140103213
- Full Text: false
- Reviewed:
- Description: In this paper, a piecewise linear classifier based on polyhedral conic separation is developed. This classifier builds nonlinear boundaries between classes using polyhedral conic functions. Since the number of polyhedral conic functions separating classes is not known a priori, an incremental approach is proposed to build separating functions. These functions are found by minimizing an error function which is nonsmooth and nonconvex. A special procedure, based on the incremental approach, is proposed to generate starting points for minimizing the error function. The discrete gradient method, which is a derivative-free method for nonsmooth optimization, is applied to minimize the error function starting from those points. The proposed classifier is applied to solve classification problems on 12 publicly available data sets and compared with some mainstream and piecewise linear classifiers. © 2014, The Author(s).
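A rough sketch of the kind of nonsmooth, nonconvex error such a classifier minimizes. With conic functions g_1, …, g_k fitted for one class, a point a of that class is covered when min_l g_l(a) ≤ 0, while a point b of the other class should have g_l(b) > 0 for every l. The hinge-style form below is illustrative; the paper's exact error function may differ:

```python
def pcf(x, w, c, xi, gamma):
    # polyhedral conic function g(x) = w.(x - c) + xi*||x - c||_1 - gamma
    dot = sum(wj * (xj - cj) for wj, xj, cj in zip(w, x, c))
    l1 = sum(abs(xj - cj) for xj, cj in zip(x, c))
    return dot + xi * l1 - gamma

def error(params, own, other):
    # params: list of (w, c, xi, gamma) tuples, one per conic function.
    # The inner min/max terms are what make this nonsmooth and nonconvex.
    e = 0.0
    for a in own:    # own-class points should satisfy min_l g_l(a) <= 0
        e += max(0.0, 1.0 + min(pcf(a, *p) for p in params))
    for b in other:  # other-class points should satisfy g_l(b) > 0 for all l
        e += sum(max(0.0, 1.0 - pcf(b, *p)) for p in params)
    return e

params = [((0.0, 0.0), (0.0, 0.0), 1.0, 1.0)]   # one illustrative PCF
e_zero = error(params, [(0.0, 0.0)], [(3.0, 0.0)])       # both points satisfied
e_violation = error(params, [(2.0, 0.0)], [(3.0, 0.0)])  # own point outside cone
```

In the incremental scheme described in the abstract, conic functions would be added one at a time, with the current solution used to seed starting points for the discrete gradient method.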
An incremental clustering algorithm based on hyperbolic smoothing
- Authors: Bagirov, Adil; Ordin, Burak; Ozturk, Gurkan; Xavier, Adilson
- Date: 2015
- Type: Text, Journal article
- Relation: Computational Optimization and Applications Vol. 61, no. 1 (2015), p. 219-241
- Relation: http://purl.org/au-research/grants/arc/DP140103213
- Full Text: false
- Reviewed:
- Description: Clustering is an important problem in data mining. It can be formulated as a nonsmooth, nonconvex optimization problem. For most global optimization techniques this problem is challenging even in medium-sized data sets. In this paper, we propose an approach that allows one to apply local methods of smooth optimization to solve clustering problems. We apply an incremental approach to generate starting points for cluster centers, which enables us to deal with the nonconvexity of the problem. The hyperbolic smoothing technique is applied to handle the nonsmoothness of the clustering problems and to make it possible to apply smooth optimization algorithms to solve them. Results of numerical experiments with eleven real-world data sets and the comparison with state-of-the-art incremental clustering algorithms demonstrate that the smooth optimization algorithms, in combination with the incremental approach, are a powerful alternative to existing clustering algorithms.
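The nonsmoothness comes from the min over cluster centers in the clustering objective. A minimal sketch of the hyperbolic smoothing idea, assuming the standard smoothing of max(0, t) by (t + √(t² + τ²))/2 (the two-center rewrite below is for illustration only):

```python
import math

def smooth_max0(t, tau):
    """Hyperbolic smoothing of max(0, t); the error is at most tau/2."""
    return 0.5 * (t + math.sqrt(t * t + tau * tau))

def smooth_min(d1, d2, tau):
    # min(d1, d2) = d1 - max(0, d1 - d2), with the max term smoothed,
    # so the nonsmooth "distance to the nearest center" becomes
    # a differentiable function of the center locations.
    return d1 - smooth_max0(d1 - d2, tau)

# As tau -> 0 the smooth value approaches the true nonsmooth min,
# which lets standard smooth optimizers attack the clustering objective.
exact = min(3.0, 1.0)
approx = smooth_min(3.0, 1.0, 1e-6)
```

In the incremental scheme from the abstract, clusters would be added one at a time, each time re-optimizing the smoothed objective from starting points derived from the previous solution.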
Piecewise linear classifiers based on nonsmooth optimization approaches
- Authors: Bagirov, Adil; Kasimbeyli, Refail; Ozturk, Gurkan; Ugon, Julien
- Date: 2014
- Type: Text, Book chapter
- Relation: Optimization in Science and Engineering p. 1-32
- Full Text: false
- Reviewed:
- Description: Nonsmooth optimization provides efficient algorithms for solving many machine learning problems. In particular, nonsmooth optimization approaches to supervised data classification problems lead to the design of very efficient algorithms for their solution. In this chapter, we demonstrate how nonsmooth optimization algorithms can be applied to design efficient piecewise linear classifiers for supervised data classification problems. Such classifiers are developed using max–min and polyhedral conic separabilities as well as an incremental approach. We report results of numerical experiments and compare the piecewise linear classifiers with a number of other mainstream classifiers.
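The max–min separability mentioned in the abstract uses separating functions of the form f(x) = max_j min_{i∈I_j} (⟨w_i, x⟩ − γ_i), with classification by the sign of f. A minimal sketch of evaluating such a function (the hyperplanes below are illustrative, not fitted to data):

```python
def maxmin(x, groups):
    # groups: list of groups, each a list of (w, gamma) hyperplane pairs;
    # f(x) = max over groups of (min over the group's hyperplanes).
    return max(
        min(sum(wi * xi for wi, xi in zip(w, x)) - gamma
            for w, gamma in group)
        for group in groups
    )

# Two groups of two hyperplanes describe a simple non-convex boundary:
groups = [
    [((1.0, 0.0), 1.0), ((0.0, 1.0), 1.0)],    # min(x - 1, y - 1)
    [((-1.0, 0.0), 1.0), ((0.0, -1.0), 1.0)],  # min(-x - 1, -y - 1)
]
f_in = maxmin((0.0, 0.0), groups)   # both group-minima negative -> f < 0
f_out = maxmin((2.0, 2.0), groups)  # first group-minimum positive -> f > 0
```

Such max–min functions can represent arbitrary piecewise linear boundaries, which is what makes them a flexible basis for the classifiers described in the chapter.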
A sharp augmented Lagrangian-based method in constrained non-convex optimization
- Authors: Bagirov, Adil; Ozturk, Gurkan; Kasimbeyli, Refail
- Date: 2019
- Type: Text, Journal article
- Relation: Optimization Methods and Software Vol. 34, no. 3 (2019), p. 462-488
- Full Text: false
- Reviewed:
- Description: In this paper, a novel sharp augmented Lagrangian-based global optimization method is developed for solving constrained non-convex optimization problems. The algorithm consists of outer and inner loops. At each inner iteration, the discrete gradient method is applied to minimize the sharp augmented Lagrangian function. Depending on the solution found, the algorithm either stops, updates the dual variables in the inner loop, or updates the upper or lower bounds by moving to the outer loop. The convergence results for the proposed method are presented. The performance of the method is demonstrated using a wide range of nonlinear smooth and non-smooth constrained optimization test problems from the literature.
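A minimal sketch of the sharp augmented Lagrangian idea for min f(x) s.t. h(x) = 0, assuming one common "sharp" form L(x, u, c) = f(x) − u·h(x) + c·|h(x)|, where the constraint enters through the norm rather than its square, making the penalty exact for finite c. The toy problem and the brute-force inner minimizer are illustrative stand-ins; the paper uses the derivative-free discrete gradient method:

```python
f = lambda x: x * x      # objective
h = lambda x: x - 1.0    # equality constraint; constrained solution x* = 1

def sharp_AL(x, u, c):
    # sharp augmented Lagrangian: |h(x)| (not h(x)**2) makes it nonsmooth
    # at the feasible set but exact for sufficiently large c
    return f(x) - u * h(x) + c * abs(h(x))

def inner_min(u, c, lo=-3.0, hi=3.0, steps=60001):
    # crude derivative-free inner solve over a grid (stand-in for
    # the discrete gradient method used in the inner loop)
    xs = (lo + k * (hi - lo) / (steps - 1) for k in range(steps))
    return min(xs, key=lambda x: sharp_AL(x, u, c))

# For this problem the multiplier is u* = 2, so any c > 2 is exact:
# the unconstrained minimizer of L already satisfies the constraint.
x_hat = inner_min(u=0.0, c=5.0)
```

In the full algorithm described in the abstract, an outer loop would tighten upper and lower bounds while the inner loop updates the dual variable u between such minimizations.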