Authors (number of records):

- Ugon, Julien (6)
- Karmitsa, Napsu (4)
- Sultanova, Nargiz (4)
- Yearwood, John (4)
- Al Nuaimat, Alia (3)
- Ahmed, S. T. (2)
- Barton, Andrew (2)
- Ganjehlou, Asef Nazari (2)
- Joki, Kaisa (2)
- Karasozen, Bulent (2)
- Makela, Marko (2)
- Mala-Jetmarova, Helena (2)
- Mirzayeva, Hijran (2)
- Ordin, Burak (2)
- Rubinov, Alex (2)
- Sukhorukova, Nadezda (2)
- Taheri, Sona (2)
- Al Nuaimat, A. (1)
- Beliakov, Gleb (1)

Subjects (number of records):

- 0103 Numerical and Computational Mathematics (20)
- Nonsmooth optimization (15)
- 0802 Computation Theory and Mathematics (11)
- Nonconvex optimization (7)
- Cluster analysis (4)
- DC programming (4)
- Subdifferential (4)
- 0906 Electrical and Electronic Engineering (3)
- Algorithms (3)
- Bundle method (3)
- DC functions (3)
- Smoothing techniques (3)
- Bundle methods (2)
- Classification (2)
- Clusterwise regression (2)
- Codifferential (2)
- Cutting plane model (2)
- Data mining (2)
- Derivative-free optimization (2)


Alexander Rubinov - An outstanding scholar

**Authors:** Bagirov, Adil
**Date:** 2010
**Type:** Text, Journal article
**Relation:** Pacific Journal of Optimization Vol. 6, no. 2, Suppl. 1 (2010), p. 203-209
**Full Text:** false

An algorithm for clusterwise linear regression based on smoothing techniques

- Bagirov, Adil, Ugon, Julien, Mirzayeva, Hijran

**Authors:** Bagirov, Adil; Ugon, Julien; Mirzayeva, Hijran
**Date:** 2014
**Type:** Text, Journal article
**Relation:** Optimization Letters Vol. 9, no. 2 (2014), p. 375-390
**Full Text:** false
**Description:** We propose an algorithm based on an incremental approach and smoothing techniques to solve clusterwise linear regression (CLR) problems. This algorithm incrementally divides the whole data set into groups which can be easily approximated by one linear regression function. A special procedure is introduced to generate an initial solution for solving global optimization problems at each iteration of the incremental algorithm. Such an approach allows one to find global or approximate global solutions to the CLR problems. The algorithm is tested using several data sets for regression analysis and compared with the multistart and incremental Späth algorithms.
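The nonsmooth CLR objective that the incremental algorithm above minimizes can be sketched in a few lines. This is a minimal illustration with my own (hypothetical) names, not the authors' code:

```python
import numpy as np

def clr_objective(X, y, coefs, intercepts):
    """Clusterwise linear regression objective: each data point is
    charged to its best-fitting regression function, so the value is
    a sum of pointwise minima of squared residuals (nonsmooth and
    nonconvex even though each individual residual is smooth)."""
    # residuals[i, j]: squared error of point i under regression function j
    residuals = (X @ coefs.T + intercepts - y[:, None]) ** 2
    return residuals.min(axis=1).sum()
```

With k regression functions in d variables, `coefs` has shape (k, d) and `intercepts` shape (k,); the inner minimum over functions is what makes the problem nonsmooth.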

Hyperbolic smoothing function method for minimax problems

- Bagirov, Adil, Al Nuaimat, Alia, Sultanova, Nargiz

**Authors:** Bagirov, Adil; Al Nuaimat, Alia; Sultanova, Nargiz
**Date:** 2013
**Type:** Text, Journal article
**Relation:** Optimization Vol. 62, no. 6 (2013), p. 759-782
**Full Text:** false
**Description:** In this article, an approach for solving finite minimax problems is proposed. This approach is based on the use of hyperbolic smoothing functions. In order to apply the hyperbolic smoothing, we reformulate the objective function in the minimax problem and study the relationship between the original and reformulated problems. We also study the main properties of the hyperbolic smoothing function. Based on these results, an algorithm for solving the finite minimax problem is proposed and implemented in a general algebraic modelling system. We present preliminary results of numerical experiments with well-known nonsmooth optimization test problems. We also compare the proposed algorithm with an algorithm that uses the exponential smoothing function, as well as with an algorithm based on a nonlinear programming reformulation of the finite minimax problem. © 2013 Taylor and Francis Group, LLC.
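The core device in this and the other smoothing papers is a smooth approximation of the max function. Below is a minimal sketch of one such construction (mine, for illustration — the paper's exact reformulation may differ): the hyperbolic smoothing of max(t, 0) is (t + sqrt(t² + τ²))/2, which overestimates it by at most τ/2, and a finite max can then be smoothed pairwise via max(a, b) = a + max(b − a, 0):

```python
import math

def hyperbolic_max0(t, tau):
    # Smooth approximation of max(t, 0); the error is at most tau / 2.
    return 0.5 * (t + math.sqrt(t * t + tau * tau))

def smooth_max(values, tau):
    # Smooth a finite max by repeated pairwise smoothing, using
    # max(a, b) = a + max(b - a, 0). The accumulated overestimate
    # is at most (len(values) - 1) * tau / 2.
    s = values[0]
    for v in values[1:]:
        s = s + hyperbolic_max0(v - s, tau)
    return s
```

As tau tends to 0, the smoothed value converges to the true maximum, which is what allows a minimax problem to be attacked with smooth optimization methods.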

DC programming algorithm for clusterwise linear L1 regression

**Authors:** Bagirov, Adil; Taheri, Sona
**Date:** 2017
**Type:** Text, Journal article
**Relation:** Journal of the Operations Research Society of China Vol. 5, no. 2 (2017), p. 233-256
**Relation:** http://purl.org/au-research/grants/arc/DP140103213
**Full Text:** false
**Description:** The aim of this paper is to develop an algorithm for solving the clusterwise linear least absolute deviations regression problem. This problem is formulated as a nonsmooth nonconvex optimization problem, and the objective function is represented as a difference of convex functions. Optimality conditions are derived by using this representation. An algorithm is designed based on the difference of convex representation and an incremental approach. The proposed algorithm is tested using small to large artificial and real-world data sets. © 2017, Operations Research Society of China, Periodicals Agency of Shanghai University, Science Press, and Springer-Verlag Berlin Heidelberg.
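The difference-of-convex representation mentioned above can be made explicit with a standard identity (notation mine, for illustration): if each function $u_j$ is convex, then

```latex
\min_{1 \le j \le k} u_j \;=\; \sum_{j=1}^{k} u_j \;-\; \max_{1 \le j \le k} \sum_{l \ne j} u_l ,
```

and both the sum and the max-of-sums on the right are convex. Applied to the convex absolute residuals, this exhibits the clusterwise L1 objective $\sum_i \min_j \lvert a_j^\top x_i + b_j - y_i \rvert$ as a difference of two convex functions.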

Minimizing nonsmooth DC functions via successive DC piecewise-affine approximations

- Gaudioso, Manlio, Giallombardo, Giovanni, Miglionico, Giovanna, Bagirov, Adil

**Authors:** Gaudioso, Manlio; Giallombardo, Giovanni; Miglionico, Giovanna; Bagirov, Adil
**Date:** 2018
**Type:** Text, Journal article
**Relation:** Journal of Global Optimization Vol. 71, no. 1 (2018), p. 37-55
**Full Text:** false
**Description:** We introduce a proximal bundle method for the numerical minimization of a nonsmooth difference-of-convex (DC) function. Exploiting some classic ideas coming from cutting-plane approaches for the convex case, we iteratively build two separate piecewise-affine approximations of the component functions, grouping the corresponding information in two separate bundles. In the bundle of the first component, only information related to points close to the current iterate is maintained, while the second bundle only refers to a global model of the corresponding component function. We combine the two convex piecewise-affine approximations and generate a DC piecewise-affine model, which can also be seen as the pointwise maximum of several concave piecewise-affine functions. Such a nonconvex model is locally approximated by means of an auxiliary quadratic program, whose solution is used to certify approximate criticality or to generate a descent search direction, along with a predicted reduction, that is next explored in a line-search setting. To improve the approximation properties at points that are far from the current iterate, a supplementary quadratic program is also introduced to generate an alternative, more promising search direction. We discuss the main convergence issues of the line-search based proximal bundle method, and provide computational results on a set of academic benchmark test problems. © 2017, Springer Science+Business Media, LLC.

Aggregate codifferential method for nonsmooth DC optimization

- Tor, Ali, Bagirov, Adil, Karasozen, Bulent

**Authors:** Tor, Ali; Bagirov, Adil; Karasozen, Bulent
**Date:** 2014
**Type:** Text, Journal article
**Relation:** Journal of Computational and Applied Mathematics Vol. 259, no. Part B (2014), p. 851-867
**Full Text:** false
**Description:** A new algorithm is developed based on the concept of codifferential for minimizing the difference of convex nonsmooth functions. Since the computation of the whole codifferential is not always possible, we use a fixed number of elements from the codifferential to compute the search directions. The convergence of the proposed algorithm is proved. The efficiency of the algorithm is demonstrated by comparing it with the subgradient, the truncated codifferential and the proximal bundle methods using nonsmooth optimization test problems.

A new nonsmooth optimization algorithm for minimum sum-of-squares clustering problems

- Bagirov, Adil, Yearwood, John

**Authors:** Bagirov, Adil; Yearwood, John
**Date:** 2006
**Type:** Text, Journal article
**Relation:** European Journal of Operational Research Vol. 170, no. 2 (2006), p. 578-596
**Full Text:** false
**Description:** The minimum sum-of-squares clustering problem is formulated as a problem of nonsmooth, nonconvex optimization, and an algorithm for solving the former problem based on nonsmooth optimization techniques is developed. The issue of applying this algorithm to large data sets is discussed. Results of numerical experiments have been presented which demonstrate the effectiveness of the proposed algorithm. © 2004 Elsevier B.V. All rights reserved.
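The nonsmooth clustering objective referred to above is simple to state; here is a minimal sketch with hypothetical names, not the paper's code:

```python
import numpy as np

def sum_of_squares_clustering(points, centers):
    """Minimum sum-of-squares clustering objective: each point
    contributes its squared Euclidean distance to the nearest
    center. The inner minimum over centers is what makes the
    function nonsmooth and nonconvex in the center positions."""
    # dists[i, j]: squared distance from point i to center j
    dists = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return dists.min(axis=1).sum()
```

Minimizing this over the `centers` array is the global optimization problem the paper formulates.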

Unsupervised and supervised data classification via nonsmooth and global optimisation

- Bagirov, Adil, Rubinov, Alex, Sukhorukova, Nadezda, Yearwood, John

**Authors:** Bagirov, Adil; Rubinov, Alex; Sukhorukova, Nadezda; Yearwood, John
**Date:** 2003
**Type:** Text, Journal article
**Relation:** Top Vol. 11, no. 1 (2003), p. 1-92
**Description:** We examine various methods for data clustering and data classification that are based on the minimization of the so-called cluster function and its modifications. These functions are nonsmooth and nonconvex. We use Discrete Gradient methods for their local minimization. We also consider a combination of this method with the cutting angle method for global minimization. We present and discuss results of numerical experiments.


A proximal bundle method for nonsmooth DC optimization utilizing nonconvex cutting planes

- Joki, Kaisa, Bagirov, Adil, Karmitsa, Napsu, Makela, Marko

**Authors:** Joki, Kaisa; Bagirov, Adil; Karmitsa, Napsu; Makela, Marko
**Date:** 2017
**Type:** Text, Journal article
**Relation:** Journal of Global Optimization Vol. 68, no. 3 (2017), p. 501-535
**Relation:** http://purl.org/au-research/grants/arc/DP140103213
**Full Text:** false
**Description:** In this paper, we develop a version of the bundle method to solve unconstrained difference of convex (DC) programming problems. It is assumed that a DC representation of the objective function is available. Our main idea is to utilize subgradients of both the first and second components in the DC representation. This subgradient information is gathered from some neighborhood of the current iteration point and it is used to build separately an approximation for each component in the DC representation. By combining these approximations we obtain a new nonconvex cutting plane model of the original objective function, which takes into account explicitly both the convex and the concave behavior of the objective function. We design the proximal bundle method for DC programming based on this new approach and prove the convergence of the method to an ε-critical point. The algorithm is tested using some academic test problems and the preliminary numerical results have shown the good performance of the new bundle method. An interesting fact is that the new algorithm nearly always finds the global solution in our test problems.
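Schematically (my notation, not necessarily the paper's), with bundle points $x_k$ and subgradients $\xi_k^i \in \partial f_i(x_k)$ for the DC components of $f = f_1 - f_2$, the two convex piecewise-affine models and the resulting nonconvex cutting plane model are

```latex
\check f_i(x) \;=\; \max_{k} \bigl\{ f_i(x_k) + \langle \xi_k^i,\, x - x_k \rangle \bigr\}, \quad i = 1, 2,
\qquad
\check f(x) \;=\; \check f_1(x) - \check f_2(x),
```

so the model keeps the convex behavior of $f_1$ and the concave behavior of $-f_2$ explicit, as the abstract describes.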

An incremental clustering algorithm based on hyperbolic smoothing

- Bagirov, Adil, Ordin, Burak, Ozturk, Gurkan, Xavier, Adilson

**Authors:** Bagirov, Adil; Ordin, Burak; Ozturk, Gurkan; Xavier, Adilson
**Date:** 2015
**Type:** Text, Journal article
**Relation:** Computational Optimization and Applications Vol. 61, no. 1 (2015), p. 219-241
**Relation:** http://purl.org/au-research/grants/arc/DP140103213
**Full Text:** false
**Description:** Clustering is an important problem in data mining. It can be formulated as a nonsmooth, nonconvex optimization problem. For most global optimization techniques this problem is challenging even for medium-sized data sets. In this paper, we propose an approach that allows one to apply local methods of smooth optimization to solve the clustering problems. We apply an incremental approach to generate starting points for cluster centers, which enables us to deal with the nonconvexity of the problem. The hyperbolic smoothing technique is applied to handle the nonsmoothness of the clustering problems and to make it possible to apply smooth optimization algorithms to solve them. Results of numerical experiments with eleven real-world data sets and the comparison with state-of-the-art incremental clustering algorithms demonstrate that the smooth optimization algorithms in combination with the incremental approach are a powerful alternative to existing clustering algorithms.

Nonsmooth optimization algorithm for solving clusterwise linear regression problems

- Bagirov, Adil, Ugon, Julien, Mirzayeva, Hijran

**Authors:** Bagirov, Adil; Ugon, Julien; Mirzayeva, Hijran
**Date:** 2015
**Type:** Text, Journal article
**Relation:** Journal of Optimization Theory and Applications Vol. 164, no. 3 (2015), p. 755-780
**Relation:** http://purl.org/au-research/grants/arc/DP140103213
**Full Text:** false
**Description:** Clusterwise linear regression consists of finding a number of linear regression functions, each approximating a subset of the data. In this paper, the clusterwise linear regression problem is formulated as a nonsmooth nonconvex optimization problem and an algorithm based on an incremental approach and on the discrete gradient method of nonsmooth optimization is designed to solve it. This algorithm incrementally divides the whole dataset into groups which can be easily approximated by one linear regression function. A special procedure is introduced to generate good starting points for solving global optimization problems at each iteration of the incremental algorithm. The algorithm is compared with the multi-start Späth and the incremental algorithms on several publicly available datasets for regression analysis.

A heuristic algorithm for solving the minimum sum-of-squares clustering problems

**Authors:** Ordin, Burak; Bagirov, Adil
**Date:** 2015
**Type:** Text, Journal article
**Relation:** Journal of Global Optimization Vol. 61, no. 2 (2015), p. 341-361
**Relation:** http://purl.org/au-research/grants/arc/DP140103213
**Full Text:** false
**Description:** Clustering is an important task in data mining. It can be formulated as a global optimization problem which is challenging for existing global optimization techniques, even for medium-sized data sets. Various heuristics have been developed to solve the clustering problem. The global k-means and modified global k-means algorithms are among the most efficient heuristics for solving the minimum sum-of-squares clustering problem. However, these algorithms are not always accurate in finding global or near-global solutions to the clustering problem. In this paper, we introduce a new algorithm to improve the accuracy of the modified global k-means algorithm in finding global solutions. We use an auxiliary cluster problem to generate a set of initial points and apply the k-means algorithm starting from these points to find the global solution. Numerical results on 16 real-world data sets clearly demonstrate the superiority of the proposed algorithm over the global and modified global k-means algorithms in finding global solutions to clustering problems.
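The refinement step the abstract builds on is ordinary k-means started from generated points. A minimal sketch with my own names follows; the paper's contribution is in how the starting centers are produced, which is not shown here:

```python
import numpy as np

def kmeans_from_starts(points, centers, max_iters=100):
    """Plain k-means refinement from user-supplied starting centers."""
    centers = centers.copy()
    for _ in range(max_iters):
        # Assignment step: each point goes to its nearest center.
        d = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        # Update step: each center moves to the mean of its points
        # (an empty cluster keeps its current center).
        new = np.array([points[labels == j].mean(axis=0)
                        if np.any(labels == j) else centers[j]
                        for j in range(len(centers))])
        if np.allclose(new, centers):
            break
        centers = new
    return centers, labels
```

Because k-means only converges to a local solution, the quality of the supplied starting centers is exactly what the incremental heuristics above are designed to improve.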

Double bundle method for finding Clarke stationary points in nonsmooth DC programming

- Joki, Kaisa, Bagirov, Adil, Karmitsa, Napsu, Makela, Marko, Taheri, Sona

**Authors:** Joki, Kaisa; Bagirov, Adil; Karmitsa, Napsu; Makela, Marko; Taheri, Sona
**Date:** 2018
**Type:** Text, Journal article
**Relation:** SIAM Journal on Optimization Vol. 28, no. 2 (2018), p. 1892-1919
**Relation:** http://purl.org/au-research/grants/arc/DP140103213
**Description:** The aim of this paper is to introduce a new proximal double bundle method for unconstrained nonsmooth optimization, where the objective function is presented as a difference of two convex (DC) functions. The novelty in our method is a new escape procedure which enables us to guarantee approximate Clarke stationarity for solutions by utilizing the DC components of the objective function. This optimality condition is stronger than the criticality condition typically used in DC programming. Moreover, if a candidate solution is not approximate Clarke stationary, then the escape procedure returns a descent direction. With this escape procedure, we can avoid some shortcomings encountered when criticality is used. The finite termination of the double bundle method to an approximate Clarke stationary point is proved by assuming that the subdifferentials of DC components are polytopes. Finally, some encouraging numerical results are presented.


Solving DC programs using the cutting angle method

- Ferrer, Albert, Bagirov, Adil, Beliakov, Gleb

**Authors:** Ferrer, Albert; Bagirov, Adil; Beliakov, Gleb
**Date:** 2015
**Type:** Text, Journal article
**Relation:** Journal of Global Optimization Vol. 61, no. 1 (2015), p. 71-89
**Relation:** http://purl.org/au-research/grants/arc/DP140103213
**Full Text:** false
**Description:** In this paper, we propose a new algorithm for global minimization of functions represented as a difference of two convex functions. The proposed method is derivative-free and is designed by adapting the extended cutting angle method. We present preliminary results of numerical experiments using test problems with difference of convex objective functions and box constraints. We also compare the proposed algorithm with a classical one that uses prismatical subdivisions.

**Authors:** Bagirov, Adil; Ugon, Julien
**Date:** 2018
**Type:** Text, Journal article
**Relation:** Optimization Methods and Software Vol. 33, no. 1 (2018), p. 194-219
**Relation:** http://purl.org/au-research/grants/arc/DP140103213
**Full Text:** false
**Description:** The clusterwise linear regression problem is formulated as a nonsmooth nonconvex optimization problem using the squared regression error function. The objective function in this problem is represented as a difference of convex functions. Optimality conditions are derived, and an algorithm is designed based on such a representation. An incremental approach is proposed to generate starting solutions. The algorithm is tested on small to large data sets. © 2017 Informa UK Limited, trading as Taylor & Francis Group.

- Bagirov, Adil, Barton, Andrew, Mala-Jetmarova, Helena, Al Nuaimat, Alia, Ahmed, S. T., Sultanova, Nargiz, Yearwood, John

**Authors:** Bagirov, Adil; Barton, Andrew; Mala-Jetmarova, Helena; Al Nuaimat, Alia; Ahmed, S. T.; Sultanova, Nargiz; Yearwood, John
**Date:** 2013
**Type:** Text, Journal article
**Relation:** Mathematical and Computer Modelling Vol. 57, no. 3-4 (2013), p. 873-886
**Relation:** http://purl.org/au-research/grants/arc/LP0990908
**Full Text:** false
**Description:** The operation of a water distribution system is a complex task which involves scheduling of pumps, regulating water levels of storages, and providing satisfactory water quality to customers at the required flow and pressure. Pump scheduling is one of the most important tasks in the operation of a water distribution system, as it represents the major part of its operating costs. In this paper, a novel approach for modeling explicit pump scheduling to minimize energy consumption by pumps is introduced, which uses pump start/end run times as continuous variables and binary integer variables to describe the pump status at the beginning of the scheduling period. This is different from other approaches where binary integer variables for each hour are typically used, which is considered very impractical from an operational perspective. The problem is formulated as a mixed integer nonlinear programming problem, and a new algorithm is developed for its solution. This algorithm is based on the combination of grid search with the Hooke-Jeeves pattern search method. The performance of the algorithm is evaluated using test problems from the literature, applying the hydraulic simulation model EPANet. © 2012 Elsevier Ltd.

Comparison of metaheuristic algorithms for pump operation optimization

- Bagirov, Adil, Ahmed, S. T., Barton, Andrew, Mala-Jetmarova, Helena, Al Nuaimat, Alia, Sultanova, Nargiz

**Authors:** Bagirov, Adil; Ahmed, S. T.; Barton, Andrew; Mala-Jetmarova, Helena; Al Nuaimat, Alia; Sultanova, Nargiz
**Date:** 2012
**Type:** Text, Conference paper
**Relation:** 14th Water Distribution Systems Analysis Conference 2012, WDSA 2012 Vol. 2; Adelaide, Australia; 24th-27th September 2012; p. 886-896
**Relation:** http://purl.org/au-research/grants/arc/LP0990908
**Full Text:** false
**Description:** Pumping cost constitutes the main part of the overall operating cost of water distribution systems. There are different optimization formulations of the pumping cost minimization problem, including those applying continuous and integer programming approaches. To date, mainly various metaheuristics have been applied to solve this problem. However, a comprehensive comparison of these metaheuristics has not been done. Such a comparison is important to identify the strengths and weaknesses of different algorithms as reflected in their performance. In this paper, we present a methodology for comparative analysis of widely used metaheuristics for solving the pumping cost minimization problem. This methodology includes the following comparison criteria: (a) the "optimal solution" obtained; (b) efficiency; and (c) robustness. The algorithms applied are particle swarm optimization, the artificial bee colony algorithm and the firefly algorithm. These algorithms were applied to one test problem available in the literature. The results obtained demonstrate that the artificial bee colony algorithm is the most robust, and the firefly algorithm is the most efficient and accurate, for this test problem.
**Funding:** ARC

An algorithm for clustering based on non-smooth optimization techniques

- Bagirov, Adil, Rubinov, Alex, Sukhorukova, Nadezda, Yearwood, John

**Authors:** Bagirov, Adil; Rubinov, Alex; Sukhorukova, Nadezda; Yearwood, John
**Date:** 2003
**Type:** Text, Journal article
**Relation:** International Transactions in Operational Research Vol. 10, no. 6 (2003), p. 611-617
**Full Text:** false
**Description:** The problem of cluster analysis is formulated as a problem of non-smooth, non-convex optimization, and an algorithm for solving the cluster analysis problem based on non-smooth optimization techniques is developed. We discuss applications of this algorithm in large databases. Results of numerical experiments are presented to demonstrate the effectiveness of this algorithm.

An approximate subgradient algorithm for unconstrained nonsmooth, nonconvex optimization

- Bagirov, Adil, Ganjehlou, Asef Nazari

**Authors:** Bagirov, Adil; Ganjehlou, Asef Nazari
**Date:** 2008
**Type:** Text, Journal article
**Relation:** Mathematical Methods of Operations Research Vol. 67, no. 2 (2008), p. 187-206
**Relation:** http://purl.org/au-research/grants/arc/DP0666061
**Description:** In this paper a new algorithm for minimizing locally Lipschitz functions is developed. Descent directions in this algorithm are computed by solving a system of linear inequalities. The convergence of the algorithm is proved for quasidifferentiable semismooth functions. We present the results of numerical experiments with both regular and nonregular objective functions. We also compare the proposed algorithm with two different versions of the subgradient method using the results of numerical experiments. These results demonstrate the superiority of the proposed algorithm over the subgradient method. © 2007 Springer-Verlag.


Classification through incremental max-min separability

- Bagirov, Adil, Ugon, Julien, Webb, Dean, Karasozen, Bulent

**Authors:** Bagirov, Adil; Ugon, Julien; Webb, Dean; Karasozen, Bulent
**Date:** 2011
**Type:** Text, Journal article
**Relation:** Pattern Analysis and Applications Vol. 14, no. 2 (2011), p. 165-174
**Relation:** http://purl.org/au-research/grants/arc/DP0666061
**Full Text:** false
**Description:** Piecewise linear functions can be used to approximate non-linear decision boundaries between pattern classes. Piecewise linear boundaries are known to provide efficient real-time classifiers. However, they require a long training time. Finding piecewise linear boundaries between sets is a difficult optimization problem. Most approaches use heuristics to avoid solving this problem, which may lead to suboptimal piecewise linear boundaries. In this paper, we propose an algorithm for globally training hyperplanes using an incremental approach. Such an approach allows one to find a near global minimizer of the classification error function and to compute as few hyperplanes as needed for separating sets. We apply this algorithm for solving supervised data classification problems and report the results of numerical experiments on real-world data sets. These results demonstrate that the new algorithm requires a reasonable training time and its test set accuracy is consistently good on most data sets compared with mainstream classifiers. © 2010 Springer-Verlag London Limited.
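A max-min separability classifier of the kind discussed above evaluates a piecewise linear decision function: the maximum over groups of the minimum of each group's affine functions, with the sign giving the class. This is a minimal sketch under that assumption, with hypothetical names and shapes:

```python
import numpy as np

def maxmin_decision(x, W, b):
    """Max-min piecewise linear decision value for a single point x.
    W has shape (groups, planes_per_group, dim) and b has shape
    (groups, planes_per_group); classification is by the sign of
    the returned value."""
    vals = W @ x + b            # affine values, shape (groups, planes_per_group)
    return vals.min(axis=1).max()
```

Training amounts to choosing W and b to minimize a nonsmooth classification error over such functions, which is the optimization problem the incremental algorithm addresses.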
