Global optimization of marginal functions with applications to economic equilibrium
- Authors: Bagirov, Adil , Rubinov, Alex
- Date: 2001
- Type: Text , Journal article
- Relation: Journal of Global Optimization Vol. 20, no. 3-4 (Aug 2001), p. 215-237
- Full Text: false
- Reviewed:
- Description: We discuss the applicability of the cutting angle method to global minimization of marginal functions. The search for equilibrium prices in the exchange model can be reduced to the global minimization of certain functions, which include marginal functions. This problem has been approximately solved by the cutting angle method. Results of numerical experiments are presented and discussed.
A quasisecant method for minimizing nonsmooth functions
- Authors: Bagirov, Adil , Ganjehlou, Asef Nazari
- Date: 2010
- Type: Text , Journal article
- Relation: Optimization Methods and Software Vol. 25, no. 1 (2010), p. 3-18
- Relation: http://purl.org/au-research/grants/arc/DP0666061
- Full Text: false
- Reviewed:
- Description: We present an algorithm to locally minimize nonsmooth, nonconvex functions. In order to find descent directions, the notion of quasisecants, introduced in this paper, is applied. We prove that the algorithm converges to Clarke stationary points. Numerical results are presented demonstrating the applicability of the proposed algorithm to a wide variety of nonsmooth, nonconvex optimization problems. We also compare the proposed algorithm with the bundle method using numerical results.
A global optimization approach to classification
- Authors: Bagirov, Adil , Rubinov, Alex , Yearwood, John
- Date: 2002
- Type: Text , Journal article
- Relation: Optimization and Engineering Vol. 9, no. 7 (2002), p. 129-155
- Full Text: false
- Reviewed:
- Description: This paper presents a hybrid algorithm for finding the absolute extreme point of a multimodal scalar function of many variables. The algorithm is suitable when the objective function is expensive to compute, the computation can be affected by noise, and/or partial derivatives cannot be calculated. The method is a genetic modification of a previous algorithm based on Price's method. All information about the behavior of the objective function collected at previous iterates is used to choose new evaluation points. The genetic part of the algorithm is very effective at escaping local attractors and ensures convergence in probability to the global optimum. The proposed algorithm has been tested on a large set of multimodal test problems, outperforming both the modified Price's algorithm and the classical genetic approach.
Alexander Rubinov - An outstanding scholar
- Authors: Bagirov, Adil
- Date: 2010
- Type: Text , Journal article
- Relation: Pacific Journal of Optimization Vol. 6, no. 2, Suppl. 1 (2010), p. 203-209
- Full Text: false
Incremental DC optimization algorithm for large-scale clusterwise linear regression
- Authors: Bagirov, Adil , Taheri, Sona , Cimen, Emre
- Date: 2021
- Type: Text , Journal article
- Relation: Journal of Computational and Applied Mathematics Vol. 389 (2021), p. 1-17
- Relation: https://purl.org/au-research/grants/arc/DP190100580
- Full Text: false
- Reviewed:
- Description: The objective function in the nonsmooth optimization model of the clusterwise linear regression (CLR) problem with the squared regression error is represented as a difference of two convex functions. Then, using the difference of convex algorithm (DCA) approach, the CLR problem is replaced by a sequence of smooth unconstrained optimization subproblems. A new algorithm based on the DCA and the incremental approach is designed to solve the CLR problem. We apply the quasi-Newton method to solve the subproblems. The proposed algorithm is evaluated using several synthetic and real-world data sets for regression and compared with other algorithms for CLR. Results demonstrate that the DCA-based algorithm is efficient for solving CLR problems with a large number of data points and, in particular, outperforms other algorithms when the number of input variables is small. © 2020 Elsevier B.V.
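The DCA iteration sketched in the abstract above can be illustrated in a few lines: linearize the concave part at the current point, then minimize the resulting convex model. This is a generic textbook sketch, not the paper's incremental algorithm; the toy function, its DC split and the closed-form subproblem are invented for the example.

```python
def dca(subgrad_h, solve_subproblem, x0, max_iter=100, tol=1e-9):
    """Generic DCA for f = g - h: take y_k in the subdifferential of h
    at x_k, then minimize the convex model g(x) - y_k * x."""
    x = x0
    for _ in range(max_iter):
        y = subgrad_h(x)              # linearize the concave part -h
        x_new = solve_subproblem(y)   # argmin_x g(x) - y * x
        if abs(x_new - x) < tol:
            break
        x = x_new
    return x

# Toy DC function f(x) = x**2 - 2*|x| with g(x) = x**2, h(x) = 2*|x|.
# The convex subproblem min_x x**2 - y*x has the closed form x = y / 2.
x_star = dca(subgrad_h=lambda x: 2.0 if x >= 0 else -2.0,
             solve_subproblem=lambda y: y / 2.0,
             x0=0.5)   # converges to the local minimizer x = 1
```

Note that DCA only guarantees convergence to a critical point; starting from a negative x0 the same iteration would settle at x = -1 instead.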
Cutting angle method and a local search
- Authors: Bagirov, Adil , Rubinov, Alex
- Date: 2003
- Type: Text , Journal article
- Relation: Journal of Global Optimization Vol. 27, no. 2-3 (Nov 2003), p. 193-213
- Full Text: false
- Reviewed:
- Description: The paper deals with combinations of the cutting angle method in global optimization and a local search. We propose to use special transformed objective functions for each intermediate use of the cutting angle method. We report results of numerical experiments which demonstrate that the proposed approach is very beneficial in the search for a global minimum.
Application of derivative free methods for production optimization
- Authors: Bagirov, Adil , Mason, T. L. , Ghosh, Moumita
- Date: 2006
- Type: Text , Journal article
- Relation: Applied and Computational Mathematics Vol. 5, no. 1 (2006), p. 94-105
- Full Text: false
- Reviewed:
- Description: Continuous gas lift is a high-value optimization proposition in which high-pressure gas is injected at various depths into an oil production well to lighten the fluid column and so improve production and recovery. Gas lift optimization models, used as surrogates for optimization planning, are usually nonconvex and even nonsmooth. Moreover, in many situations the objective and/or constraint functions in these problems are not known analytically. Most traditional optimization methods cannot be applied to solve such problems, so derivative-free methods are a better choice. In this paper, we compare two different derivative-free methods, our variant of the discrete gradient method and the generalized descent method, for solving nonlinear gas lift optimization problems. We consider two different gas lift optimization problems whose objective functions are separable, nonsmooth and nonconvex. Although both algorithms produce satisfactory results, the discrete gradient method deals better with noisy data and produces better results.
A method for minimization of quasidifferentiable functions
- Authors: Bagirov, Adil
- Date: 2002
- Type: Text , Journal article
- Relation: Optimization Methods and Software Vol. 17, no. 1 (2002), p. 31-60
- Full Text: false
- Reviewed:
- Description: In this paper, we propose a new method for the unconstrained minimization of a function presented as a difference of two convex functions. This method is based on continuous approximations to the Demyanov-Rubinov quasidifferential. First, a terminating algorithm for the computation of a descent direction of the objective function is described. Then we present a minimization algorithm and study its convergence. An implementable version of this algorithm is discussed. Finally, we report the results of preliminary numerical experiments.
A heuristic algorithm for solving the minimum sum-of-squares clustering problems
- Authors: Ordin, Burak , Bagirov, Adil
- Date: 2015
- Type: Text , Journal article
- Relation: Journal of Global Optimization Vol. 61, no. 2 (2015), p. 341-361
- Relation: http://purl.org/au-research/grants/arc/DP140103213
- Full Text: false
- Reviewed:
- Description: Clustering is an important task in data mining. It can be formulated as a global optimization problem which is challenging for existing global optimization techniques even on medium-sized data sets. Various heuristics have been developed to solve the clustering problem; the global k-means and modified global k-means algorithms are among the most efficient heuristics for solving the minimum sum-of-squares clustering problem. However, these algorithms are not always accurate in finding global or near-global solutions. In this paper, we introduce a new algorithm to improve the accuracy of the modified global k-means algorithm in finding global solutions. We use an auxiliary cluster problem to generate a set of initial points and apply the k-means algorithm starting from these points to find the global solution to the clustering problem. Numerical results on 16 real-world data sets clearly demonstrate the superiority of the proposed algorithm over the global and modified global k-means algorithms in finding global solutions to clustering problems.
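The incremental idea behind the family of algorithms above, growing the number of clusters one at a time and refining with k-means after each addition, can be sketched as follows. The farthest-point seeding used here is a simplified stand-in for the paper's auxiliary-problem initialization, and the data set is invented.

```python
import numpy as np

def kmeans(X, centers, iters=100):
    """Lloyd's k-means refinement from a given set of initial centers."""
    for _ in range(iters):
        dist = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = dist.argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                        else centers[j] for j in range(len(centers))])
        if np.allclose(new, centers):
            break
        centers = new
    return centers

def incremental_seed(X, k):
    """Grow the center set one cluster at a time: each new center starts
    at the point farthest from the current centers, then all centers are
    refined by k-means before the next cluster is added."""
    centers = X.mean(axis=0)[None, :]
    for _ in range(1, k):
        dist = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2).min(axis=1)
        centers = np.vstack([centers, X[dist.argmax()]])
        centers = kmeans(X, centers)
    return centers
```

On two well-separated blobs this recovers both blob means regardless of which blob the first added center lands in, since k-means repairs the partition after each insertion.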
Codifferential method for minimizing nonsmooth DC functions
- Authors: Bagirov, Adil , Ugon, Julien
- Date: 2011
- Type: Text , Journal article
- Relation: Journal of Global Optimization Vol. 50, no. 1 (2011), p. 3-22
- Relation: http://purl.org/au-research/grants/arc/DP0666061
- Full Text: false
- Reviewed:
- Description: In this paper, a new algorithm to locally minimize nonsmooth functions represented as a difference of two convex functions (DC functions) is proposed. The algorithm is based on the concept of the codifferential. It is assumed that a DC decomposition of the objective function is known a priori. We develop an algorithm to compute descent directions using a few elements from the codifferential. The convergence of the minimization algorithm is studied and it is compared with different versions of the bundle method using results of numerical experiments. © 2010 Springer Science+Business Media, LLC.
An algorithm for clusterwise linear regression based on smoothing techniques
- Authors: Bagirov, Adil , Ugon, Julien , Mirzayeva, Hijran
- Date: 2014
- Type: Text , Journal article
- Relation: Optimization Letters Vol. 9, no. 2 (2014), p. 375-390
- Full Text: false
- Reviewed:
- Description: We propose an algorithm based on an incremental approach and smoothing techniques to solve clusterwise linear regression (CLR) problems. This algorithm incrementally divides the whole data set into groups which can be easily approximated by one linear regression function. A special procedure is introduced to generate an initial solution for solving global optimization problems at each iteration of the incremental algorithm. Such an approach allows one to find global or approximate global solutions to the CLR problems. The algorithm is tested using several data sets for regression analysis and compared with the multistart and incremental Spath algorithms.
Comparison of metaheuristic algorithms for pump operation optimization
- Authors: Bagirov, Adil , Ahmed, S. T. , Barton, Andrew , Mala-Jetmarova, Helena , Al Nuaimat, Alia , Sultanova, Nargiz
- Date: 2012
- Type: Text , Conference paper
- Relation: 14th Water Distribution Systems Analysis Conference 2012, WDSA 2012 Vol. 2; Adelaide, Australia; 24th-27th September 2012; p. 886-896
- Relation: http://purl.org/au-research/grants/arc/LP0990908
- Full Text: false
- Reviewed:
- Description: Pumping cost constitutes the main part of the overall operating cost of water distribution systems. There are different optimization formulations of the pumping cost minimization problem, including those based on continuous and integer programming approaches. To date, mainly metaheuristics have been applied to solve this problem. However, no comprehensive comparison of these metaheuristics has been carried out. Such a comparison is important to identify the strengths and weaknesses of different algorithms. In this paper, we present a methodology for comparative analysis of widely used metaheuristics for solving the pumping cost minimization problem. This methodology includes the following comparison criteria: (a) the "optimal solution" obtained; (b) efficiency; and (c) robustness. The algorithms compared are particle swarm optimization, the artificial bee colony algorithm and the firefly algorithm. These algorithms were applied to one test problem available in the literature. The results obtained demonstrate that the artificial bee colony is the most robust algorithm and the firefly is the most efficient and accurate algorithm for this test problem. Funding: ARC
Preface of the special issue OR: Connecting sciences supported by global optimization related to the 25th European conference on operational research (EURO XXV 2012)
- Authors: Bagirov, Adil , Miettinen, Kaisa , Weber, Gerhard-Wilhelm
- Date: 2014
- Type: Text , Journal article
- Relation: Journal of Global Optimization Vol. 60, no. 1 (June 2014), p. 1-3
- Full Text: false
- Reviewed:
Limited memory discrete gradient bundle method for nonsmooth derivative-free optimization
- Authors: Karmitsa, Napsu , Bagirov, Adil
- Date: 2012
- Type: Text , Journal article
- Relation: Optimization Vol. 61, no. 12 (2012), p. 1491-1509
- Full Text: false
- Reviewed:
- Description: Typically, practical nonsmooth optimization problems involve functions with hundreds of variables. Moreover, there are many practical problems where the computation of even one subgradient is either a difficult or an impossible task. In such cases derivative-free methods are the better (or only) choice since they do not use explicit computation of subgradients. However, these methods require a large number of function evaluations even for moderately large problems. In this article, we propose an efficient derivative-free limited memory discrete gradient bundle method for nonsmooth, possibly nonconvex optimization. The convergence of the proposed method is proved for locally Lipschitz continuous functions and the numerical experiments to be presented confirm the usability of the method especially for medium size and large-scale problems. © 2012 Copyright Taylor and Francis Group, LLC.
An algorithm for minimization of pumping costs in water distribution systems using a novel approach to pump scheduling
- Authors: Bagirov, Adil , Barton, Andrew , Mala-Jetmarova, Helena , Al Nuaimat, Alia , Ahmed, S. T. , Sultanova, Nargiz , Yearwood, John
- Date: 2013
- Type: Text , Journal article
- Relation: Mathematical and Computer Modelling Vol. 57, no. 3-4 (2013), p. 873-886
- Relation: http://purl.org/au-research/grants/arc/LP0990908
- Full Text: false
- Reviewed:
- Description: The operation of a water distribution system is a complex task which involves scheduling of pumps, regulating water levels of storages, and providing satisfactory water quality to customers at required flow and pressure. Pump scheduling is one of the most important tasks of the operation of a water distribution system as it represents the major part of its operating costs. In this paper, a novel approach for modeling of explicit pump scheduling to minimize energy consumption by pumps is introduced which uses the pump start/end run times as continuous variables, and binary integer variables to describe the pump status at the beginning of the scheduling period. This is different from other approaches where binary integer variables for each hour are typically used, which is considered very impractical from an operational perspective. The problem is formulated as a mixed integer nonlinear programming problem, and a new algorithm is developed for its solution. This algorithm is based on the combination of the grid search with the Hooke-Jeeves pattern search method. The performance of the algorithm is evaluated using literature test problems applying the hydraulic simulation model EPANet. © 2012 Elsevier Ltd.
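The Hooke-Jeeves component mentioned in the record above is a classical pattern search. A minimal textbook version (not the paper's grid-search hybrid, and with an invented smooth toy cost standing in for a pump-cost evaluation) looks like this:

```python
def hooke_jeeves(f, x0, step=1.0, shrink=0.5, tol=1e-6):
    """Basic Hooke-Jeeves pattern search: exploratory moves along each
    coordinate axis, followed by a pattern move through any improving point."""
    def explore(x, s):
        x = list(x)
        for i in range(len(x)):
            for d in (s, -s):
                trial = x[:]
                trial[i] += d
                if f(trial) < f(x):
                    x = trial
                    break
        return x

    base = list(x0)
    while step > tol:
        new = explore(base, step)
        if f(new) < f(base):
            # Pattern move: extrapolate through the improving point,
            # then explore around the extrapolated point.
            pattern = [2 * n - b for n, b in zip(new, base)]
            candidate = explore(pattern, step)
            base = candidate if f(candidate) < f(new) else new
        else:
            step *= shrink
    return base

# Invented toy cost with minimizer (3, -1).
xmin = hooke_jeeves(lambda x: (x[0] - 3) ** 2 + (x[1] + 1) ** 2, [0.0, 0.0])
```

Because it uses only function values, this kind of search tolerates the nonsmooth, simulation-based cost evaluations (e.g. from a hydraulic solver) that rule out gradient-based methods.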
Hyperbolic smoothing function method for minimax problems
- Authors: Bagirov, Adil , Al Nuaimat, Alia , Sultanova, Nargiz
- Date: 2013
- Type: Text , Journal article
- Relation: Optimization Vol. 62, no. 6 (2013), p. 759-782
- Full Text: false
- Reviewed:
- Description: In this article, an approach for solving finite minimax problems is proposed. This approach is based on the use of hyperbolic smoothing functions. In order to apply the hyperbolic smoothing, we reformulate the objective function in the minimax problem and study the relationship between the original minimax and reformulated problems. We also study the main properties of the hyperbolic smoothing function. Based on these results an algorithm for solving the finite minimax problem is proposed, and this algorithm is implemented in the General Algebraic Modeling System (GAMS). We present preliminary results of numerical experiments with well-known nonsmooth optimization test problems. We also compare the proposed algorithm with the algorithm that uses the exponential smoothing function as well as with the algorithm based on a nonlinear programming reformulation of the finite minimax problem. © 2013 Copyright Taylor and Francis Group, LLC.
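One common form of hyperbolic smoothing, illustrated below, replaces the kink max(0, t) by the smooth overestimate (t + sqrt(t^2 + tau^2)) / 2 and folds a finite max pairwise. This is a generic illustration of the smoothing idea, not the paper's exact reformulation of the minimax objective.

```python
import math

def smooth_max(values, tau=1e-3):
    """Hyperbolic smoothing of the max function: max(a, b) equals
    a + max(0, b - a), and max(0, t) is replaced by the smooth
    overestimate (t + sqrt(t*t + tau*tau)) / 2.  As tau -> 0 the
    result tends to the exact maximum from above."""
    m = values[0]
    for v in values[1:]:
        t = v - m
        m = m + 0.5 * (t + math.sqrt(t * t + tau * tau))
    return m

vals = [3.0, -1.0, 2.5]
approx = smooth_max(vals)   # slightly above the true maximum 3.0
```

The smoothed function is differentiable everywhere, so a standard smooth solver can be applied to it, with tau driven toward zero to recover the original minimax problem.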
Subgradient Method for Nonconvex Nonsmooth Optimization
- Authors: Bagirov, Adil , Jin, L. , Karmitsa, Napsu , Al Nuaimat, A. , Sultanova, Nargiz
- Date: 2012
- Type: Text , Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 157, no. 2 (2012), p. 416-435
- Full Text: false
- Reviewed:
- Description: In this paper, we introduce a new method for solving nonconvex nonsmooth optimization problems. It uses quasisecants, which are subgradients computed in some neighborhood of a point. The proposed method contains simple procedures for finding descent directions and for solving line search subproblems. The convergence of the method is studied and preliminary results of numerical experiments are presented. The comparison of the proposed method with the subgradient and the proximal bundle methods is demonstrated using results of numerical experiments. © 2012 Springer Science+Business Media, LLC.
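For contrast with the quasisecant-based method above, the plain subgradient method the authors compare against can be sketched in a few lines. This is a textbook version with a diminishing 1/k step, not the paper's algorithm, and the example function is invented.

```python
def subgradient_method(f, subgrad, x0, steps=200):
    """Plain subgradient method with diminishing step 1/k.  The iterates
    are not monotone, so the best point seen so far is tracked."""
    x = best = x0
    for k in range(1, steps + 1):
        x = x - subgrad(x) / k
        if f(x) < f(best):
            best = x
    return best

# Invented nonsmooth convex example: f(x) = |x - 2| with minimizer x = 2.
f = lambda x: abs(x - 2.0)
sg = lambda x: 1.0 if x >= 2.0 else -1.0   # a subgradient of f
x_best = subgradient_method(f, sg, 0.0)
```

The lack of a line search and of any descent guarantee per step is exactly what quasisecant and bundle-type methods improve on.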
Solving DC programs using the cutting angle method
- Authors: Ferrer, Albert , Bagirov, Adil , Beliakov, Gleb
- Date: 2015
- Type: Text , Journal article
- Relation: Journal of Global Optimization Vol. 61, no. 1 (2015), p. 71-89
- Relation: http://purl.org/au-research/grants/arc/DP140103213
- Full Text: false
- Reviewed:
- Description: In this paper, we propose a new algorithm for global minimization of functions represented as a difference of two convex functions. The proposed method is a derivative free method and it is designed by adapting the extended cutting angle method. We present preliminary results of numerical experiments using test problems with difference of convex objective functions and box-constraints. We also compare the proposed algorithm with a classical one that uses prismatical subdivisions.
A proximal bundle method for nonsmooth DC optimization utilizing nonconvex cutting planes
- Authors: Joki, Kaisa , Bagirov, Adil , Karmitsa, Napsu , Makela, Marko
- Date: 2017
- Type: Text , Journal article
- Relation: Journal of Global Optimization Vol. 68, no. 3 (2017), p. 501-535
- Relation: http://purl.org/au-research/grants/arc/DP140103213
- Full Text: false
- Reviewed:
- Description: In this paper, we develop a version of the bundle method to solve unconstrained difference of convex (DC) programming problems. It is assumed that a DC representation of the objective function is available. Our main idea is to utilize subgradients of both the first and second components in the DC representation. This subgradient information is gathered from some neighborhood of the current iteration point and is used to build separate approximations of each component in the DC representation. By combining these approximations we obtain a new nonconvex cutting plane model of the original objective function, which explicitly takes into account both the convex and the concave behavior of the objective function. We design the proximal bundle method for DC programming based on this new approach and prove the convergence of the method to an ε-critical point. The algorithm is tested on some academic test problems, and the preliminary numerical results show the good performance of the new bundle method. An interesting fact is that the new algorithm nearly always finds the global solution in our test problems.
Aggregate codifferential method for nonsmooth DC optimization
- Authors: Tor, Ali , Bagirov, Adil , Karasozen, Bulent
- Date: 2014
- Type: Text , Journal article
- Relation: Journal of Computational and Applied Mathematics Vol. 259, no. Part B (2014), p. 851-867
- Full Text: false
- Reviewed:
- Description: A new algorithm is developed based on the concept of codifferential for minimizing the difference of convex nonsmooth functions. Since the computation of the whole codifferential is not always possible, we use a fixed number of elements from the codifferential to compute the search directions. The convergence of the proposed algorithm is proved. The efficiency of the algorithm is demonstrated by comparing it with the subgradient, the truncated codifferential and the proximal bundle methods using nonsmooth optimization test problems.