Authors (number of records):

Ugon, Julien (5)
Karmitsa, Napsu (4)
Rubinov, Alex (4)
Sultanova, Nargiz (4)
Al Nuaimat, Alia (3)
Ahmed, S. T. (2)
Barton, Andrew (2)
Beliakov, Gleb (2)
Joki, Kaisa (2)
Karasozen, Bulent (2)
Makela, Marko (2)
Mala-Jetmarova, Helena (2)
Mason, T. L. (2)
Mirzayeva, Hijran (2)
Ordin, Burak (2)
Ozturk, Gurkan (2)
Taheri, Sona (2)
Yearwood, John (2)
Zhang, Jiapu (2)

Subjects (number of records):

0102 Applied Mathematics (22)
Nonsmooth optimization (16)
0802 Computation Theory and Mathematics (11)
Nonconvex optimization (9)
Subdifferential (6)
Derivative-free optimization (5)
DC programming (4)
Lipschitz programming (4)
0906 Electrical and Electronic Engineering (3)
Algorithms (3)
Bundle method (3)
Cluster analysis (3)
Cutting angle method (3)
DC functions (3)
Discrete gradient (3)
Discrete gradient method (3)
Global optimization (3)
Regression analysis (3)
Smoothing techniques (3)

Alexander Rubinov - An outstanding scholar

**Authors:** Bagirov, Adil **Date:** 2010 **Type:** Text, Journal article **Relation:** Pacific Journal of Optimization Vol. 6, no. 2, Suppl. 1 (2010), p. 203-209 **Full Text:** false

Minimizing nonsmooth DC functions via successive DC piecewise-affine approximations

- Gaudioso, Manlio, Giallombardo, Giovanni, Miglionico, Giovanna, Bagirov, Adil

**Authors:** Gaudioso, Manlio, Giallombardo, Giovanni, Miglionico, Giovanna, Bagirov, Adil **Date:** 2018 **Type:** Text, Journal article **Relation:** Journal of Global Optimization Vol. 71, no. 1 (2018), p. 37-55 **Full Text:** false **Reviewed:** **Description:** We introduce a proximal bundle method for the numerical minimization of a nonsmooth difference-of-convex (DC) function. Exploiting classic ideas from cutting-plane approaches for the convex case, we iteratively build two separate piecewise-affine approximations of the component functions, grouping the corresponding information in two separate bundles. In the bundle of the first component, only information related to points close to the current iterate is maintained, while the second bundle refers only to a global model of the corresponding component function. We combine the two convex piecewise-affine approximations and generate a DC piecewise-affine model, which can also be seen as the pointwise maximum of several concave piecewise-affine functions. Such a nonconvex model is locally approximated by means of an auxiliary quadratic program, whose solution is used to certify approximate criticality or to generate a descent search direction, along with a predicted reduction, that is next explored in a line-search setting. To improve the approximation properties at points that are far from the current iterate, a supplementary quadratic program is also introduced to generate an alternative, more promising search direction. We discuss the main convergence issues of the line-search-based proximal bundle method and provide computational results on a set of academic benchmark test problems. © 2017, Springer Science+Business Media, LLC.
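
The abstract describes two cutting-plane models, one per DC component. The following is a generic sketch of that construction; the bundle index sets B_1 and B_2, the trial points x_j and the subgradients g_j^i are notation introduced here for illustration, not taken from the paper.

```latex
% Generic DC decomposition with piecewise-affine (cutting-plane) models of the
% two components; combining them gives a pointwise maximum of concave
% piecewise-affine functions, as described in the abstract.
f(x) = f_1(x) - f_2(x), \qquad f_1,\ f_2 \ \text{convex}
\hat f_i(x) = \max_{j \in B_i} \bigl\{ f_i(x_j) + \langle g_j^i,\, x - x_j \rangle \bigr\},
  \qquad g_j^i \in \partial f_i(x_j), \quad i = 1, 2
\hat f(x) = \hat f_1(x) - \hat f_2(x)
          = \max_{j \in B_1} \bigl\{ a_j(x) - \hat f_2(x) \bigr\},
  \qquad a_j(x) = f_1(x_j) + \langle g_j^1,\, x - x_j \rangle
```

Each of the bracketed functions in the last line is concave and piecewise affine, which matches the abstract's description of the combined model.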

Application of derivative free methods for production optimization

- Bagirov, Adil, Mason, T. L., Ghosh, Moumita

**Authors:** Bagirov, Adil, Mason, T. L., Ghosh, Moumita **Date:** 2006 **Type:** Text, Journal article **Relation:** Applied and Computational Mathematics Vol. 5, no. 1 (2006), p. 94-105 **Full Text:** false **Reviewed:** **Description:** Continuous gas lift is a high-value optimization proposition in which high-pressure gas is injected at various depths into an oil production well to lighten the fluid column and so improve production and recovery. Gas lift optimization models, used as surrogates for production planning, are usually nonconvex and even nonsmooth. Moreover, in many situations the objective and/or constraint functions in these problems are not known analytically. Most traditional optimization methods cannot be applied to such problems, and derivative-free methods are a better choice for solving them. In this paper, we compare two derivative-free methods, our variant of the discrete gradient method and the generalized descent method, for solving nonlinear gas lift optimization problems. We consider two different gas lift optimization problems. The objective functions in these problems are separable, nonsmooth and nonconvex. Although both algorithms produce satisfactory results, the discrete gradient method deals better with noisy data and produces better results. **Description:** C1 **Description:** 2003001714
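
Because the objective here is available only through (possibly noisy) function evaluations, derivative-free methods probe the black box directly instead of using analytic derivatives. The sketch below is a minimal forward-difference descent loop under that assumption; it is a generic illustration only, not the discrete gradient or generalized descent method of the paper, and the quadratic toy objective stands in for the real gas lift cost, which would come from a simulator.

```python
import numpy as np

def forward_difference_gradient(f, x, h=1e-6):
    """Estimate a gradient of a black-box objective f at x using only
    function evaluations (n + 1 calls for n variables)."""
    fx = f(x)
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def derivative_free_descent(f, x0, step=0.1, tol=1e-6, max_iter=200):
    """Crude derivative-free descent with simple backtracking.
    A generic sketch, not the paper's discrete gradient method."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = forward_difference_gradient(f, x)
        if np.linalg.norm(g) < tol:
            break
        t = step
        while t > 1e-12 and f(x - t * g) >= f(x):  # backtrack until descent
            t *= 0.5
        if t <= 1e-12:
            break
        x = x - t * g
    return x

# Toy usage with a smooth stand-in objective (the real gas lift cost is a
# black box and may be nonsmooth and noisy).
x_opt = derivative_free_descent(lambda x: np.sum((x - 1.0) ** 2), np.zeros(3))
```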

A global optimization approach to classification

- Bagirov, Adil, Rubinov, Alex, Yearwood, John

**Authors:** Bagirov, Adil, Rubinov, Alex, Yearwood, John **Date:** 2002 **Type:** Text, Journal article **Relation:** Optimization and Engineering Vol. 9, no. 7 (2002), p. 129-155 **Full Text:** false **Reviewed:** **Description:** This paper presents a hybrid algorithm for finding the global extremum of a multimodal scalar function of many variables. The algorithm is suitable when the objective function is expensive to compute, the computation can be affected by noise, and/or partial derivatives cannot be calculated. The method is a genetic modification of a previous algorithm based on Price's method. All information about the behaviour of the objective function collected at previous iterates is used to choose new evaluation points. The genetic part of the algorithm is very effective at escaping local attractors and ensures convergence in probability to the global optimum. The proposed algorithm has been tested on a large set of multimodal test problems, outperforming both the modified Price's algorithm and the classical genetic approach. **Description:** C1 **Description:** 2003000061

Global optimization of marginal functions with applications to economic equilibrium

- Bagirov, Adil, Rubinov, Alex

**Authors:** Bagirov, Adil, Rubinov, Alex **Date:** 2001 **Type:** Text, Journal article **Relation:** Journal of Global Optimization Vol. 20, no. 3-4 (Aug 2001), p. 215-237 **Full Text:** false **Reviewed:** **Description:** We discuss the applicability of the cutting angle method to the global minimization of marginal functions. The search for equilibrium prices in the exchange model can be reduced to the global minimization of certain functions, which include marginal functions. This problem has been approximately solved by the cutting angle method. Results of numerical experiments are presented and discussed.
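
The cutting angle method mentioned here minimizes an increasing positively homogeneous (IPH) function by building saw-tooth underestimates from previously evaluated points. A generic sketch of that construction, with notation chosen here for illustration rather than taken from the paper, is:

```latex
% Saw-tooth underestimate used by the cutting angle method for an IPH
% function f evaluated at points x^1, ..., x^k > 0 (illustrative notation).
l^j_i = \frac{f(x^j)}{x^j_i}, \qquad
f(x) \ \ge\ \min_{1 \le i \le n} l^j_i\, x_i \quad \text{for all } x \ge 0,
\qquad
h_k(x) = \max_{1 \le j \le k}\ \min_{1 \le i \le n} l^j_i\, x_i \ \le\ f(x).
```

At each iteration the underestimate h_k is minimized (typically over the unit simplex), f is evaluated at the minimizer, and the new support vector refines the model.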

A method for minimization of quasidifferentiable functions

**Authors:** Bagirov, Adil **Date:** 2002 **Type:** Text, Journal article **Relation:** Optimization Methods and Software Vol. 17, no. 1 (2002), p. 31-60 **Full Text:** false **Reviewed:** **Description:** In this paper, we propose a new method for the unconstrained minimization of a function presented as a difference of two convex functions. This method is based on continuous approximations to the Demyanov-Rubinov quasidifferential. First, a terminating algorithm for the computation of a descent direction of the objective function is described. Then we present a minimization algorithm and study its convergence. An implementable version of this algorithm is discussed. Finally, we report the results of preliminary numerical experiments. **Description:** C1 **Description:** 2003000064

A proximal bundle method for nonsmooth DC optimization utilizing nonconvex cutting planes

- Joki, Kaisa, Bagirov, Adil, Karmitsa, Napsu, Makela, Marko

**Authors:** Joki, Kaisa, Bagirov, Adil, Karmitsa, Napsu, Makela, Marko **Date:** 2017 **Type:** Text, Journal article **Relation:** Journal of Global Optimization Vol. 68, no. 3 (2017), p. 501-535 **Relation:** http://purl.org/au-research/grants/arc/DP140103213 **Full Text:** false **Reviewed:** **Description:** In this paper, we develop a version of the bundle method to solve unconstrained difference of convex (DC) programming problems. It is assumed that a DC representation of the objective function is available. Our main idea is to utilize subgradients of both the first and second components in the DC representation. This subgradient information is gathered from some neighborhood of the current iteration point and is used to build a separate approximation for each component in the DC representation. By combining these approximations we obtain a new nonconvex cutting plane model of the original objective function, which explicitly takes into account both the convex and the concave behavior of the objective function. We design the proximal bundle method for DC programming based on this new approach and prove the convergence of the method to an ε-critical point. The algorithm is tested using some academic test problems, and the preliminary numerical results show the good performance of the new bundle method. An interesting fact is that the new algorithm nearly always finds the global solution in our test problems.

**Authors:** Bagirov, Adil, Ugon, Julien **Date:** 2018 **Type:** Text, Journal article **Relation:** Optimization Methods and Software Vol. 33, no. 1 (2018), p. 194-219 **Relation:** http://purl.org/au-research/grants/arc/DP140103213 **Full Text:** false **Reviewed:** **Description:** The clusterwise linear regression problem is formulated as a nonsmooth nonconvex optimization problem using the squared regression error function. The objective function in this problem is represented as a difference of convex functions. Optimality conditions are derived, and an algorithm is designed based on such a representation. An incremental approach is proposed to generate starting solutions. The algorithm is tested on small to large data sets. © 2017 Informa UK Limited, trading as Taylor & Francis Group.

Comparison of metaheuristic algorithms for pump operation optimization

- Bagirov, Adil, Ahmed, S. T., Barton, Andrew, Mala-Jetmarova, Helena, Al Nuaimat, Alia, Sultanova, Nargiz

**Authors:** Bagirov, Adil, Ahmed, S. T., Barton, Andrew, Mala-Jetmarova, Helena, Al Nuaimat, Alia, Sultanova, Nargiz **Date:** 2012 **Type:** Text, Conference paper **Relation:** 14th Water Distribution Systems Analysis Conference 2012, WDSA 2012 Vol. 2; Adelaide, Australia; 24th-27th September 2012; p. 886-896 **Relation:** http://purl.org/au-research/grants/arc/LP0990908 **Full Text:** false **Reviewed:** **Description:** Pumping cost constitutes the main part of the overall operating cost of water distribution systems. There are different optimization formulations of the pumping cost minimization problem, including those based on continuous and integer programming approaches. To date, mainly various metaheuristics have been applied to solve this problem; however, a comprehensive comparison of these metaheuristics has not been carried out. Such a comparison is important for identifying the strengths and weaknesses of different algorithms, which are reflected in their performance. In this paper, we present a methodology for the comparative analysis of widely used metaheuristics for solving the pumping cost minimization problem. This methodology includes the following comparison criteria: (a) the "optimal solution" obtained; (b) efficiency; and (c) robustness. The algorithms applied are particle swarm optimization, the artificial bee colony algorithm and the firefly algorithm. These algorithms were applied to one test problem available in the literature. The results obtained demonstrate that the artificial bee colony algorithm is the most robust, and the firefly algorithm the most efficient and accurate, for this test problem. Funding: ARC
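
Of the three metaheuristics compared, particle swarm optimization is the easiest to sketch compactly. The code below is the textbook PSO update (inertia plus cognitive and social terms), not the specific implementation or parameter settings used in the paper; the quadratic toy objective stands in for a pump-operation cost that would normally come from a hydraulic simulator.

```python
import numpy as np

def particle_swarm_minimize(f, bounds, n_particles=30, iters=200,
                            w=0.7, c1=1.5, c2=1.5, seed=0):
    """Textbook particle swarm optimization (a generic sketch, not the exact
    configuration used in the paper). `f` maps a 1-D array to a scalar cost."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    x = rng.uniform(lo, hi, size=(n_particles, lo.size))   # positions
    v = np.zeros_like(x)                                    # velocities
    pbest = x.copy()
    pbest_val = np.array([f(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Toy usage; a real pump scheduling cost would come from a hydraulic simulator.
best_x, best_cost = particle_swarm_minimize(
    lambda z: np.sum(z ** 2), bounds=([-5, -5], [5, 5]))
```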

Limited memory discrete gradient bundle method for nonsmooth derivative-free optimization

- Karmitsa, Napsu, Bagirov, Adil

**Authors:** Karmitsa, Napsu, Bagirov, Adil **Date:** 2012 **Type:** Text, Journal article **Relation:** Optimization Vol. 61, no. 12 (2012), p. 1491-1509 **Full Text:** false **Reviewed:** **Description:** Typically, practical nonsmooth optimization problems involve functions with hundreds of variables. Moreover, there are many practical problems where the computation of even one subgradient is either a difficult or an impossible task. In such cases derivative-free methods are the better (or only) choice, since they do not use explicit computation of subgradients. However, these methods require a large number of function evaluations even for moderately large problems. In this article, we propose an efficient derivative-free limited memory discrete gradient bundle method for nonsmooth, possibly nonconvex optimization. The convergence of the proposed method is proved for locally Lipschitz continuous functions, and the numerical experiments presented confirm the usability of the method, especially for medium-size and large-scale problems. © 2012 Copyright Taylor and Francis Group, LLC. **Description:** 2003010398

A quasisecant method for minimizing nonsmooth functions

- Bagirov, Adil, Ganjehlou, Asef Nazari

**Authors:** Bagirov, Adil, Ganjehlou, Asef Nazari **Date:** 2010 **Type:** Text, Journal article **Relation:** Optimization Methods and Software Vol. 25, no. 1 (2010), p. 3-18 **Relation:** http://purl.org/au-research/grants/arc/DP0666061 **Full Text:** false **Reviewed:** **Description:** We present an algorithm to locally minimize nonsmooth, nonconvex functions. In order to find descent directions, the notion of quasisecants, introduced in this paper, is applied. We prove that the algorithm converges to Clarke stationary points. Numerical results are presented demonstrating the applicability of the proposed algorithm to a wide variety of nonsmooth, nonconvex optimization problems. We also compare the proposed algorithm with the bundle method using numerical results.

Codifferential method for minimizing nonsmooth DC functions

**Authors:** Bagirov, Adil, Ugon, Julien **Date:** 2011 **Type:** Text, Journal article **Relation:** Journal of Global Optimization Vol. 50, no. 1 (2011), p. 3-22 **Relation:** http://purl.org/au-research/grants/arc/DP0666061 **Full Text:** false **Reviewed:** **Description:** In this paper, a new algorithm to locally minimize nonsmooth functions represented as a difference of two convex functions (DC functions) is proposed. The algorithm is based on the concept of the codifferential. It is assumed that a DC decomposition of the objective function is known a priori. We develop an algorithm to compute descent directions using only a few elements of the codifferential. The convergence of the minimization algorithm is studied, and it is compared with different versions of the bundle method using the results of numerical experiments. © 2010 Springer Science+Business Media, LLC.

- Bagirov, Adil, Miettinen, Kaisa, Weber, Gerhard-Wilhelm

**Authors:** Bagirov, Adil, Miettinen, Kaisa, Weber, Gerhard-Wilhelm **Date:** 2014 **Type:** Text, Journal article **Relation:** Journal of Global Optimization Vol. 60, no. 1 (June 2014), p. 1-3 **Full Text:** false **Reviewed:** **Description:** C1

A heuristic algorithm for solving the minimum sum-of-squares clustering problems

**Authors:** Ordin, Burak, Bagirov, Adil **Date:** 2015 **Type:** Text, Journal article **Relation:** Journal of Global Optimization Vol. 61, no. 2 (2015), p. 341-361 **Relation:** http://purl.org/au-research/grants/arc/DP140103213 **Full Text:** false **Reviewed:** **Description:** Clustering is an important task in data mining. It can be formulated as a global optimization problem which is challenging for existing global optimization techniques, even in medium-size data sets. Various heuristics have been developed to solve the clustering problem. The global k-means and modified global k-means algorithms are among the most efficient heuristics for solving the minimum sum-of-squares clustering problem. However, these algorithms are not always accurate in finding global or near-global solutions to the clustering problem. In this paper, we introduce a new algorithm to improve the accuracy of the modified global k-means algorithm in finding global solutions. We use an auxiliary cluster problem to generate a set of initial points and apply the k-means algorithm starting from these points to find the global solution to the clustering problem. Numerical results on 16 real-world data sets clearly demonstrate the superiority of the proposed algorithm over the global and modified global k-means algorithms in finding global solutions to clustering problems.
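
For readers unfamiliar with the incremental (global-k-means-style) strategy the abstract builds on, the sketch below shows the minimum sum-of-squares objective and a simplified incremental seeding loop: solve the j-cluster problem for j = 1, ..., k, adding one center at a time. It illustrates the general idea only; the paper's auxiliary cluster problem and starting-point generation are not reproduced here, and all function names are introduced for this sketch.

```python
import numpy as np

def mssc_objective(X, centers):
    """Minimum sum-of-squares clustering objective: sum over data points of
    the squared distance to the nearest center."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return d2.min(axis=1).sum()

def lloyd(X, centers, iters=100):
    """Standard k-means (Lloyd) iterations from given starting centers."""
    for _ in range(iters):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(len(centers))])
        if np.allclose(new, centers):
            break
        centers = new
    return centers

def incremental_kmeans(X, k):
    """Simplified incremental seeding: grow from the 1-cluster solution,
    adding one center at a time at the data point that most reduces the
    objective, then refining with k-means. A sketch of the incremental idea
    only, not the auxiliary-problem algorithm of the paper."""
    centers = X.mean(axis=0, keepdims=True)          # 1-cluster solution
    for _ in range(1, k):
        best_pt, best_val = None, np.inf
        for p in X:                                   # candidate new center
            val = mssc_objective(X, np.vstack([centers, p]))
            if val < best_val:
                best_pt, best_val = p, val
        centers = lloyd(X, np.vstack([centers, best_pt]))
    return centers
```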

A sharp augmented Lagrangian-based method in constrained non-convex optimization

- Bagirov, Adil, Ozturk, Gurkan, Kasimbeyli, Refail

**Authors:** Bagirov, Adil, Ozturk, Gurkan, Kasimbeyli, Refail **Date:** 2019 **Type:** Text, Journal article **Relation:** Optimization Methods and Software Vol. 34, no. 3 (2019), p. 462-488 **Full Text:** false **Reviewed:** **Description:** In this paper, a novel sharp augmented Lagrangian-based global optimization method is developed for solving constrained non-convex optimization problems. The algorithm consists of outer and inner loops. At each inner iteration, the discrete gradient method is applied to minimize the sharp augmented Lagrangian function. Depending on the solution found, the algorithm stops, updates the dual variables in the inner loop, or updates the upper or lower bounds by going to the outer loop. Convergence results for the proposed method are presented. The performance of the method is demonstrated using a wide range of nonlinear smooth and non-smooth constrained optimization test problems from the literature.

Hyperbolic smoothing function method for minimax problems

- Bagirov, Adil, Al Nuaimat, Alia, Sultanova, Nargiz

**Authors:** Bagirov, Adil, Al Nuaimat, Alia, Sultanova, Nargiz **Date:** 2013 **Type:** Text, Journal article **Relation:** Optimization Vol. 62, no. 6 (2013), p. 759-782 **Full Text:** false **Reviewed:** **Description:** In this article, an approach for solving finite minimax problems is proposed. This approach is based on the use of hyperbolic smoothing functions. In order to apply the hyperbolic smoothing, we reformulate the objective function in the minimax problem and study the relationship between the original minimax and reformulated problems. We also study the main properties of the hyperbolic smoothing function. Based on these results, an algorithm for solving the finite minimax problem is proposed and implemented in the General Algebraic Modelling System (GAMS). We present preliminary results of numerical experiments with well-known nonsmooth optimization test problems. We also compare the proposed algorithm with an algorithm that uses the exponential smoothing function, as well as with an algorithm based on a nonlinear programming reformulation of the finite minimax problem. © 2013 Copyright Taylor and Francis Group, LLC. **Description:** 2003011099
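
A common form of the hyperbolic smoothing function, and one standard exact reformulation of the finite minimax objective to which it can be applied, are sketched below; the notation is chosen here for illustration and is not necessarily that of the paper.

```latex
% Hyperbolic smoothing of t -> max{0, t} and a standard reformulation of the
% finite minimax objective (generic sketch).
\phi_\tau(t) = \frac{t + \sqrt{t^2 + \tau^2}}{2}, \qquad
0 < \phi_\tau(t) - \max\{0,\, t\} \le \frac{\tau}{2},
\max_{1 \le i \le m} f_i(x)
  = \min_{t \in \mathbb{R}} \Bigl[\, t + \sum_{i=1}^{m} \max\{0,\ f_i(x) - t\} \Bigr]
  \ \approx\ \min_{t \in \mathbb{R}} \Bigl[\, t + \sum_{i=1}^{m} \phi_\tau\bigl(f_i(x) - t\bigr) \Bigr].
```

As the smoothing parameter τ tends to zero, the smooth approximation converges uniformly to the original minimax objective, which is what allows standard smooth solvers to be applied.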

- Bagirov, Adil, Barton, Andrew, Mala-Jetmarova, Helena, Al Nuaimat, Alia, Ahmed, S. T., Sultanova, Nargiz, Yearwood, John

**Authors:** Bagirov, Adil, Barton, Andrew, Mala-Jetmarova, Helena, Al Nuaimat, Alia, Ahmed, S. T., Sultanova, Nargiz, Yearwood, John **Date:** 2013 **Type:** Text, Journal article **Relation:** Mathematical and Computer Modelling Vol. 57, no. 3-4 (2013), p. 873-886 **Relation:** http://purl.org/au-research/grants/arc/LP0990908 **Full Text:** false **Reviewed:** **Description:** The operation of a water distribution system is a complex task which involves scheduling of pumps, regulating water levels of storages, and providing satisfactory water quality to customers at the required flow and pressure. Pump scheduling is one of the most important tasks in the operation of a water distribution system, as it represents the major part of its operating costs. In this paper, a novel approach to modeling explicit pump scheduling to minimize the energy consumed by pumps is introduced; it uses the pump start/end run times as continuous variables and binary integer variables to describe the pump status at the beginning of the scheduling period. This is different from other approaches, where binary integer variables for each hour are typically used, which is considered very impractical from an operational perspective. The problem is formulated as a mixed integer nonlinear programming problem, and a new algorithm is developed for its solution. This algorithm is based on a combination of grid search with the Hooke-Jeeves pattern search method. The performance of the algorithm is evaluated on test problems from the literature using the hydraulic simulation model EPANET. © 2012 Elsevier Ltd. **Description:** 2003010583
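
The Hooke-Jeeves pattern search mentioned in the abstract alternates exploratory coordinate moves with pattern (extrapolation) moves and shrinks the step size when no improvement is found. The sketch below is the classic textbook version of that local search for a generic black-box objective; it is not the paper's combined grid-search/MINLP algorithm, and the toy objective is a placeholder.

```python
import numpy as np

def hooke_jeeves(f, x0, step=1.0, shrink=0.5, tol=1e-6, max_iter=1000):
    """Classic Hooke-Jeeves pattern search (a generic sketch of the local
    search ingredient named in the abstract)."""
    def explore(base, s):
        """Exploratory moves: try +/- s along each coordinate, keep improvements."""
        x, fx = base.copy(), f(base)
        for i in range(len(x)):
            for d in (s, -s):
                trial = x.copy()
                trial[i] += d
                ft = f(trial)
                if ft < fx:
                    x, fx = trial, ft
                    break
        return x, fx

    x_base = np.asarray(x0, dtype=float)
    f_base = f(x_base)
    s, it = step, 0
    while s > tol and it < max_iter:
        it += 1
        x_new, f_new = explore(x_base, s)
        if f_new < f_base:
            # Pattern move: jump along the successful direction, then explore again.
            x_pat = x_new + (x_new - x_base)
            x_base, f_base = x_new, f_new
            x_try, f_try = explore(x_pat, s)
            if f_try < f_base:
                x_base, f_base = x_try, f_try
        else:
            s *= shrink   # no improvement: reduce the step size
    return x_base, f_base

# Toy usage on a smooth test function.
x_star, f_star = hooke_jeeves(lambda z: (z[0] - 3) ** 2 + (z[1] + 1) ** 2,
                              np.zeros(2))
```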

An incremental clustering algorithm based on hyperbolic smoothing

- Bagirov, Adil, Ordin, Burak, Ozturk, Gurkan, Xavier, Adilson

**Authors:** Bagirov, Adil, Ordin, Burak, Ozturk, Gurkan, Xavier, Adilson **Date:** 2015 **Type:** Text, Journal article **Relation:** Computational Optimization and Applications Vol. 61, no. 1 (2015), p. 219-241 **Relation:** http://purl.org/au-research/grants/arc/DP140103213 **Full Text:** false **Reviewed:** **Description:** Clustering is an important problem in data mining. It can be formulated as a nonsmooth, nonconvex optimization problem. For most global optimization techniques this problem is challenging, even in medium-size data sets. In this paper, we propose an approach that allows one to apply local methods of smooth optimization to solve clustering problems. We apply an incremental approach to generate starting points for cluster centers, which enables us to deal with the nonconvexity of the problem. The hyperbolic smoothing technique is applied to handle the nonsmoothness of the clustering problem and to make it possible to apply smooth optimization algorithms. Results of numerical experiments with eleven real-world data sets, and a comparison with state-of-the-art incremental clustering algorithms, demonstrate that smooth optimization algorithms combined with the incremental approach are a powerful alternative to existing clustering algorithms.

Nonsmooth optimization algorithm for solving clusterwise linear regression problems

- Bagirov, Adil, Ugon, Julien, Mirzayeva, Hijran

**Authors:** Bagirov, Adil, Ugon, Julien, Mirzayeva, Hijran **Date:** 2015 **Type:** Text, Journal article **Relation:** Journal of Optimization Theory and Applications Vol. 164, no. 3 (2015), p. 755-780 **Relation:** http://purl.org/au-research/grants/arc/DP140103213 **Full Text:** false **Reviewed:** **Description:** Clusterwise linear regression consists of finding a number of linear regression functions, each approximating a subset of the data. In this paper, the clusterwise linear regression problem is formulated as a nonsmooth nonconvex optimization problem, and an algorithm based on an incremental approach and the discrete gradient method of nonsmooth optimization is designed to solve it. This algorithm incrementally divides the whole dataset into groups which can each be easily approximated by one linear regression function. A special procedure is introduced to generate good starting points for solving the global optimization problems at each iteration of the incremental algorithm. The algorithm is compared with the multi-start Späth and incremental algorithms on several publicly available datasets for regression analysis.
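
The nonsmoothness referred to in this and the related clusterwise regression records comes from charging each data point the minimum of its squared errors over the k regression functions. A minimal sketch of that objective, with variable names introduced here for illustration, is below; the incremental and discrete gradient machinery of the paper is not reproduced.

```python
import numpy as np

def clusterwise_regression_error(X, y, W, b):
    """Nonsmooth clusterwise linear regression objective (generic sketch):
    each data point is charged the squared error of whichever of the k
    linear functions fits it best, and the errors are summed.
    X: (n, d) features, y: (n,) targets, W: (k, d) slopes, b: (k,) intercepts."""
    preds = X @ W.T + b                    # (n, k) predictions of every line
    errs = (preds - y[:, None]) ** 2
    return errs.min(axis=1).sum()          # min over lines, sum over points

# Toy usage: data generated from two lines, evaluated with two regression functions.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 1))
y = np.where(X[:, 0] > 0, 2 * X[:, 0] + 1, -3 * X[:, 0]) + 0.05 * rng.standard_normal(100)
W, b = np.array([[2.0], [-3.0]]), np.array([1.0, 0.0])
print(clusterwise_regression_error(X, y, W, b))
```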
