Gateaux differentiability revisited
- Authors: Abbasi, Malek , Kruger, Alexander , Théra, Michel
- Date: 2021
- Type: Text , Journal article
- Relation: Applied Mathematics and Optimization Vol. 84, no. 3 (2021), p. 3499-3516
- Relation: http://purl.org/au-research/grants/arc/DP160100854
- Full Text:
- Reviewed:
- Description: We revisit some basic concepts and ideas of the classical differential calculus and convex analysis extending them to a broader frame. We reformulate and generalize the notion of Gateaux differentiability and propose new notions of generalized derivative and generalized subdifferential in an arbitrary topological vector space. Meaningful examples preserving the key properties of the original notion of derivative are provided. © 2021, The Author(s), under exclusive licence to Springer Science+Business Media, LLC part of Springer Nature.
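For reference, the classical notion that this paper generalizes (a standard textbook statement, not the paper's extended definition in arbitrary topological vector spaces): a map f between normed spaces is Gateaux differentiable at x when the limit below exists for every direction h and depends linearly and continuously on h.

```latex
% Classical Gateaux derivative of f at x in direction h:
f'(x; h) \;=\; \lim_{t \to 0} \frac{f(x + t h) - f(x)}{t}
% f is Gateaux differentiable at x when f'(x; h) exists for all h
% and h \mapsto f'(x; h) is linear and continuous.
```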
Nonsmooth Lyapunov pairs for differential inclusions governed by operators with nonempty interior domain
- Authors: Adly, Samir , Hantoute, Abderrahim , Théra, Michel

- Date: 2016
- Type: Text , Journal article
- Relation: Mathematical Programming Vol. 157, no. 2 (2016), p. 349-374
- Full Text:
- Reviewed:
- Description: The general theory of Lyapunov stability of first-order differential inclusions in Hilbert spaces has been studied by the authors in the previous paper (Adly et al. in Nonlinear Anal 75(3): 985–1008, 2012). This new contribution focuses on the case when the interior of the domain of the maximally monotone operator governing the given differential inclusion is nonempty; this includes in a natural way the finite-dimensional case. The current setting leads to simplified, more explicit criteria and permits some flexibility in the choice of the generalized subdifferentials. Some consequences of the viability of closed sets are given. Our analysis makes use of standard tools from convex and variational analysis. © 2015, Springer-Verlag Berlin Heidelberg and Mathematical Optimization Society.
On Hölder calmness of solution mappings in parametric equilibrium problems
- Authors: Anh, Lam Quoc , Kruger, Alexander , Thao, Nguyen
- Date: 2012
- Type: Text , Journal article
- Relation: TOP Vol. 22, no. 1 (2012), p. 331-342
- Full Text:
- Reviewed:
- Description: We consider parametric equilibrium problems in metric spaces. Sufficient conditions for the Hölder calmness of solutions are established. We also study the Hölder well-posedness for equilibrium problems in metric spaces.
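For orientation, a standard formulation of Hölder calmness for a solution map S between metric spaces is the following (the paper's exact constants and neighborhoods may differ):

```latex
% S is Holder calm at (\bar p, \bar x) \in \operatorname{gph} S if there exist
% \ell > 0, \alpha \in (0, 1] and neighborhoods U of \bar p, V of \bar x with
d\bigl(x, S(\bar p)\bigr) \;\le\; \ell \, d(p, \bar p)^{\alpha}
\quad \text{for all } p \in U, \; x \in S(p) \cap V .
```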
On Hölder continuity of solution maps of parametric primal and dual Ky Fan inequalities
- Authors: Anh, Lam Quoc , Khanh, Phan Quoc , Tam, T. N.
- Date: 2015
- Type: Text , Journal article
- Relation: Top Vol. 23, no. 1 (2015), p. 151-167
- Full Text: false
- Reviewed:
- Description: We consider parametric primal and dual Ky Fan inequalities in metric linear spaces. Sufficient conditions for Hölder continuity of solutions are established. Many examples are provided to illustrate the essentialness of the imposed assumptions and the advantages of the results over existing ones. As applications, we derive the Hölder continuity of solutions for constrained minimization and variational inequalities.
Comparative study of RPSALG algorithm for convex semi-infinite programming
- Authors: Auslender, Alfred , Ferrer, Albert , Goberna, Miguel , López, Marco
- Date: 2014
- Type: Text , Journal article
- Relation: Computational Optimization and Applications Vol. 60, no. 1 (2014), p. 59-87
- Full Text: false
- Reviewed:
- Description: The Remez penalty and smoothing algorithm (RPSALG) is a unified framework for penalty and smoothing methods for solving min-max convex semi-infinite programming problems, whose convergence was analyzed in a previous paper by three of the authors. In this paper we consider a partial implementation of RPSALG for solving ordinary convex semi-infinite programming problems. Each iteration of RPSALG involves two types of auxiliary optimization problems: the first consists of obtaining an approximate solution of some discretized convex problem, while the second requires solving a non-convex optimization problem involving the parametric constraints as objective function, with the parameter as variable. In this paper we tackle the latter problem with a variant of the cutting angle method called ECAM, a global optimization procedure for solving Lipschitz programming problems. We implement different variants of RPSALG, which are compared with the only publicly available SIP solver, NSIPS, on a battery of test problems.
Global optimization of marginal functions with applications to economic equilibrium
- Authors: Bagirov, Adil , Rubinov, Alex
- Date: 2001
- Type: Text , Journal article
- Relation: Journal of Global Optimization Vol. 20, no. 3-4 (Aug 2001), p. 215-237
- Full Text: false
- Reviewed:
- Description: We discuss the applicability of the cutting angle method to global minimization of marginal functions. The search for equilibrium prices in the exchange model can be reduced to the global minimization of certain functions, which include marginal functions. This problem has been approximately solved by the cutting angle method. Results of numerical experiments are presented and discussed.
A quasisecant method for minimizing nonsmooth functions
- Authors: Bagirov, Adil , Ganjehlou, Asef Nazari
- Date: 2010
- Type: Text , Journal article
- Relation: Optimization Methods and Software Vol. 25, no. 1 (2010), p. 3-18
- Relation: http://purl.org/au-research/grants/arc/DP0666061
- Full Text: false
- Reviewed:
- Description: We present an algorithm to locally minimize nonsmooth, nonconvex functions. In order to find descent directions, the notion of quasisecants, introduced in this paper, is applied. We prove that the algorithm converges to Clarke stationary points. Numerical results are presented demonstrating the applicability of the proposed algorithm to a wide variety of nonsmooth, nonconvex optimization problems. We also compare the proposed algorithm with the bundle method using numerical results.
A global optimization approach to classification
- Authors: Bagirov, Adil , Rubinov, Alex , Yearwood, John
- Date: 2002
- Type: Text , Journal article
- Relation: Optimization and Engineering Vol. 9, no. 7 (2002), p. 129-155
- Full Text: false
- Reviewed:
- Description: In this paper, a hybrid algorithm is presented for finding the absolute extreme point of a multimodal scalar function of many variables. The algorithm is suitable when the objective function is expensive to compute, the computation can be affected by noise, and/or partial derivatives cannot be calculated. The method used is a genetic modification of a previous algorithm based on Price's method. All information about the behavior of the objective function collected at previous iterates is used to choose new evaluation points. The genetic part of the algorithm is very effective at escaping from local attractors and ensures convergence in probability to the global optimum. The proposed algorithm has been tested on a large set of multimodal test problems, outperforming both the modified Price's algorithm and a classical genetic approach.
- Description: C1
- Description: 2003000061
Max-min separability
- Authors: Bagirov, Adil
- Date: 2005
- Type: Text , Journal article
- Relation: Optimization Methods and Software Vol. 20, no. 2-3 (2005), p. 271-290
- Full Text:
- Reviewed:
- Description: We consider the problem of discriminating two finite point sets in n-dimensional space by a finite number of hyperplanes generating a piecewise linear function. If the intersection of these sets is empty, then they can be strictly separated by a max-min of linear functions. An error function is introduced. This function is nonconvex and piecewise linear. We discuss an algorithm for its minimization. The results of numerical experiments using some real-world datasets are presented, which show the effectiveness of the proposed approach.
- Description: C1
- Description: 2003001350
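The max-min separating function described in the abstract can be sketched directly; this is an illustrative toy, not the paper's minimization algorithm, and the group structure and helper names are hypothetical.

```python
import numpy as np

def max_min_value(x, groups):
    """Evaluate the piecewise linear max-min function
    f(x) = max_j min_{(w, b) in group j} (w . x + b)."""
    return max(min(w @ x + b for (w, b) in g) for g in groups)

def separation_error(A, B, groups):
    """Fraction of misclassified points under a strict max-min
    separation: points of A should satisfy f(a) < 0, points of B
    should satisfy f(b) > 0."""
    errs = sum(max_min_value(a, groups) >= 0 for a in A)
    errs += sum(max_min_value(b, groups) <= 0 for b in B)
    return errs / (len(A) + len(B))

# Toy example: one group with the single hyperplane x1 + x2 - 1 = 0
groups = [[(np.array([1.0, 1.0]), -1.0)]]
A = [np.array([0.0, 0.0])]   # below the line, f < 0
B = [np.array([2.0, 2.0])]   # above the line, f > 0
print(separation_error(A, B, groups))  # -> 0.0
```

Minimizing `separation_error`-style objectives over the weights (w, b) is the nonconvex piecewise linear problem the paper addresses.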
Alexander Rubinov - An outstanding scholar
- Authors: Bagirov, Adil
- Date: 2010
- Type: Text , Journal article
- Relation: Pacific Journal of Optimization Vol. 6, no. 2, Suppl. 1 (2010), p. 203-209
- Full Text: false
Incremental DC optimization algorithm for large-scale clusterwise linear regression
- Authors: Bagirov, Adil , Taheri, Sona , Cimen, Emre
- Date: 2021
- Type: Text , Journal article
- Relation: Journal of Computational and Applied Mathematics Vol. 389, no. (2021), p. 1-17
- Relation: https://purl.org/au-research/grants/arc/DP190100580
- Full Text: false
- Reviewed:
- Description: The objective function in the nonsmooth optimization model of the clusterwise linear regression (CLR) problem with the squared regression error is represented as a difference of two convex functions. Then, using the difference of convex algorithm (DCA) approach, the CLR problem is replaced by a sequence of smooth unconstrained optimization subproblems. A new algorithm based on the DCA and the incremental approach is designed to solve the CLR problem. We apply the quasi-Newton method to solve the subproblems. The proposed algorithm is evaluated using several synthetic and real-world data sets for regression and compared with other algorithms for CLR. Results demonstrate that the DCA-based algorithm is efficient for solving CLR problems with a large number of data points and, in particular, outperforms other algorithms when the number of input variables is small. © 2020 Elsevier B.V.
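The generic DCA iteration behind this abstract can be illustrated on a toy one-dimensional DC function; this is a sketch of the general scheme, not the paper's CLR-specific algorithm, and the decomposition below (g = x**2, h = |x|) is chosen purely for illustration.

```python
def dca(g_argmin_linearized, h_subgrad, x0, iters=50):
    """Generic DCA for f = g - h (both g and h convex):
    at each step take a subgradient y_k of h at x_k and
    minimize the convex majorant g(x) - y_k * x."""
    x = x0
    for _ in range(iters):
        y = h_subgrad(x)
        x = g_argmin_linearized(y)
    return x

# Toy DC function f(x) = x**2 - |x|:  g(x) = x**2, h(x) = |x|
h_sub = lambda x: 1.0 if x >= 0 else -1.0   # a subgradient of |x|
g_min = lambda y: y / 2.0                   # argmin_x x**2 - y*x

x_star = dca(g_min, h_sub, x0=0.3)
print(x_star)  # -> 0.5, a global minimizer of x**2 - |x|
```

In the paper, the subproblems solved at each DCA step are smooth and are handled by a quasi-Newton method rather than in closed form as here.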
Cutting angle method and a local search
- Authors: Bagirov, Adil , Rubinov, Alex
- Date: 2003
- Type: Text , Journal article
- Relation: Journal of Global Optimization Vol. 27, no. 2-3 (Nov 2003), p. 193-213
- Full Text: false
- Reviewed:
- Description: The paper deals with combinations of the cutting angle method in global optimization and a local search. We propose to use special transformed objective functions for each intermediate use of the cutting angle method. We report results of numerical experiments which demonstrate that the proposed approach is very beneficial in the search for a global minimum.
- Description: C1
- Description: 2003000438
Comparative analysis of the cutting angle and simulated annealing methods in global optimization
- Authors: Bagirov, Adil , Zhang, Jiapu
- Date: 2003
- Type: Text , Journal article
- Relation: Optimization Vol. 52, no. 4-5 (2003), p. 363-378
- Full Text:
- Reviewed:
- Description: This article presents a comparative analysis of two methods of global optimization: the simulated annealing method and a method based on a combination of the cutting angle method and a local search. This analysis is carried out using results of numerical experiments. These results demonstrate that the combined method is more effective than the simulated annealing method.
- Description: C1
- Description: 2003000436
Application of derivative free methods for production optimization
- Authors: Bagirov, Adil , Mason, T. L. , Ghosh, Moumita
- Date: 2006
- Type: Text , Journal article
- Relation: Applied and Computational Mathematics Vol. 5, no. 1 (2006), p. 94-105
- Full Text: false
- Reviewed:
- Description: Continuous gas lift is a high-value optimization proposition, where high-pressure gas is injected at various depths into the oil production well to lighten the fluid column and so improve production and recovery. Gas lift optimization models, used as surrogates for optimization planning, are usually nonconvex and even nonsmooth. Moreover, in many situations the objective and/or constraint functions in these problems are not known analytically. Most traditional methods of optimization cannot be applied to solve such problems; derivative-free methods seem to be a better choice. In this paper, we compare two different derivative-free methods, our variant of the discrete gradient method and the generalized descent method, for solving nonlinear gas lift optimization problems. We consider two different gas lift optimization problems. The objective functions in these problems are separable, nonsmooth and nonconvex. Although both algorithms produce satisfactory results, the discrete gradient method deals better with noisy data and produces better results.
- Description: C1
- Description: 2003001714
Discrete gradient method : Derivative-free method for nonsmooth optimization
- Authors: Bagirov, Adil , Karasozen, Bulent , Sezer, Monsalve
- Date: 2008
- Type: Text , Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 137, no. 2 (2008), p. 317-334
- Relation: http://purl.org/au-research/grants/arc/DP0666061
- Full Text:
- Reviewed:
- Description: A new derivative-free method is developed for solving unconstrained nonsmooth optimization problems. This method is based on the notion of a discrete gradient. It is demonstrated that the discrete gradients can be used to approximate subgradients of a broad class of nonsmooth functions. It is also shown that the discrete gradients can be applied to find descent directions of nonsmooth functions. The preliminary results of numerical experiments with unconstrained nonsmooth optimization problems as well as the comparison of the proposed method with the nonsmooth optimization solver DNLP from CONOPT-GAMS and the derivative-free optimization solver CONDOR are presented. © 2007 Springer Science+Business Media, LLC.
- Description: C1
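The idea of approximating (sub)gradients from function values alone can be conveyed with a crude coordinate-wise finite-difference sketch; the actual discrete gradient of Bagirov et al. is a more refined construction, so the helper names and the naive descent step below are illustrative assumptions only.

```python
import numpy as np

def finite_diff_gradient(f, x, tau=1e-6):
    """Coordinate-wise forward-difference approximation of a
    (sub)gradient of f at x, using only function evaluations."""
    x = np.asarray(x, dtype=float)
    g = np.empty_like(x)
    fx = f(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = tau
        g[i] = (f(x + e) - fx) / tau
    return g

def descent_step(f, x, step=0.1):
    """One step of a naive derivative-free descent method."""
    return x - step * finite_diff_gradient(f, x)

# Nonsmooth test function: |x0| + (x1 - 1)^2
f = lambda x: np.abs(x[0]) + (x[1] - 1.0) ** 2
x0 = np.array([0.5, 0.0])
x1 = descent_step(f, x0)   # moves toward the minimizer (0, 1)
```

Unlike this sketch, the discrete gradient method is designed so that its approximations converge to genuine subgradients for a broad class of nonsmooth functions.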
A method for minimization of quasidifferentiable functions
- Authors: Bagirov, Adil
- Date: 2002
- Type: Text , Journal article
- Relation: Optimization Methods and Software Vol. 17, no. 1 (2002), p. 31-60
- Full Text: false
- Reviewed:
- Description: In this paper, we propose a new method for the unconstrained minimization of a function presented as a difference of two convex functions. This method is based on continuous approximations to the Demyanov-Rubinov quasidifferential. First, a terminating algorithm for the computation of a descent direction of the objective function is described. Then we present a minimization algorithm and study its convergence. An implementable version of this algorithm is discussed. Finally, we report the results of preliminary numerical experiments.
- Description: C1
- Description: 2003000064
Codifferential method for minimizing nonsmooth DC functions
- Authors: Bagirov, Adil , Ugon, Julien
- Date: 2011
- Type: Text , Journal article
- Relation: Journal of Global Optimization Vol. 50, no. 1 (2011), p. 3-22
- Relation: http://purl.org/au-research/grants/arc/DP0666061
- Full Text: false
- Reviewed:
- Description: In this paper, a new algorithm to locally minimize nonsmooth functions represented as a difference of two convex functions (DC functions) is proposed. The algorithm is based on the concept of the codifferential. It is assumed that a DC decomposition of the objective function is known a priori. We develop an algorithm to compute descent directions using a few elements of the codifferential. The convergence of the minimization algorithm is studied, and its comparison with different versions of the bundle method using results of numerical experiments is given. © 2010 Springer Science+Business Media, LLC.
Local optimization method with global multidimensional search
- Authors: Bagirov, Adil , Rubinov, Alex , Zhang, Jiapu
- Date: 2005
- Type: Text , Journal article
- Relation: Journal of Global Optimization Vol. 32, no. 2 (2005), p. 161-179
- Full Text:
- Reviewed:
- Description: This paper presents a new method for solving global optimization problems. We use a local technique based on the notion of discrete gradients for finding a cone of descent directions, and then we use a global cutting angle algorithm for finding a global minimum within the intersection of the cone and the feasible region. We present results of numerical experiments with well-known test problems and with the so-called cluster function. These results confirm that the proposed algorithm allows one to find a global minimizer, or at least a deep local minimizer, of a function with a huge number of shallow local minima. © Springer 2005.
- Description: C1
- Description: 2003001351
An algorithm for clusterwise linear regression based on smoothing techniques
- Authors: Bagirov, Adil , Ugon, Julien , Mirzayeva, Hijran
- Date: 2014
- Type: Text , Journal article
- Relation: Optimization Letters Vol. 9, no. 2 (2014), p. 375-390
- Full Text: false
- Reviewed:
- Description: We propose an algorithm based on an incremental approach and smoothing techniques to solve clusterwise linear regression (CLR) problems. This algorithm incrementally divides the whole data set into groups which can be easily approximated by one linear regression function. A special procedure is introduced to generate an initial solution for solving global optimization problems at each iteration of the incremental algorithm. Such an approach allows one to find global or approximate global solutions to the CLR problems. The algorithm is tested using several data sets for regression analysis and compared with the multistart and incremental Späth algorithms.
Comparison of metaheuristic algorithms for pump operation optimization
- Authors: Bagirov, Adil , Ahmed, S. T. , Barton, Andrew , Mala-Jetmarova, Helena , Al Nuaimat, Alia , Sultanova, Nargiz
- Date: 2012
- Type: Text , Conference paper
- Relation: 14th Water Distribution Systems Analysis Conference 2012, WDSA 2012 Vol. 2; Adelaide, Australia; 24th-27th September 2012; p. 886-896
- Relation: http://purl.org/au-research/grants/arc/LP0990908
- Full Text: false
- Reviewed:
- Description: Pumping cost constitutes the main part of the overall operating cost of water distribution systems. There are different optimization formulations of the pumping cost minimization problem, including those applying continuous and integer programming approaches. To date, mainly various metaheuristics have been applied to solve this problem; however, a comprehensive comparison of those metaheuristics has not been done. Such a comparison is important to identify the strengths and weaknesses of different algorithms, which are reflected in their performance. In this paper, we present a methodology for comparative analysis of widely used metaheuristics for solving the pumping cost minimization problem. This methodology includes the following comparison criteria: (a) the "optimal solution" obtained; (b) efficiency; and (c) robustness. The algorithms applied are particle swarm optimization, artificial bee colony and firefly algorithms. These algorithms were applied to one test problem available in the literature. The results obtained demonstrate that the artificial bee colony algorithm is the most robust, and the firefly algorithm is the most efficient and accurate, for this test problem. Funding: ARC