A global optimization approach to classification
- Authors: Bagirov, Adil , Rubinov, Alex , Yearwood, John
- Date: 2002
- Type: Text , Journal article
- Relation: Optimization and Engineering Vol. 9, no. 7 (2002), p. 129-155
- Full Text: false
- Reviewed:
- Description: This paper presents a hybrid algorithm for finding the global extremum of a multimodal scalar function of many variables. The algorithm is suitable when the objective function is expensive to compute, the computation can be affected by noise, and/or partial derivatives cannot be calculated. The method is a genetic modification of a previous algorithm based on Price's method. All information about the behaviour of the objective function collected at previous iterates is used to choose new evaluation points. The genetic part of the algorithm is very effective at escaping local attractors and ensures convergence in probability to the global optimum. The proposed algorithm has been tested on a large set of multimodal test problems, outperforming both the modified Price's algorithm and a classical genetic approach.
- Description: C1
- Description: 2003000061
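The genetic ingredient described in the abstract above can be illustrated with a generic real-coded genetic algorithm. This is a minimal sketch, assuming tournament selection, blend crossover, and Gaussian mutation; it is not the paper's Price-based hybrid, and all function and parameter names are illustrative:

```python
import numpy as np

def genetic_minimize(f, bounds, pop_size=30, n_gen=100, mut=0.1, seed=0):
    """Minimal real-coded genetic algorithm over a box.

    A generic sketch of the 'genetic' ingredient only, not the
    hybrid Price-based method of the paper.
    """
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds[0], float), np.array(bounds[1], float)
    dim = len(lo)
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    best_x, best_f = None, np.inf
    for _ in range(n_gen):
        fit = np.array([f(x) for x in pop])
        k = int(np.argmin(fit))
        if fit[k] < best_f:                      # track best-so-far
            best_x, best_f = pop[k].copy(), float(fit[k])
        new = []
        for _ in range(pop_size):
            # tournament selection of two parents
            i, j = rng.integers(pop_size, size=2)
            a = pop[i] if fit[i] < fit[j] else pop[j]
            i, j = rng.integers(pop_size, size=2)
            b = pop[i] if fit[i] < fit[j] else pop[j]
            # blend crossover plus Gaussian mutation, clipped to the box
            w = rng.uniform(0, 1, size=dim)
            child = w * a + (1 - w) * b + rng.normal(0, mut, size=dim)
            new.append(np.clip(child, lo, hi))
        pop = np.array(new)
    return best_x, best_f
```

A hybrid in the spirit of the abstract would interleave such generations with Price-style controlled random search steps that reuse all previously evaluated points.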
Local optimization method with global multidimensional search
- Authors: Bagirov, Adil , Rubinov, Alex , Zhang, Jiapu
- Date: 2005
- Type: Text , Journal article
- Relation: Journal of Global Optimization Vol. 32, no. 2 (2005), p. 161-179
- Full Text:
- Reviewed:
- Description: This paper presents a new method for solving global optimization problems. We use a local technique based on the notion of discrete gradients to find a cone of descent directions, and then a global cutting angle algorithm to find a global minimum within the intersection of the cone and the feasible region. We present results of numerical experiments with well-known test problems and with the so-called cluster function. These results confirm that the proposed algorithm allows one to find a global minimizer, or at least a deep local minimizer, of a function with a large number of shallow local minima. © Springer 2005.
- Description: C1
- Description: 2003001351
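The two-phase scheme sketched in the abstract, a local descent phase followed by a global search phase, can be illustrated roughly as follows. This is a simplified stand-in: plain finite differences replace the discrete gradients, and a multistart loop over the box replaces the cutting angle algorithm; all names are assumptions:

```python
import numpy as np

def approx_gradient(f, x, h=1e-6):
    """Forward-difference gradient approximation (a simplified
    stand-in for the discrete gradients used in the paper)."""
    g = np.zeros_like(x)
    fx = f(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def local_descent(f, x, step=0.1, tol=1e-6, max_iter=500):
    """Fixed-step descent driven by the approximate gradient."""
    for _ in range(max_iter):
        g = approx_gradient(f, x)
        if np.linalg.norm(g) < tol:
            break
        x = x - step * g
    return x

def global_search(f, bounds, n_starts=20, seed=0):
    """Global phase (stand-in for the cutting angle algorithm):
    sample starting points in the box, run local descent from
    each, and keep the best minimizer found."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds[0], float), np.array(bounds[1], float)
    best_x, best_f = None, np.inf
    for _ in range(n_starts):
        x = local_descent(f, rng.uniform(lo, hi))
        if f(x) < best_f:
            best_x, best_f = x, f(x)
    return best_x, best_f
```

The actual method searches globally only over the intersection of the descent cone and the feasible region, which is what lets it escape shallow local minima more cheaply than blind multistart.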
Optimisation solvers and problem formulations for solving a data clustering problem
- Authors: Ugon, Julien , Rubinov, Alex
- Date: 2005
- Type: Text , Conference paper
- Relation: Paper presented at the Sixteenth Australasian Workshop on Combinatorial Algorithms, Ballarat, Victoria : 18th - 21st September, 2005
- Full Text:
- Reviewed:
- Description: A popular approach for solving complex optimization problems is relaxation: some constraints are removed in order to obtain a convex problem approximating the original one. On the other hand, direct approaches for solving such problems are becoming increasingly powerful. This paper examines two cases drawn from data analysis in order to compare the two techniques.
- Description: E1
- Description: 2003001437
Methods for global optimization of nonsmooth functions with applications
- Authors: Rubinov, Alex
- Date: 2006
- Type: Text , Journal article
- Relation: Applied and Computational Mathematics Vol. 5, no. 1 (2006), p. 3-15
- Full Text: false
- Reviewed:
- Description: In this survey paper we present some results obtained in the Centre for Informatics and Applied Optimization (CIAO) at the University of Ballarat, Australia, in the area of numerical global optimization. We describe a conceptual scheme of two methods developed in CIAO and present results of numerical experiments with some real-world problems. The paper is based on a plenary lecture given by the author at the First International Conference on Control and Optimization with Industrial Applications, Baku, Azerbaijan, 2005.
- Description: C1
- Description: 2003001547
Facility location via continuous optimization with discontinuous objective functions
- Authors: Ugon, Julien , Kouhbor, Shahnaz , Mammadov, Musa , Rubinov, Alex , Kruger, Alexander
- Date: 2007
- Type: Text , Journal article
- Relation: ANZIAM Journal Vol. 48, no. 3 (2007), p. 315-325
- Full Text:
- Reviewed:
- Description: Facility location problems are one of the most common applications of optimization methods. Continuous formulations are usually more accurate, but often result in complex problems that cannot be solved using traditional optimization methods. This paper examines the use of a global optimization method, AGOP, for solving location problems where the objective function is discontinuous. This approach is motivated by a real-world application in wireless network design. © Australian Mathematical Society 2007.
- Description: 2003004859
A multidimensional descent method for global optimization
- Authors: Bagirov, Adil , Rubinov, Alex , Zhang, Jiapu
- Date: 2009
- Type: Text , Journal article
- Relation: Optimization Vol. 58, no. 5 (2009), p. 611-625
- Full Text: false
- Reviewed:
- Description: This article presents a new multidimensional descent method for solving global optimization problems with box constraints. This is a hybrid method in which a local search method is used for local descent, and a global search is used for further multidimensional search over subsets of the intersection of the cones generated by the local search method and the feasible region. The discrete gradient method is used for local search and the cutting angle method is used for global search. Two- and three-dimensional cones are used for the global search. Such an approach allows one, as a rule, to escape local minimizers that are not global. The proposed method is a local optimization method with strong global search properties. We present results of numerical experiments using both smooth and non-smooth global optimization test problems. These results demonstrate that the proposed algorithm allows one to find a global or near-global minimizer.
Global optimality conditions for some classes of optimization problems
- Authors: Wu, Zhiyou , Rubinov, Alex
- Date: 2009
- Type: Text , Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 145, no. 1 (2009), p. 164-185
- Full Text: false
- Reviewed:
- Description: We establish new necessary and sufficient optimality conditions for global optimization problems. In particular, we establish tractable optimality conditions for the problems of minimizing a weakly convex or concave function subject to standard constraints, such as box constraints, binary constraints, and simplex constraints. We also derive some new necessary and sufficient optimality conditions for quadratic optimization. Our main theoretical tool for establishing these optimality conditions is abstract convexity. © 2009 Springer Science+Business Media, LLC.
Optimality conditions in global optimization and their applications
- Authors: Rubinov, Alex , Wu, Zhiyou
- Date: 2009
- Type: Text , Journal article
- Relation: Mathematical Programming Vol. 120, no. 1 SPEC. ISS. (2009), p. 101-123
- Full Text: false
- Reviewed:
- Description: In this paper we derive necessary and sufficient conditions for some problems of global minimization. Our approach is based on methods of abstract convexity: we use a representation of an upper semicontinuous function as the lower envelope of a family of convex functions. We discuss applications of the conditions obtained to the examination of some tractable sufficient conditions for the global minimum and to the theory of inequalities. © 2007 Springer-Verlag.
The modified subgradient algorithm based on feasible values
- Authors: Kasimbeyli, Refail , Ustun, Ozden , Rubinov, Alex
- Date: 2009
- Type: Text , Journal article
- Relation: Optimization Vol. 58, no. 5 (2009), p. 535-560
- Full Text: false
- Reviewed:
- Description: In this article, we continue to study the modified subgradient (MSG) algorithm previously suggested by Gasimov for solving sharp augmented Lagrangian dual problems. The most important features of this algorithm are that it guarantees a global optimum for a wide class of non-convex optimization problems, generates a strictly increasing sequence of dual values (a property not shared by other subgradient methods), and guarantees convergence. The main drawbacks of the MSG algorithm, typical of many subgradient algorithms, are that it uses an unconstrained global minimum of the augmented Lagrangian function and requires an approximate upper bound for the initial problem to update the stepsize parameters. In this study we introduce a new algorithm based on so-called feasible values and give convergence theorems. The new algorithm does not require the optimal value to be known in advance; it seeks this value iteratively, beginning with an arbitrary number. It is also not necessary to find a global minimum of the augmented Lagrangian to update the stepsize parameters in the new algorithm. A collection of test problems is used to demonstrate the performance of the new algorithm. © 2009 Taylor & Francis.
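The role an estimate of the optimal value plays in the stepsize update can be illustrated with a textbook subgradient method using a Polyak-type step. This is a generic sketch, not the MSG algorithm or its feasible-value variant; the function and parameter names are illustrative:

```python
import numpy as np

def subgradient_polyak(f, subgrad, x0, target, max_iter=2000):
    """Subgradient method with a Polyak-type step size driven by
    `target`, an estimate of the optimal objective value.

    A generic textbook sketch only: it shows how a value estimate
    enters the stepsize, loosely echoing the role of the 'feasible
    values' in the paper, but it is not the MSG algorithm.
    """
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for _ in range(max_iter):
        g = np.asarray(subgrad(x), dtype=float)
        gn = float(np.dot(g, g))
        if gn == 0.0:          # zero subgradient: x is optimal
            break
        step = (f(x) - target) / gn   # Polyak step toward the target
        x = x - step * g
        if f(x) < best_f:             # keep the best point seen
            best_x, best_f = x.copy(), f(x)
    return best_x, best_f
```

When `target` underestimates the true optimum the steps stay positive and the best value found decreases toward it; an adaptive scheme, as in the paper, updates the estimate from values attained at feasible points instead of requiring it up front.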