Integrated production system optimization using global optimization techniques
- Authors: Mason, T. L. , Emelle, C. , Van Berkel, J. , Bagirov, Adil , Kampas, F. , Pinter, J. D.
- Date: 2007
- Type: Text , Journal article
- Relation: Journal of Industrial and Management Optimization Vol. 3, no. 2 (May 2007), p. 257-277
- Full Text:
- Reviewed:
- Description: Many optimization problems related to integrated oil and gas production systems are nonconvex and multimodal. Additionally, apart from the innate nonsmoothness of many such problems, nonsmooth functions such as minimum and maximum functions may be used to model flow/pressure controllers and cascade mass in the gas gathering and blending networks. In this paper we study the application of different versions of the derivative-free Discrete Gradient Method (DGM) as well as the Lipschitz Global Optimizer (LGO) suite to production optimization in integrated oil and gas production systems, and compare them with various local and global solvers used with the General Algebraic Modeling System (GAMS). Four nonconvex and nonsmooth test cases were constructed from a small but realistic integrated gas production system optimization problem. The derivation of the system of equations for the various test cases is also presented. Results demonstrate that DGM is especially effective for solving nonsmooth optimization problems and that its two versions are capable global optimization algorithms. We also demonstrate that LGO successfully solves the presented test problems (as well as other related real-world problems).
- Description: C1
- Description: 2003004725
Visual tools for analysing evolution, emergence, and error in data streams
- Authors: Hart, Sol , Yearwood, John , Bagirov, Adil
- Date: 2007
- Type: Text , Conference paper
- Relation: Paper presented at 6th IEEE/ACIS International Conference on Computer and Information Science, ICIS 2007, Melbourne, Victoria : 11th-13th July 2007 p. 987-992
- Full Text:
- Description: The relatively new field of stream mining has necessitated the development of robust drift-aware algorithms that provide accurate, real-time data handling capabilities. Tools are needed to assess and diagnose important trends and investigate drift evolution parameters. In this paper, we present two novel visualisation techniques, Pixie and Luna graphs, which incorporate salient group statistics coupled with intuitive visual representations of multidimensional groupings over time. Through the novel representations presented here, spatial interactions between temporal divisions can be diagnosed and overall distribution patterns identified. These graphs provide a means of evaluating, in an unconstrained capacity, commonly constrained evolutionary problems.
- Description: 2003005432
A hybrid neural learning algorithm using evolutionary learning and derivative free local search method
- Authors: Ghosh, Ranadhir , Yearwood, John , Ghosh, Moumita , Bagirov, Adil
- Date: 2006
- Type: Text , Journal article
- Relation: International Journal of Neural Systems Vol. 16, no. 3 (2006), p. 201-213
- Full Text: false
- Reviewed:
- Description: In this paper we investigate a hybrid model based on the Discrete Gradient method and an evolutionary strategy for determining the weights in a feed-forward artificial neural network, and discuss different variants of such hybrid models. The Discrete Gradient method has the advantage of being able to jump over many local minima and find very deep local minima. However, earlier research has shown that a good starting point for the discrete gradient method can improve the quality of the solution point. Evolutionary algorithms are best suited for global optimisation problems; nevertheless, they suffer from longer training times and are often unsuitable for real-world applications. For optimisation problems such as weight optimisation for ANNs in real-world applications the dimensions are large and time complexity is critical. Hence a hybrid model can be a suitable option. In this paper we propose different fusion strategies for hybrid models combining the evolutionary strategy with the discrete gradient method to obtain an optimal solution much more quickly. Three different fusion strategies are discussed: a linear hybrid model, an iterative hybrid model and a restricted local search hybrid model. Comparative results on a range of standard datasets are provided for the different fusion hybrid models. © World Scientific Publishing Company.
- Description: C1
- Description: 2003001712
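The linear hybrid scheme described in the abstract above (an evolutionary stage supplying a starting point to a derivative-free local search) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: a (1+1) evolution strategy stands in for the evolutionary stage, a coordinate-wise derivative-free descent stands in for the Discrete Gradient method, and a simple smooth test objective stands in for the ANN error surface.

```python
import random

def sphere(x):
    # Simple smooth stand-in objective; the paper targets nonsmooth ANN error surfaces.
    return sum(xi * xi for xi in x)

def evolutionary_seed(f, dim, iters=200, sigma=0.5, seed=0):
    # (1+1) evolution strategy: a minimal stand-in for the evolutionary stage,
    # whose job here is only to supply a good starting point.
    rng = random.Random(seed)
    x = [rng.uniform(-5.0, 5.0) for _ in range(dim)]
    fx = f(x)
    for _ in range(iters):
        y = [xi + rng.gauss(0.0, sigma) for xi in x]
        fy = f(y)
        if fy < fx:
            x, fx = y, fy
    return x

def local_search(f, x, step=0.1, tol=1e-8):
    # Coordinate-wise derivative-free descent with a shrinking step,
    # standing in for the Discrete Gradient method.
    x = list(x)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                y = x[:]
                y[i] += d
                if f(y) < f(x):
                    x = y
                    improved = True
        if not improved:
            step *= 0.5
    return x

# Linear hybrid: evolutionary stage first, then local refinement from its result.
x0 = evolutionary_seed(sphere, dim=3)
x_star = local_search(sphere, x0)
```

The iterative and restricted-local-search variants differ only in how control alternates between the two stages; the linear variant above is the simplest hand-off.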
A new nonsmooth optimization algorithm for minimum sum-of-squares clustering problems
- Authors: Bagirov, Adil , Yearwood, John
- Date: 2006
- Type: Text , Journal article
- Relation: European Journal of Operational Research Vol. 170, no. 2 (2006), p. 578-596
- Full Text: false
- Reviewed:
- Description: The minimum sum-of-squares clustering problem is formulated as a problem of nonsmooth, nonconvex optimization, and an algorithm for solving the former problem based on nonsmooth optimization techniques is developed. The issue of applying this algorithm to large data sets is discussed. Results of numerical experiments are presented which demonstrate the effectiveness of the proposed algorithm. © 2004 Elsevier B.V. All rights reserved.
- Description: C1
- Description: 2003001520
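The nonsmooth formulation referred to in the abstract can be written, in standard notation (our transcription, not quoted from the paper), as follows: given data points $a^1,\dots,a^m$ and $k$ cluster centres $x^1,\dots,x^k$, minimize

```latex
f(x^1,\dots,x^k) \;=\; \frac{1}{m}\sum_{j=1}^{m}\,\min_{i=1,\dots,k}\,\bigl\|x^i - a^j\bigr\|^2 .
```

The inner minimum over centres makes $f$ nonsmooth, and the sum of such minima makes it nonconvex, which is why nonsmooth optimization techniques are needed.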
Application of derivative free methods for production optimization
- Authors: Bagirov, Adil , Mason, T. L. , Ghosh, Moumita
- Date: 2006
- Type: Text , Journal article
- Relation: Applied and Computational Mathematics Vol. 5, no. 1 (2006), p. 94-105
- Full Text: false
- Reviewed:
- Description: Continuous gas lift is a high-value optimization proposition, where high-pressure gas is injected at various depths into an oil production well to lighten the fluid column and so improve production and recovery. Gas lift optimization models, used as surrogates for production planning, are usually nonconvex and even nonsmooth. Moreover, in many situations the objective and/or constraint functions in these problems are not known analytically. Most traditional optimization methods cannot be applied to solve such problems, and derivative-free methods seem to be a better choice. In this paper, we compare two different derivative-free methods, our variant of the discrete gradient method and the generalized descent method, for solving nonlinear gas lift optimization problems. We consider two different gas lift optimization problems. The objective functions in these problems are separable, nonsmooth and nonconvex. Although both algorithms produce satisfactory results, the discrete gradient method deals better with noisy data and produces better results.
- Description: C1
- Description: 2003001714
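The separable structure mentioned in the abstract corresponds to the standard gas lift allocation model (a generic form with hypothetical notation, not taken from the paper): allocate a limited amount of lift gas $G$ among $n$ wells,

```latex
\max_{g_1,\dots,g_n}\; \sum_{i=1}^{n} q_i(g_i)
\quad\text{s.t.}\quad \sum_{i=1}^{n} g_i \le G,\qquad g_i \ge 0,
```

where $q_i(g_i)$ is the production response of well $i$ to injected gas $g_i$. Each $q_i$ may be nonsmooth, nonconvex, and known only through simulation, which is what motivates derivative-free methods.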
Derivative free stochastic discrete gradient method with adaptive mutation
- Authors: Ghosh, Ranadhir , Ghosh, Moumita , Bagirov, Adil
- Date: 2006
- Type: Text , Journal article
- Relation: Advances in Data Mining Vol. 4065 (2006), p. 264-278
- Full Text: false
- Reviewed:
- Description: In data mining we come across many problems, such as function optimization or parameter estimation for classifiers, for which a good search algorithm is necessary. In this paper we propose a stochastic derivative-free algorithm for the unconstrained optimization problem. Many derivative-based local search methods exist, but they usually get stuck in local solutions for nonconvex optimization problems. On the other hand, global search methods are very time consuming and work for only a limited number of variables. In this paper we investigate a derivative-free multi-search gradient-based method which overcomes the problem of local minima and produces a global solution in less time. We have tested the proposed method on many benchmark datasets from the literature and compared the results with other existing algorithms. The results are very promising.
- Description: C1
- Description: 2003001541
Modified global k-means algorithm for clustering in gene expression data sets
- Authors: Bagirov, Adil , Mardaneh, Karim
- Date: 2006
- Type: Text , Conference paper
- Relation: Paper presented at Intelligent Systems for Bioinformatics 2006, proceedings of the AI 2006 Workshop on Intelligent Systems of Bioinformatics, Hobart, Tasmania : 4th December, 2006
- Full Text:
- Reviewed:
- Description: Clustering in gene expression data sets is a challenging problem. Different algorithms for clustering of genes have been proposed; however, due to the large number of genes, only a few algorithms can be applied to the clustering of samples. The k-means algorithm and its different variations are among those algorithms, but in general they converge only to local minima, and these local minima differ significantly from global solutions as the number of clusters increases. Over the last several years different approaches have been proposed to improve the global search properties of the k-means algorithm and its performance on large data sets. One of them is the global k-means algorithm. In this paper we develop a new version of the global k-means algorithm: the modified global k-means algorithm, which is effective for solving clustering problems in gene expression data sets. We present preliminary computational results using gene expression data sets which demonstrate that the modified global k-means algorithm improves, sometimes significantly, on the results of the k-means and global k-means algorithms.
- Description: E1
- Description: 2003001713
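The incremental seeding idea behind the global k-means family can be sketched in one dimension as follows. This is the basic global k-means scheme (every data point tried as the starting position of the next centre); the modified version of the paper replaces that candidate search with the minimization of an auxiliary cluster function, which is not reproduced here.

```python
def kmeans(points, centers, iters=100):
    # Lloyd's refinement from given starting centres (1-D data for brevity).
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for p in points:
            nearest = min(range(len(centers)), key=lambda i: (p - centers[i]) ** 2)
            clusters[nearest].append(p)
        new = [sum(c) / len(c) if c else centers[i] for i, c in enumerate(clusters)]
        if new == centers:
            break
        centers = new
    return centers

def sse(points, centers):
    # Sum-of-squares clustering error: each point charged to its nearest centre.
    return sum(min((p - c) ** 2 for c in centers) for p in points)

def global_kmeans(points, k):
    # Incremental seeding: grow from 1 to k clusters, trying every data point
    # as the starting position of the newly added centre and keeping the best.
    centers = [sum(points) / len(points)]  # the mean is the optimal single centre
    for _ in range(2, k + 1):
        best = None
        for p in points:
            candidate = kmeans(points, centers + [p])
            if best is None or sse(points, candidate) < sse(points, best):
                best = candidate
        centers = best
    return centers
```

On two well-separated 1-D groups such as `[0, 1, 2]` and `[10, 11, 12]` with `k=2`, the scheme recovers the group means 1 and 11 regardless of which candidate seeds the second centre.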
Non-smooth optimization methods for computation of the conditional value-at-risk and portfolio optimization
- Authors: Beliakov, Gleb , Bagirov, Adil
- Date: 2006
- Type: Text , Journal article
- Relation: Optimization Vol. 55, no. 5-6 (2006), p. 459-479
- Full Text:
- Reviewed:
- Description: We examine numerical performance of various methods of calculation of the Conditional Value-at-risk (CVaR), and portfolio optimization with respect to this risk measure. We concentrate on the method proposed by Rockafellar and Uryasev in (Rockafellar, R.T. and Uryasev, S., 2000, Optimization of conditional value-at-risk. Journal of Risk, 2, 21-41), which converts this problem to that of convex optimization. We compare the use of linear programming techniques against a non-smooth optimization method of the discrete gradient, and establish the supremacy of the latter. We show that non-smooth optimization can be used efficiently for large portfolio optimization, and also examine parallel execution of this method on computer clusters. © 2006 Taylor & Francis.
- Description: C1
- Description: 2003002156
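The Rockafellar-Uryasev construction cited in the abstract replaces CVaR by an auxiliary function of a threshold variable t that is convex (piecewise linear on a finite sample); minimizing it over t yields CVaR directly. A minimal sample-based sketch, our illustration rather than the paper's solver (the paper minimizes such functions by linear programming and by the discrete gradient method):

```python
def cvar(losses, alpha):
    # CVaR_alpha of an equally weighted loss sample via the Rockafellar-Uryasev
    # auxiliary function
    #     F(t) = t + 1/((1 - alpha) * n) * sum_j max(loss_j - t, 0),
    # whose minimum over t equals CVaR. For a finite sample F is piecewise
    # linear with breakpoints at the sample values, so the minimum is
    # attained at one of the sample points.
    n = len(losses)
    def F(t):
        return t + sum(max(l - t, 0.0) for l in losses) / ((1.0 - alpha) * n)
    return min(F(t) for t in losses)
```

For ten equally likely losses 1..10, `cvar(losses, 0.8)` gives 9.5, the average of the two worst outcomes, as expected.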
Piecewise partially separable functions and a derivative-free algorithm for large scale nonsmooth optimization
- Authors: Bagirov, Adil , Ugon, Julien
- Date: 2006
- Type: Text , Journal article
- Relation: Journal of Global Optimization Vol. 35, no. 2 (Jun 2006), p. 163-195
- Full Text:
- Reviewed:
- Description: This paper introduces the notion of piecewise partially separable functions and studies their properties. We also consider some of the many applications of these functions. Finally, we consider the problem of minimizing piecewise partially separable functions and develop an algorithm for its solution. This algorithm exploits the structure of such functions. We present the results of preliminary numerical experiments.
- Description: 2003001532
An algorithm for minimizing clustering functions
- Authors: Bagirov, Adil , Ugon, Julien
- Date: 2005
- Type: Text , Journal article
- Relation: Optimization Vol. 54, no. 4-5 (Aug-Oct 2005), p. 351-368
- Full Text:
- Reviewed:
- Description: The problem of cluster analysis is formulated as a problem of nonsmooth, nonconvex optimization. An algorithm for solving the latter optimization problem is developed which allows one to significantly reduce the computational efforts. This algorithm is based on the so-called discrete gradient method. Results of numerical experiments are presented which demonstrate the effectiveness of the proposed algorithm.
- Description: C1
- Description: 2003001266
Comparative analysis of genetic algorithm vs. evolutionary algorithm for hybrid models with discrete gradient method for artificial neural network
- Authors: Ghosh, Ranadhir , Ghosh, Moumita , Yearwood, John , Bagirov, Adil
- Date: 2005
- Type: Text , Conference paper
- Relation: Paper presented at the 11th International Fuzzy Systems Association World Congress, IFSA 2005, Volume III, Beijing, China : 28th - 31st July, 2005
- Full Text: false
- Reviewed:
- Description: E1
- Description: 2003001359
Comparative analysis of genetic algorithm, simulated annealing and cutting angle method for artificial neural networks
- Authors: Ghosh, Ranadhir , Ghosh, Moumita , Yearwood, John , Bagirov, Adil
- Date: 2005
- Type: Text , Journal article
- Relation: Machine Learning and Data Mining in Pattern Recognition, Proceedings Vol. 3587 (2005), p. 62-70
- Full Text: false
- Reviewed:
- Description: Learning is the essence of an artificial neural network (ANN), and many problems are associated with the multiple local minima of neural network error surfaces. Global optimization methods are capable of finding the globally optimal solution. In this paper we investigate and present a comparative study of the effects of probabilistic and deterministic global search methods for artificial neural networks using a fully connected feed-forward multi-layered perceptron architecture. We investigate two probabilistic global search methods, namely the genetic algorithm and the simulated annealing method, and the deterministic cutting angle method, for finding weights in a neural network. Experiments were carried out on UCI benchmark datasets.
- Description: C1
- Description: 2003003398
Data mining with combined use of optimization techniques and self-organizing maps for improving risk grouping rules : Application to prostate cancer patients
- Authors: Churilov, Leonid , Bagirov, Adil , Schwartz, Daniel , Smith, Kate , Dally, Michael
- Date: 2005
- Type: Text , Journal article
- Relation: Journal of Management Information Systems Vol. 21, no. 4 (2005), p. 85-100
- Full Text:
- Reviewed:
- Description: Data mining techniques provide a popular and powerful tool set to generate various data-driven classification systems. In this paper, we investigate the combined use of self-organizing maps (SOM) and nonsmooth nonconvex optimization techniques in order to produce a working case of a data-driven risk classification system. The optimization approach strengthens the validity of SOM results, and the improved classification system increases both the quality of prediction and the homogeneity within the risk groups. Accurate classification of prostate cancer patients into risk groups is important to assist in the identification of appropriate treatment paths. We start with the existing rules and aim to improve classification accuracy by identifying inconsistencies, utilizing self-organizing maps as a data visualization tool. Then, we progress to the study of assigning prostate cancer patients into homogeneous groups with the aim of supporting future clinical treatment decisions. Using the case of prostate cancer patient grouping, we demonstrate the strong potential of data-driven risk classification schemes for addressing risk grouping issues in more general organizational settings. © 2005 M.E. Sharpe, Inc.
- Description: C1
- Description: 2003001265
Determining regularization parameters for derivative free neural learning
- Authors: Ghosh, Ranadhir , Ghosh, Moumita , Yearwood, John , Bagirov, Adil
- Date: 2005
- Type: Text , Conference paper
- Relation: Paper presented at 4th International Conference, MLDM 2005: Machine Learning and Data Mining in Pattern Recognition, Leipzig, Germany : 9th-11th July 2005 p. 71-79
- Full Text: false
- Description: Derivative-free optimization methods have recently gained a lot of attention for neural learning. The curse of dimensionality for the neural learning problem makes local optimization methods very attractive; however, the error surface contains many local minima. The discrete gradient method is a special case of derivative-free methods based on bundle methods and has the ability to jump over many local minima. Two types of problems are associated with using local optimization methods for neural learning. The first is the initial-sensitivity problem, which is commonly solved by using a hybrid model. Our earlier research has shown that combining the discrete gradient method with global methods such as evolutionary algorithms makes it even more attractive; such hybrid models have been studied by other researchers as well. Another, less mentioned problem is that of large weight values for the synaptic connections of the network. Large synaptic weight values often lead to paralysis and convergence problems, especially when a hybrid model is used for fine tuning the learning task. In this paper we study and analyse the effect of different regularization parameters in our objective function to restrict the weight values without compromising the classification accuracy.
- Description: 2003001362
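The regularized objective studied here has, in its standard weight-decay form (generic notation, not quoted from the paper), the shape

```latex
E_\lambda(w) \;=\; E(w) \;+\; \lambda\,\|w\|^2 ,
```

where $E(w)$ is the network's training error as a function of the weight vector $w$, and the regularization parameter $\lambda > 0$ penalizes large synaptic weights. The paper's experiments concern choosing $\lambda$ so that the weights stay small without degrading classification accuracy.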
Fusion strategies for neural learning algorithms using evolutionary and discrete gradient approaches
- Authors: Ghosh, Ranadhir , Yearwood, John , Ghosh, Moumita , Bagirov, Adil
- Date: 2005
- Type: Text , Conference paper
- Relation: Paper presented at AIA 2005: International Conference on Artificial Intelligence and Applications, Innsbruck, Austria : 14th - 16th February, 2005
- Full Text: false
- Reviewed:
- Description: In this paper we investigate different variants of hybrid models using the Discrete Gradient method and an evolutionary strategy for determining the weights in a feed-forward artificial neural network. The Discrete Gradient method has the advantage of being able to jump over many local minima and find very deep local minima. However, earlier research has shown that a good starting point for the discrete gradient method can improve the quality of the solution point. Evolutionary algorithms are best suited for global optimisation problems; nevertheless, they suffer from longer training times and are often unsuitable for real-world applications. For optimisation problems such as weight optimisation for ANNs in real-world applications the dimensions are large and time complexity is critical. Hence a hybrid model can be a suitable option. In this paper we propose different fusion strategies for hybrid models combining the evolutionary strategy with the discrete gradient method to obtain an optimal solution much more quickly. Three different fusion strategies are discussed: a linear hybrid model, an iterative hybrid model and a restricted local search hybrid model. Comparative results on a range of standard datasets are provided for the different fusion hybrid models.
- Description: E1
- Description: 2003001365
Global optimization in the summarization of text documents
- Authors: Alyguliev, R. M. , Bagirov, Adil
- Date: 2005
- Type: Text , Journal article
- Relation: Automatic Control and Computer Sciences Vol. 39, no. 6 (2005), p. 42-47
- Full Text: false
- Reviewed:
- Description: In order to ensure minimal redundancy in the summary of a document and the greatest possible coverage of its content, a method for the construction of summaries (summarization) based on the clustering of sentences is proposed in this article. Clustering of sentences reduces to a determination of cluster centroids, the mathematical realization of which relies on a problem of global optimization. Determining the number of clusters is one of the complex problems in the clustering procedure; therefore, an algorithm for the stepwise determination of the number of clusters is also proposed in the present study. © 2006 by Allerton Press, Inc.
- Description: C1
Hybridization of neural learning algorithms using evolutionary and discrete gradient approaches
- Authors: Ghosh, Ranadhir , Yearwood, John , Ghosh, Moumita , Bagirov, Adil
- Date: 2005
- Type: Text , Journal article
- Relation: Journal of Computer Science Vol. 1, no. 3 (2005), p. 387-394
- Full Text: false
- Reviewed:
- Description: In this study we investigate a hybrid model based on the Discrete Gradient method and an evolutionary strategy for determining the weights in a feed-forward artificial neural network, and discuss different variants of such hybrid models. The Discrete Gradient method has the advantage of being able to jump over many local minima and find very deep local minima. However, earlier research has shown that a good starting point for the discrete gradient method can improve the quality of the solution point. Evolutionary algorithms are best suited for global optimisation problems; nevertheless, they suffer from longer training times and are often unsuitable for real-world applications. For optimisation problems such as weight optimisation for ANNs in real-world applications the dimensions are large and time complexity is critical. Hence a hybrid model can be a suitable option. In this study we propose different fusion strategies for hybrid models combining the evolutionary strategy with the discrete gradient method to obtain an optimal solution much more quickly. Three different fusion strategies are discussed: a linear hybrid model, an iterative hybrid model and a restricted local search hybrid model. Comparative results on a range of standard datasets are provided for the different fusion hybrid models.
- Description: C1
- Description: 2003001357
Local optimization method with global multidimensional search
- Authors: Bagirov, Adil , Rubinov, Alex , Zhang, Jiapu
- Date: 2005
- Type: Text , Journal article
- Relation: Journal of Global Optimization Vol. 32, no. 2 (2005), p. 161-179
- Full Text:
- Reviewed:
- Description: This paper presents a new method for solving global optimization problems. We use a local technique based on the notion of discrete gradients for finding a cone of descent directions, and then we use the global cutting angle algorithm for finding a global minimum within the intersection of the cone and the feasible region. We present results of numerical experiments with well-known test problems and with the so-called cluster function. These results confirm that the proposed algorithm allows one to find a global minimizer, or at least a deep local minimizer, of a function with a huge number of shallow local minima. © Springer 2005.
- Description: C1
- Description: 2003001351
Max-min separability
- Authors: Bagirov, Adil
- Date: 2005
- Type: Text , Journal article
- Relation: Optimization Methods and Software Vol. 20, no. 2-3 (2005), p. 271-290
- Full Text:
- Reviewed:
- Description: We consider the problem of discriminating two finite point sets in the n-dimensional space by a finite number of hyperplanes generating a piecewise linear function. If the intersection of these sets is empty, then they can be strictly separated by a max-min of linear functions. An error function is introduced. This function is nonconvex piecewise linear. We discuss an algorithm for its minimization. The results of numerical experiments using some real-world datasets are presented, which show the effectiveness of the proposed approach.
- Description: C1
- Description: 2003001350
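The separating functions of the abstract are piecewise linear max-min functions. In generic notation (our transcription, not the paper's exact symbols), a max-min of linear functions has the form

```latex
\varphi(z) \;=\; \max_{i \in I}\; \min_{j \in J_i} \bigl( \langle w^{ij}, z \rangle - b_{ij} \bigr),
```

and two finite sets $A, B \subset \mathbb{R}^n$ are max-min separable when, say, $\varphi < 0$ on $A$ and $\varphi > 0$ on $B$ (the sign convention here is ours). The nonconvex piecewise linear error function mentioned in the abstract measures how far the points violate these separation inequalities.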
Supervised data classification via max-min separability
- Authors: Ugon, Julien , Bagirov, Adil
- Date: 2005
- Type: Text , Book chapter
- Relation: Continuous Optimization: Current Trends and Modern Applications Chapter p. 175-208
- Full Text:
- Reviewed:
- Description: B1
- Description: 2003001268