Implementation of novel methods of global and nonsmooth optimization : GANSO programming library
- Authors: Beliakov, Gleb , Ugon, Julien
- Date: 2007
- Type: Text , Journal article
- Relation: Optimization Vol. 56, no. 5-6 (2007), p. 543-546
- Full Text:
- Reviewed:
- Description: We discuss the implementation of a number of modern methods of global and nonsmooth continuous optimization, based on the ideas of Rubinov, in a programming library GANSO. GANSO implements the derivative-free bundle method, the extended cutting angle method, dynamical system-based optimization and their various combinations and heuristics. We outline the main ideas behind each method, and report on the interfacing with Matlab and Maple packages.
A generalized subgradient method with piecewise linear subproblem
- Authors: Bagirov, Adil , Ganjehlou, Asef Nazari , Tor, Hakan , Ugon, Julien
- Date: 2010
- Type: Text , Journal article
- Relation: Dynamics of Continuous, Discrete and Impulsive Systems Series B: Applications and Algorithms Vol. 17, no. 5 (2010), p. 621-638
- Full Text: false
- Reviewed:
- Description: In this paper, a new version of the quasisecant method for nonsmooth nonconvex optimization is developed. Quasisecants are overestimates of the objective function in some neighborhood of a given point. Subgradients are used to obtain quasisecants. We describe classes of nonsmooth functions for which quasisecants can be computed explicitly. We show that a descent direction with sufficient decrease must satisfy a set of linear inequalities. In the proposed algorithm, this set of linear inequalities is solved by applying the subgradient algorithm to minimize a piecewise linear function. We compare results of numerical experiments between the proposed algorithm and the subgradient method. Copyright © 2010 Watam Press.
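The abstract above notes that the descent-direction subproblem is solved by applying the subgradient algorithm to a piecewise linear function. As a rough illustration of that building block only (not the quasisecant method itself), a minimal subgradient method for minimizing a maximum of affine pieces might look like the following sketch; the function name and the diminishing 1/k step rule are assumptions, not taken from the paper:

```python
import numpy as np

def minimize_piecewise_linear(A, b, x0, iters=2000):
    """Minimize f(x) = max_i (A[i] @ x + b[i]) with the classical
    subgradient method and a diminishing step size.

    A subgradient of a max of affine pieces at x is the gradient
    A[i] of any piece attaining the maximum at x.
    """
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), float(np.max(A @ x + b))
    for k in range(1, iters + 1):
        vals = A @ x + b
        g = A[np.argmax(vals)]            # subgradient: gradient of an active piece
        x = x - (1.0 / k) * g             # diminishing step 1/k
        f = float(np.max(A @ x + b))
        if f < best_f:                    # subgradient steps need not decrease f,
            best_x, best_f = x.copy(), f  # so keep the best iterate seen so far
    return best_x, best_f
```

For example, with A = [[1], [-1]] and b = [0, 0] the objective is f(x) = |x|, and the iterates oscillate around the minimizer 0 with shrinking amplitude.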
Piecewise partially separable functions and a derivative-free algorithm for large scale nonsmooth optimization
- Authors: Bagirov, Adil , Ugon, Julien
- Date: 2006
- Type: Text , Journal article
- Relation: Journal of Global Optimization Vol. 35, no. 2 (Jun 2006), p. 163-195
- Full Text:
- Reviewed:
- Description: This paper introduces the notion of piecewise partially separable functions and studies their properties. We also consider some of the many applications of these functions. Finally, we consider the problem of minimizing piecewise partially separable functions and develop an algorithm for its solution. This algorithm exploits the structure of such functions. We present the results of preliminary numerical experiments.
Fast modified global k-means algorithm for incremental cluster construction
- Authors: Bagirov, Adil , Ugon, Julien , Webb, Dean
- Date: 2011
- Type: Text , Journal article
- Relation: Pattern Recognition Vol. 44, no. 4 (2011), p. 866-876
- Relation: http://purl.org/au-research/grants/arc/DP0666061
- Full Text: false
- Reviewed:
- Description: The k-means algorithm and its variations are known to be fast clustering algorithms. However, they are sensitive to the choice of starting points and are inefficient for solving clustering problems in large datasets. Recently, incremental approaches have been developed to resolve difficulties with the choice of starting points. The global k-means and the modified global k-means algorithms are based on such an approach. They iteratively add one cluster center at a time. Numerical experiments show that these algorithms considerably improve the k-means algorithm. However, they require storing the whole affinity matrix or computing this matrix at each iteration. This makes both algorithms time-consuming and memory-demanding for clustering even moderately large datasets. In this paper, a new version of the modified global k-means algorithm is proposed. We introduce an auxiliary cluster function to generate a set of starting points lying in different parts of the dataset. We exploit information gathered in previous iterations of the incremental algorithm to eliminate the need to compute or store the whole affinity matrix and thereby to reduce computational effort and memory usage. Results of numerical experiments on six standard datasets demonstrate that the new algorithm is more efficient than the global and the modified global k-means algorithms. © 2010 Elsevier Ltd. All rights reserved.
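The incremental strategy described above — add one cluster center at a time, using an auxiliary cluster function to choose a starting point for the new center — can be sketched in a toy form as follows. This is only an illustration of the global-k-means-style idea under the usual squared-Euclidean auxiliary function, not the authors' optimized algorithm; all names are hypothetical:

```python
import numpy as np

def incremental_kmeans(X, k, lloyd_iters=20):
    """Toy incremental clustering: grow from 1 to k centers, picking as
    each new starting point the data point that most reduces the
    auxiliary cluster function sum_i min(d_i, ||y - x_i||^2), where d_i
    is the current squared distance of x_i to its nearest center, then
    refine all centers with Lloyd (k-means) iterations."""
    centers = [X.mean(axis=0)]                       # 1-clustering: the centroid
    for _ in range(1, k):
        C = np.array(centers)
        d = ((X[:, None, :] - C[None]) ** 2).sum(-1).min(axis=1)
        # auxiliary function value if candidate point y became a new center
        cand = [np.minimum(d, ((X - y) ** 2).sum(-1)).sum() for y in X]
        centers.append(X[int(np.argmin(cand))].copy())
        C = np.array(centers)
        for _ in range(lloyd_iters):                 # Lloyd refinement
            labels = ((X[:, None, :] - C[None]) ** 2).sum(-1).argmin(1)
            C = np.array([X[labels == j].mean(0) if np.any(labels == j) else C[j]
                          for j in range(len(C))])
        centers = list(C)
    return np.array(centers)
```

On two well-separated groups of points, the auxiliary function steers the second starting point into the opposite group, so Lloyd refinement recovers both group centroids.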
Codifferential method for minimizing nonsmooth DC functions
- Authors: Bagirov, Adil , Ugon, Julien
- Date: 2011
- Type: Text , Journal article
- Relation: Journal of Global Optimization Vol. 50, no. 1 (2011), p. 3-22
- Relation: http://purl.org/au-research/grants/arc/DP0666061
- Full Text: false
- Reviewed:
- Description: In this paper, a new algorithm to locally minimize nonsmooth functions represented as a difference of two convex functions (DC functions) is proposed. The algorithm is based on the concept of codifferential. It is assumed that a DC decomposition of the objective function is known a priori. We develop an algorithm to compute descent directions using a few elements from the codifferential. The convergence of the minimization algorithm is studied, and it is compared with different versions of the bundle method using results of numerical experiments. © 2010 Springer Science+Business Media, LLC.
Detecting K-complexes for sleep stage identification using nonsmooth optimization
- Authors: Moloney, David , Sukhorukova, Nadezda , Vamplew, Peter , Ugon, Julien , Li, Gang , Beliakov, Gleb , Philippe, Carole , Amiel, Hélène , Ugon, Adrien
- Date: 2012
- Type: Text , Journal article
- Relation: ANZIAM Journal Vol. 52, no. 4 (2012), p. 319-332
- Full Text:
- Reviewed:
- Description: The process of sleep stage identification is a labour-intensive task that involves the specialized interpretation of the polysomnographic signals captured from a patient's overnight sleep session. Automating this task has proven to be challenging for data mining algorithms because of noise, complexity and the extreme size of data. In this paper we apply nonsmooth optimization to extract key features that lead to better accuracy. We develop a specific procedure for identifying K-complexes, a special type of brain wave crucial for distinguishing sleep stages. The procedure contains two steps. We first extract "easily classified" K-complexes, and then apply nonsmooth optimization methods to extract features from the remaining data and refine the results from the first step. Numerical experiments show that this procedure is efficient for detecting K-complexes. It is also found that most classification methods perform significantly better on the extracted features. © 2012 Australian Mathematical Society.
An algorithm for clusterwise linear regression based on smoothing techniques
- Authors: Bagirov, Adil , Ugon, Julien , Mirzayeva, Hijran
- Date: 2014
- Type: Text , Journal article
- Relation: Optimization Letters Vol. 9, no. 2 (2014), p. 375-390
- Full Text: false
- Reviewed:
- Description: We propose an algorithm based on an incremental approach and smoothing techniques to solve clusterwise linear regression (CLR) problems. This algorithm incrementally divides the whole data set into groups which can be easily approximated by one linear regression function. A special procedure is introduced to generate an initial solution for solving global optimization problems at each iteration of the incremental algorithm. Such an approach allows one to find global or approximate global solutions to the CLR problems. The algorithm is tested using several data sets for regression analysis and compared with the multistart and incremental Späth algorithms.
A novel piecewise linear classifier based on polyhedral conic and max-min separabilities
- Authors: Bagirov, Adil , Ugon, Julien , Webb, Dean , Ozturk, Gurkan , Kasimbeyli, Refail
- Date: 2011
- Type: Text , Journal article
- Relation: TOP Vol. 21, no. 1 (2011), p. 1-22
- Full Text: false
- Reviewed:
- Description: In this paper, an algorithm for finding piecewise linear boundaries between pattern classes is developed. This algorithm consists of two main stages. In the first stage, a polyhedral conic set is used to identify data points which lie inside their classes, and in the second stage we exclude those points to compute a piecewise linear boundary using the remaining data points. Piecewise linear boundaries are computed incrementally starting with one hyperplane. Such an approach allows one to significantly reduce the computational effort in many large data sets. Results of numerical experiments are reported. These results demonstrate that the new algorithm consistently produces a good test set accuracy on most data sets compared with a number of other mainstream classifiers. © 2011 Sociedad de Estadística e Investigación Operativa.
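For readers unfamiliar with polyhedral conic separability: a polyhedral conic function has the standard form g(x) = w·(x − a) + ξ‖x − a‖₁ − γ, whose zero sublevel set is a polytope around the apex a, so points of a class with g(x) < 0 lie "inside" the class. A quick sketch of evaluating such a function (names hypothetical, not the authors' implementation):

```python
import numpy as np

def polyhedral_conic(x, a, w, xi, gamma):
    """Polyhedral conic function g(x) = w.(x - a) + xi*||x - a||_1 - gamma.

    The set {x : g(x) <= 0} is a polytope containing the apex a; in
    classification, g(x) < 0 marks a point as inside the class.
    """
    x, a, w = (np.asarray(v, dtype=float) for v in (x, a, w))
    return float(w @ (x - a) + xi * np.abs(x - a).sum() - gamma)
```

With apex a = 0, w = 0, ξ = 1 and γ = 2 this reduces to g(x) = ‖x‖₁ − 2, a diamond of radius 2 around the origin.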
Piecewise linear classifiers based on nonsmooth optimization approaches
- Authors: Bagirov, Adil , Kasimbeyli, Refail , Ozturk, Gurkan , Ugon, Julien
- Date: 2014
- Type: Text , Book chapter
- Relation: Optimization in Science and Engineering p. 1-32
- Full Text: false
- Reviewed:
- Description: Nonsmooth optimization provides efficient algorithms for solving many machine learning problems. In particular, nonsmooth optimization approaches to supervised data classification problems lead to the design of very efficient algorithms for their solution. In this chapter, we demonstrate how nonsmooth optimization algorithms can be applied to design efficient piecewise linear classifiers for supervised data classification problems. Such classifiers are developed using max–min and polyhedral conic separabilities as well as an incremental approach. We report results of numerical experiments and compare the piecewise linear classifiers with a number of other mainstream classifiers.
Nonsmooth DC programming approach to clusterwise linear regression : Optimality conditions and algorithms
- Authors: Bagirov, Adil , Ugon, Julien
- Date: 2018
- Type: Text , Journal article
- Relation: Optimization Methods and Software Vol. 33, no. 1 (2018), p. 194-219
- Relation: http://purl.org/au-research/grants/arc/DP140103213
- Full Text: false
- Reviewed:
- Description: The clusterwise linear regression problem is formulated as a nonsmooth nonconvex optimization problem using the squared regression error function. The objective function in this problem is represented as a difference of convex functions. Optimality conditions are derived, and an algorithm is designed based on such a representation. An incremental approach is proposed to generate starting solutions. The algorithm is tested on small to large data sets. © 2017 Informa UK Limited, trading as Taylor & Francis Group.
Nonsmooth optimization algorithm for solving clusterwise linear regression problems
- Authors: Bagirov, Adil , Ugon, Julien , Mirzayeva, Hijran
- Date: 2015
- Type: Text , Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 164, no. 3 (2015), p. 755-780
- Relation: http://purl.org/au-research/grants/arc/DP140103213
- Full Text: false
- Reviewed:
- Description: Clusterwise linear regression consists of finding a number of linear regression functions each approximating a subset of the data. In this paper, the clusterwise linear regression problem is formulated as a nonsmooth nonconvex optimization problem and an algorithm based on an incremental approach and on the discrete gradient method of nonsmooth optimization is designed to solve it. This algorithm incrementally divides the whole dataset into groups which can be easily approximated by one linear regression function. A special procedure is introduced to generate good starting points for solving global optimization problems at each iteration of the incremental algorithm. The algorithm is compared with the multi-start Späth and the incremental algorithms on several publicly available datasets for regression analysis.
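The multi-start Späth algorithm used as a comparison baseline above alternates between assigning each point to the best-fitting regression function and refitting each function by least squares on its own group. A minimal sketch of that baseline scheme (hypothetical names; this is not the authors' incremental discrete-gradient method):

```python
import numpy as np

def clr_spath(X, y, k, starts, iters=50):
    """Späth-style alternating scheme for clusterwise linear regression:
    assign each point to the regression function with the smallest
    squared error, then refit each function by least squares."""
    Xa = np.hstack([X, np.ones((len(X), 1))])   # append an intercept column
    B = np.array(starts, dtype=float)           # k rows of (slopes..., intercept)
    for _ in range(iters):
        labels = ((Xa @ B.T - y[:, None]) ** 2).argmin(axis=1)
        for j in range(k):
            idx = labels == j
            if idx.sum() >= Xa.shape[1]:        # enough points to refit line j
                B[j] = np.linalg.lstsq(Xa[idx], y[idx], rcond=None)[0]
    total = ((Xa @ B.T - y[:, None]) ** 2).min(axis=1).sum()
    return B, total
```

On data generated exactly by two lines, and started near them, the scheme recovers both lines with zero residual; like k-means, it only converges to a local solution in general, which is why multi-start (or the incremental strategy of the paper) is needed.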
A new modified global k-means algorithm for clustering large data sets
- Authors: Bagirov, Adil , Ugon, Julien , Webb, Dean
- Date: 2009
- Type: Text , Conference paper
- Relation: Paper presented at XIIIth International Conference : Applied Stochastic Models and Data Analysis, ASMDA 2009, Vilnius, Lithuania : 30th June - 3rd July 2009 p. 1-5
- Full Text: false
- Description: The k-means algorithm and its variations are known to be fast clustering algorithms. However, they are sensitive to the choice of starting points and inefficient for solving clustering problems in large data sets. Recently, in order to resolve difficulties with the choice of starting points, incremental approaches have been developed. The modified global k-means algorithm is based on such an approach. It iteratively adds one cluster center at a time. Numerical experiments show that this algorithm considerably improves the k-means algorithm. However, this algorithm is not suitable for clustering very large data sets. In this paper, a new version of the modified global k-means algorithm is proposed. We introduce an auxiliary cluster function to generate a set of starting points spanning different parts of the data set. We exploit information gathered in previous iterations of the incremental algorithm to reduce its complexity.