Your selections:

- Nonsmooth optimization (11)
- 0102 Applied Mathematics (6)
- 0103 Numerical and Computational Mathematics (5)
- 0802 Computation Theory and Mathematics (4)
- Cluster analysis (4)
- Nonconvex optimization (4)
- Data mining (3)
- Discrete gradient method (3)
- Supervised learning (3)
- 0801 Artificial Intelligence and Image Processing (2)
- 0906 Electrical and Electronic Engineering (2)
- Classification (2)
- Clusterwise linear regression (2)
- Clusterwise regression (2)
- DC programming (2)
- Incremental algorithm (2)
- Iterative methods (2)
- Minimization (2)
- Optimization (2)
- Piecewise linear classifier (2)
2Piecewise linear classifier


A generalized subgradient method with piecewise linear subproblem

- Bagirov, Adil; Ganjehlou, Asef Nazari; Tor, Hakan; Ugon, Julien

**Authors:** Bagirov, Adil; Ganjehlou, Asef Nazari; Tor, Hakan; Ugon, Julien **Date:** 2010 **Type:** Text, Journal article **Relation:** Dynamics of Continuous, Discrete and Impulsive Systems Series B: Applications and Algorithms Vol. 17, no. 5 (2010), p. 621-638 **Full Text:** false **Reviewed:** **Description:** In this paper, a new version of the quasisecant method for nonsmooth nonconvex optimization is developed. Quasisecants are overestimates of the objective function in some neighborhood of a given point. Subgradients are used to obtain quasisecants. We describe classes of nonsmooth functions for which quasisecants can be computed explicitly. We show that a descent direction with sufficient decrease must satisfy a set of linear inequalities. In the proposed algorithm this set of linear inequalities is solved by applying the subgradient algorithm to minimize a piecewise linear function. We compare the proposed algorithm with the subgradient method in numerical experiments. Copyright © 2010 Watam Press.
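The inner subproblem described above amounts to minimizing a piecewise linear function, i.e. a maximum of affine pieces, with a subgradient method. A minimal sketch of that idea (a plain subgradient method, not the paper's quasisecant algorithm; the example function and the diminishing step sizes are illustrative assumptions):

```python
import numpy as np

def subgradient_minimize(A, b, x0, iters=2000):
    """Minimize f(x) = max_i (A[i] @ x + b[i]) by the basic subgradient method.

    A subgradient of a max of affine pieces at x is the gradient A[i] of any
    piece attaining the maximum; steps diminish as 1/sqrt(k+1).
    """
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), np.max(A @ x + b)
    for k in range(iters):
        i = int(np.argmax(A @ x + b))      # active piece -> subgradient A[i]
        x = x - A[i] / np.sqrt(k + 1.0)
        f = np.max(A @ x + b)
        if f < best_f:                     # keep the best iterate seen so far
            best_f, best_x = f, x.copy()
    return best_x, best_f

# Example: f(x) = max(x, -x) = |x| in 1-D, minimized at 0.
A = np.array([[1.0], [-1.0]])
b = np.array([0.0, 0.0])
x, f = subgradient_minimize(A, b, x0=[5.0])
```

The best iterate is tracked explicitly because plain subgradient steps do not decrease the objective monotonically.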

A new modified global k-means algorithm for clustering large data sets

- Bagirov, Adil; Ugon, Julien; Webb, Dean

**Authors:** Bagirov, Adil; Ugon, Julien; Webb, Dean **Date:** 2009 **Type:** Text, Conference paper **Relation:** Paper presented at XIIIth International Conference: Applied Stochastic Models and Data Analysis, ASMDA 2009, Vilnius, Lithuania: 30th June - 3rd July 2009, p. 1-5 **Full Text:** false **Description:** The k-means algorithm and its variations are known to be fast clustering algorithms. However, they are sensitive to the choice of starting points and inefficient for solving clustering problems in large data sets. Recently, incremental approaches have been developed to resolve difficulties with the choice of starting points. The modified global k-means algorithm is based on such an approach. It iteratively adds one cluster center at a time. Numerical experiments show that this algorithm considerably improves the k-means algorithm. However, it is not suitable for clustering very large data sets. In this paper, a new version of the modified global k-means algorithm is proposed. We introduce an auxiliary cluster function to generate a set of starting points spanning different parts of the data set. We exploit information gathered in previous iterations of the incremental algorithm to reduce its complexity. **Description:** 2003007558

A novel piecewise linear classifier based on polyhedral conic and max-min separabilities

- Bagirov, Adil; Ugon, Julien; Webb, Dean; Ozturk, Gurkan; Kasimbeyli, Refail

**Authors:** Bagirov, Adil; Ugon, Julien; Webb, Dean; Ozturk, Gurkan; Kasimbeyli, Refail **Date:** 2011 **Type:** Text, Journal article **Relation:** TOP Vol. , no. (2011), p. 1-22 **Full Text:** false **Reviewed:** **Description:** In this paper, an algorithm for finding piecewise linear boundaries between pattern classes is developed. This algorithm consists of two main stages. In the first stage, a polyhedral conic set is used to identify data points which lie inside their classes, and in the second stage we exclude those points to compute a piecewise linear boundary using the remaining data points. Piecewise linear boundaries are computed incrementally, starting with one hyperplane. Such an approach allows one to significantly reduce the computational effort in many large data sets. Results of numerical experiments are reported. These results demonstrate that the new algorithm consistently produces a good test set accuracy on most data sets compared with a number of other mainstream classifiers. © 2011 Sociedad de Estadística e Investigación Operativa.

An algorithm for clusterwise linear regression based on smoothing techniques

- Bagirov, Adil; Ugon, Julien; Mirzayeva, Hijran

**Authors:** Bagirov, Adil; Ugon, Julien; Mirzayeva, Hijran **Date:** 2014 **Type:** Text, Journal article **Relation:** Optimization Letters Vol. 9, no. 2 (2014), p. 375-390 **Full Text:** false **Reviewed:** **Description:** We propose an algorithm based on an incremental approach and smoothing techniques to solve clusterwise linear regression (CLR) problems. This algorithm incrementally divides the whole data set into groups which can each be approximated by one linear regression function. A special procedure is introduced to generate an initial solution for solving global optimization problems at each iteration of the incremental algorithm. Such an approach allows one to find global or approximate global solutions to CLR problems. The algorithm is tested using several data sets for regression analysis and compared with the multistart and incremental Späth algorithms.

An algorithm for minimizing clustering functions

**Authors:** Bagirov, Adil; Ugon, Julien **Date:** 2005 **Type:** Text, Journal article **Relation:** Optimization Vol. 54, no. 4-5 (Aug-Oct 2005), p. 351-368 **Full Text:** **Reviewed:** **Description:** The problem of cluster analysis is formulated as a problem of nonsmooth, nonconvex optimization. An algorithm for solving the latter optimization problem is developed which allows one to significantly reduce the computational effort. This algorithm is based on the so-called discrete gradient method. Results of numerical experiments are presented which demonstrate the effectiveness of the proposed algorithm. **Description:** C1 **Description:** 2003001266


An efficient algorithm for the incremental construction of a piecewise linear classifier

- Bagirov, Adil; Ugon, Julien; Webb, Dean

**Authors:** Bagirov, Adil; Ugon, Julien; Webb, Dean **Date:** 2011 **Type:** Text, Journal article **Relation:** Information Systems Vol. 36, no. 4 (2011), p. 782-790 **Relation:** http://purl.org/au-research/grants/arc/DP0666061 **Full Text:** false **Reviewed:** **Description:** In this paper the problem of finding piecewise linear boundaries between sets is considered and applied to solving supervised data classification problems. An algorithm for the computation of piecewise linear boundaries, consisting of two main steps, is proposed. In the first step, sets are approximated by hyperboxes to find so-called "indeterminate" regions between sets. In the second step, sets are separated inside these "indeterminate" regions by piecewise linear functions. These functions are computed incrementally, starting with a linear function. Results of numerical experiments are reported. These results demonstrate that the new algorithm requires a reasonable training time and produces consistently good test set accuracy on most data sets compared with mainstream classifiers. © 2010 Elsevier B.V. All rights reserved.
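The hyperbox step described above can be illustrated in a few lines. This is a simplified reading of the abstract: each class is approximated by an axis-aligned bounding box, and only points falling inside the other class's box are flagged for the more expensive piecewise linear separation. The function names and the exact overlap test are our assumptions, not the paper's construction:

```python
import numpy as np

def class_hyperbox(points):
    """Axis-aligned bounding box (per-feature min/max) of a class sample."""
    pts = np.asarray(points, dtype=float)
    return pts.min(axis=0), pts.max(axis=0)

def indeterminate_mask(points, other_box):
    """Flag points of one class lying inside the other class's hyperbox.

    Only these points need piecewise linear separation; the remaining
    points are already separated by the boxes alone.
    """
    lo, hi = other_box
    pts = np.asarray(points, dtype=float)
    return np.all((pts >= lo) & (pts <= hi), axis=1)

# Two overlapping 2-D classes: which A-points lie in B's box?
A = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]])
B = np.array([[1.5, 1.5], [3.0, 3.0]])
mask = indeterminate_mask(A, class_hyperbox(B))
```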

An incremental approach for the construction of a piecewise linear classifier

- Bagirov, Adil; Ugon, Julien; Webb, Dean

**Authors:** Bagirov, Adil; Ugon, Julien; Webb, Dean **Date:** 2009 **Type:** Text, Conference paper **Relation:** Paper presented at XIIIth International Conference: Applied Stochastic Models and Data Analysis, ASMDA 2009, Vilnius, Lithuania: 30th June - 3rd July 2009, p. 507-511 **Relation:** http://purl.org/au-research/grants/arc/DP0666061 **Full Text:** false **Description:** In this paper the problem of finding piecewise linear boundaries between sets is considered and applied to solving supervised data classification problems. An algorithm for the computation of piecewise linear boundaries, consisting of two main steps, is proposed. In the first step, sets are approximated by hyperboxes to find so-called "indeterminate" regions between sets. In the second step, sets are separated inside these "indeterminate" regions by piecewise linear functions. These functions are computed incrementally, starting with a linear function. Results of numerical experiments are reported. These results demonstrate that the new algorithm requires a reasonable training time and produces consistently good test set accuracy on most data sets compared with mainstream classifiers. **Description:** 2003007559

Classification through incremental max-min separability

- Bagirov, Adil; Ugon, Julien; Webb, Dean; Karasozen, Bulent

**Authors:** Bagirov, Adil; Ugon, Julien; Webb, Dean; Karasozen, Bulent **Date:** 2011 **Type:** Text, Journal article **Relation:** Pattern Analysis and Applications Vol. 14, no. 2 (2011), p. 165-174 **Relation:** http://purl.org/au-research/grants/arc/DP0666061 **Full Text:** false **Reviewed:** **Description:** Piecewise linear functions can be used to approximate non-linear decision boundaries between pattern classes. Piecewise linear boundaries are known to provide efficient real-time classifiers. However, they require a long training time: finding piecewise linear boundaries between sets is a difficult optimization problem, and most approaches use heuristics to avoid solving it, which may lead to suboptimal boundaries. In this paper, we propose an algorithm for globally training hyperplanes using an incremental approach. Such an approach allows one to find a near-global minimizer of the classification error function and to compute as few hyperplanes as are needed to separate the sets. We apply this algorithm to supervised data classification problems and report the results of numerical experiments on real-world data sets. These results demonstrate that the new algorithm requires a reasonable training time and its test set accuracy is consistently good on most data sets compared with mainstream classifiers. © 2010 Springer-Verlag London Limited.
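For background, a max-min separable decision function of the kind trained incrementally above is a maximum over groups of minima of affine functions; its sign gives the predicted side of the boundary. A minimal evaluation sketch with hand-picked weights (the weights are illustrative, not trained by the paper's algorithm):

```python
import numpy as np

def maxmin_decision(x, groups):
    """Evaluate f(x) = max_j min_{i in group j} (w_i @ x + b_i).

    `groups` is a list of lists of (w, b) pairs; the sign of f gives the
    class. Max-min functions of this form can represent arbitrary piecewise
    linear decision boundaries.
    """
    x = np.asarray(x, dtype=float)
    return max(min(np.dot(w, x) + b for (w, b) in group) for group in groups)

# One group of two hyperplanes carves out a "wedge": f is positive only
# where BOTH x0 > 0 and x1 > 0 hold (the min of the two affine pieces).
groups = [[(np.array([1.0, 0.0]), 0.0), (np.array([0.0, 1.0]), 0.0)]]
inside  = maxmin_decision([1.0, 2.0], groups)   # min(1, 2) = 1 > 0
outside = maxmin_decision([1.0, -2.0], groups)  # min(1, -2) = -2 < 0
```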

Codifferential method for minimizing nonsmooth DC functions

**Authors:** Bagirov, Adil; Ugon, Julien **Date:** 2011 **Type:** Text, Journal article **Relation:** Journal of Global Optimization Vol. 50, no. 1 (2011), p. 3-22 **Relation:** http://purl.org/au-research/grants/arc/DP0666061 **Full Text:** false **Reviewed:** **Description:** In this paper, a new algorithm to locally minimize nonsmooth functions represented as a difference of two convex functions (DC functions) is proposed. The algorithm is based on the concept of the codifferential. It is assumed that a DC decomposition of the objective function is known a priori. We develop an algorithm to compute descent directions using a few elements of the codifferential. The convergence of the minimization algorithm is studied, and it is compared with different versions of bundle methods using results of numerical experiments. © 2010 Springer Science+Business Media, LLC.
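For context on DC minimization in general (this sketches the classical DCA linearization, not the paper's codifferential method): for f = g - h with both g and h convex, one linearizes h at the current iterate via a subgradient and minimizes the resulting convex function. A one-dimensional sketch with g(x) = x², where the convex subproblem has a closed-form solution:

```python
def dca_step(x, subgrad_h):
    """One DCA iteration for f(x) = x**2 - h(x) in 1-D.

    Linearizing h at x with subgradient s turns the subproblem into
    argmin_y (y**2 - s*y), which has the closed form y = s/2.
    """
    return subgrad_h(x) / 2.0

def dca(x0, subgrad_h, iters=50):
    x = x0
    for _ in range(iters):
        x = dca_step(x, subgrad_h)
    return x

# Example: f(x) = x**2 - |x|  (g = x**2, h = |x|, both convex).
# The local minimizers are x = +/- 1/2; DCA reaches one in a single step.
sign = lambda t: (t > 0) - (t < 0)   # subgradient of |x| away from 0
x_star = dca(3.0, sign)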

Fast modified global k-means algorithm for incremental cluster construction

- Bagirov, Adil; Ugon, Julien; Webb, Dean

**Authors:** Bagirov, Adil; Ugon, Julien; Webb, Dean **Date:** 2011 **Type:** Text, Journal article **Relation:** Pattern Recognition Vol. 44, no. 4 (2011), p. 866-876 **Relation:** http://purl.org/au-research/grants/arc/DP0666061 **Full Text:** false **Reviewed:** **Description:** The k-means algorithm and its variations are known to be fast clustering algorithms. However, they are sensitive to the choice of starting points and are inefficient for solving clustering problems in large datasets. Recently, incremental approaches have been developed to resolve difficulties with the choice of starting points. The global k-means and the modified global k-means algorithms are based on such an approach. They iteratively add one cluster center at a time. Numerical experiments show that these algorithms considerably improve the k-means algorithm. However, they require storing the whole affinity matrix or computing this matrix at each iteration. This makes both algorithms time consuming and memory demanding for clustering even moderately large datasets. In this paper, a new version of the modified global k-means algorithm is proposed. We introduce an auxiliary cluster function to generate a set of starting points lying in different parts of the dataset. We exploit information gathered in previous iterations of the incremental algorithm to eliminate the need to compute or store the whole affinity matrix, and thereby to reduce computational effort and memory usage. Results of numerical experiments on six standard datasets demonstrate that the new algorithm is more efficient than the global and the modified global k-means algorithms. © 2010 Elsevier Ltd. All rights reserved.
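The incremental scheme shared by the global k-means family — keep the k-1 centers found so far and add one more — can be sketched as follows. The seeding rule below (pick the data point that most reduces the current clustering error) follows the basic global k-means idea; the paper's auxiliary cluster function and its affinity-matrix savings are not reproduced here:

```python
import numpy as np

def lloyd(X, centers, iters=20):
    """Standard k-means (Lloyd) refinement of a given set of centers."""
    centers = np.array(centers, dtype=float)
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for j in range(len(centers)):
            if np.any(labels == j):          # guard against empty clusters
                centers[j] = X[labels == j].mean(axis=0)
    return centers

def incremental_kmeans(X, k):
    """Add one center at a time, then refine all centers with Lloyd."""
    centers = [X.mean(axis=0)]               # the 1-cluster solution
    for _ in range(2, k + 1):
        # current squared distance of each point to its nearest center
        d_old = ((X[:, None, :] - np.array(centers)[None, :, :]) ** 2).sum(axis=2).min(axis=1)
        # error reduction from seeding a new center at each data point
        gains = [np.maximum(0.0, d_old - ((X - x) ** 2).sum(axis=1)).sum() for x in X]
        centers.append(X[int(np.argmax(gains))])
        centers = list(lloyd(X, centers))
    return np.array(centers)

# Two well-separated blobs -> one center near each blob mean.
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
              [5.0, 5.0], [5.1, 5.0], [5.0, 5.1]])
C = incremental_kmeans(X, 2)
```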


**Authors:** Bagirov, Adil; Ugon, Julien **Date:** 2018 **Type:** Text, Journal article **Relation:** Optimization Methods and Software Vol. 33, no. 1 (2018), p. 194-219 **Relation:** http://purl.org/au-research/grants/arc/DP140103213 **Full Text:** false **Reviewed:** **Description:** The clusterwise linear regression problem is formulated as a nonsmooth nonconvex optimization problem using the squared regression error function. The objective function in this problem is represented as a difference of convex functions. Optimality conditions are derived, and an algorithm is designed based on such a representation. An incremental approach is proposed to generate starting solutions. The algorithm is tested on small to large data sets. © 2017 Informa UK Limited, trading as Taylor & Francis Group.

Nonsmooth DC programming approach to the minimum sum-of-squares clustering problems

- Bagirov, Adil; Taheri, Sona; Ugon, Julien

**Authors:** Bagirov, Adil; Taheri, Sona; Ugon, Julien **Date:** 2016 **Type:** Text, Journal article **Relation:** Pattern Recognition Vol. 53, no. (2016), p. 12-24 **Relation:** http://purl.org/au-research/grants/arc/DP140103213 **Full Text:** false **Reviewed:** **Description:** This paper introduces an algorithm for solving minimum sum-of-squares clustering problems using their difference-of-convex representations. A nonsmooth nonconvex optimization formulation of the clustering problem is used to design the algorithm. Characterizations of critical points, stationary points in the sense of generalized gradients, and inf-stationary points of the clustering problem are given. The proposed algorithm is tested and compared with other clustering algorithms using large real-world data sets. © 2015 Elsevier Ltd. All rights reserved.

Nonsmooth nonconvex optimization approach to clusterwise linear regression problems

- Bagirov, Adil; Ugon, Julien; Mirzayeva, Hijran

**Authors:** Bagirov, Adil; Ugon, Julien; Mirzayeva, Hijran **Date:** 2013 **Type:** Text, Journal article **Relation:** European Journal of Operational Research Vol. 229, no. 1 (2013), p. 132-142 **Full Text:** false **Reviewed:** **Description:** Clusterwise regression consists of finding a number of regression functions, each approximating a subset of the data. In this paper, a new approach for solving clusterwise linear regression problems is proposed, based on a nonsmooth nonconvex formulation. We present an algorithm for minimizing this nonsmooth nonconvex function. This algorithm incrementally divides the whole data set into groups which can each be approximated by one linear regression function. A special procedure is introduced to generate a good starting point for solving global optimization problems at each iteration of the incremental algorithm. Such an approach allows one to find a global or near-global solution to the problem when the data sets are sufficiently dense. The algorithm is compared with the multistart Späth algorithm on several publicly available data sets for regression analysis. © 2013 Elsevier B.V. All rights reserved. **Description:** 2003011018

Nonsmooth optimization algorithm for solving clusterwise linear regression problems

- Bagirov, Adil; Ugon, Julien; Mirzayeva, Hijran

**Authors:** Bagirov, Adil; Ugon, Julien; Mirzayeva, Hijran **Date:** 2015 **Type:** Text, Journal article **Relation:** Journal of Optimization Theory and Applications Vol. 164, no. 3 (2015), p. 755-780 **Relation:** http://purl.org/au-research/grants/arc/DP140103213 **Full Text:** false **Reviewed:** **Description:** Clusterwise linear regression consists of finding a number of linear regression functions, each approximating a subset of the data. In this paper, the clusterwise linear regression problem is formulated as a nonsmooth nonconvex optimization problem, and an algorithm based on an incremental approach and on the discrete gradient method of nonsmooth optimization is designed to solve it. This algorithm incrementally divides the whole dataset into groups which can each be approximated by one linear regression function. A special procedure is introduced to generate good starting points for solving global optimization problems at each iteration of the incremental algorithm. The algorithm is compared with the multistart Späth and incremental algorithms on several publicly available datasets for regression analysis.
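The clusterwise linear regression model itself — several regression lines, each fitting its own subset of the data — can be illustrated with a plain alternating scheme: assign each point to the line with the smallest residual, then refit each line by least squares. This is a simplification for illustration (akin to Späth's approach), not the paper's nonsmooth incremental algorithm:

```python
import numpy as np

def clr_fit(x, y, coefs, iters=30):
    """Alternating clusterwise linear regression in 1-D.

    `coefs` is a list of (slope, intercept) starting pairs; each point is
    assigned to the line with the smallest squared residual, then every
    line is refit by least squares on its own points.
    """
    coefs = [tuple(c) for c in coefs]
    for _ in range(iters):
        resid = np.array([(y - (a * x + b)) ** 2 for (a, b) in coefs])
        labels = resid.argmin(axis=0)
        for j in range(len(coefs)):
            if np.sum(labels == j) >= 2:     # need two points to fit a line
                a, b = np.polyfit(x[labels == j], y[labels == j], 1)
                coefs[j] = (a, b)
    return coefs, labels

# Data drawn from two exact lines: y = 2x (first half) and y = -x + 10.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([0.0, 2.0, 4.0, 7.0, 6.0, 5.0])
coefs, labels = clr_fit(x, y, coefs=[(1.0, 0.0), (0.0, 6.0)])
```

Like k-means, this scheme only reaches a local optimum, which is why the papers above invest in good starting points.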

Optimisation of operations of a water distribution system for reduced power usage

- Bagirov, Adil; Ugon, Julien; Barton, Andrew; Briggs, Steven

**Authors:** Bagirov, Adil; Ugon, Julien; Barton, Andrew; Briggs, Steven **Date:** 2008 **Type:** Text, Conference paper **Relation:** Paper presented at 9th National Conference on Hydraulics in Water Engineering: Hydraulics 2008, Darwin, Northern Territory: 22nd-26th September 2008 **Full Text:** false **Description:** There are many improvements that can be made to the operation of a water distribution system once it has been constructed and placed in the ground. Pipes and associated storages and pumps are typically designed to meet average peak daily demands, offer some capacity for growth, and allow for some deterioration of performance over time. However, the 'as constructed' performance of the pipeline is invariably different from what was designed on paper, particularly for anything other than design flows, such as during times of water restrictions when flows are significantly reduced. Because of this, there remain significant benefits to owners and operators in the adaptive and global optimisation of such systems. The present paper uses the Ouyen subsystem of the Northern Mallee Pipeline, in Victoria, as a case study for the development of an optimisation model, with the intent of using this model to reduce costs and provide better service to customers on this system. The Ouyen subsystem consists of 1600 km of trunk and distribution pipeline servicing an area of 456,000 ha. The system includes 2 fixed-speed pumps diverting water from the Murray River at Liparoo into two 150 ML balancing storages at Ouyen, 4 variable-speed pumps feeding water from the balancing storages into the pipeline system, 2 variable-speed pressure booster pumps and 5 town balancing storages. When all these components are considered, power consumption becomes an important part of the overall operation. The present paper considers a global optimisation model to minimise power consumption while maintaining reasonable performance of the system. The main components of the model are described, including the network structure and the cost functions associated with the system. The final model presents the cost functions associated with pump scheduling, including the penalty terms for maintaining appropriate storage levels and pressure bounds within the water distribution network. **Description:** 2003006758

Piecewise linear classifiers based on nonsmooth optimization approaches

- Bagirov, Adil; Kasimbeyli, Refail; Ozturk, Gurkan; Ugon, Julien

**Authors:** Bagirov, Adil; Kasimbeyli, Refail; Ozturk, Gurkan; Ugon, Julien **Date:** 2014 **Type:** Text, Book chapter **Relation:** Optimization in Science and Engineering, p. 1-32 **Full Text:** false **Reviewed:** **Description:** Nonsmooth optimization provides efficient algorithms for solving many machine learning problems. In particular, nonsmooth optimization approaches to supervised data classification problems lead to the design of very efficient algorithms for their solution. In this chapter, we demonstrate how nonsmooth optimization algorithms can be applied to design efficient piecewise linear classifiers for supervised data classification problems. Such classifiers are developed using max-min and polyhedral conic separabilities as well as an incremental approach. We report results of numerical experiments and compare the piecewise linear classifiers with a number of other mainstream classifiers.

Piecewise partially separable functions and a derivative-free algorithm for large scale nonsmooth optimization

**Authors:** Bagirov, Adil; Ugon, Julien **Date:** 2006 **Type:** Text, Journal article **Relation:** Journal of Global Optimization Vol. 35, no. 2 (Jun 2006), p. 163-195 **Full Text:** **Reviewed:** **Description:** This paper introduces the notion of piecewise partially separable functions and studies their properties. We also consider some of the many applications of these functions. Finally, we consider the problem of minimizing piecewise partially separable functions and develop an algorithm for its solution. This algorithm exploits the structure of such functions. We present the results of preliminary numerical experiments. **Description:** 2003001532


- Ghosh, Moumita; Ugon, Julien; Ghosh, Ranadhir; Bagirov, Adil

**Authors:** Ghosh, Moumita; Ugon, Julien; Ghosh, Ranadhir; Bagirov, Adil **Date:** 2004 **Type:** Text, Conference paper **Relation:** Paper presented at ICOTA6: 6th International Conference on Optimization - Techniques and Applications, Ballarat, Victoria: 9th December 2004 **Full Text:** false **Reviewed:** **Description:** E1 **Description:** 2003000864

Supervised data classification via max-min separability

**Authors:** Ugon, Julien; Bagirov, Adil **Date:** 2005 **Type:** Text, Book chapter **Relation:** Continuous Optimization: Current Trends and Modern Applications, p. 175-208 **Full Text:** **Reviewed:** **Description:** B1 **Description:** 2003001268


