An algorithm for minimizing clustering functions
- Authors: Bagirov, Adil; Ugon, Julien
- Date: 2005
- Type: Text, Journal article
- Relation: Optimization Vol. 54, no. 4-5 (Aug-Oct 2005), p. 351-368
- Description: The problem of cluster analysis is formulated as a problem of nonsmooth, nonconvex optimization. An algorithm for solving this optimization problem is developed which significantly reduces the computational effort. The algorithm is based on the so-called discrete gradient method. Results of numerical experiments are presented which demonstrate the effectiveness of the proposed algorithm.
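The nonsmooth clustering objective this record refers to can be written as a sum, over data points, of the squared distance to the nearest cluster centre. A minimal sketch of that objective follows; the function and variable names are illustrative, not taken from the paper, and the discrete gradient solver itself is not shown:

```python
def cluster_function(centres, points):
    """Nonsmooth, nonconvex clustering objective: each data point
    contributes the squared Euclidean distance to its nearest centre.
    The inner min over centres is what makes the function nonsmooth."""
    total = 0.0
    for p in points:
        total += min(
            sum((pj - cj) ** 2 for pj, cj in zip(p, c))
            for c in centres
        )
    return total

# two well-separated groups; one centre placed near each group
points = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 5.0)]
print(cluster_function([(0.05, 0.0), (5.05, 5.0)], points))
```

With centres near both groups the objective is small; minimizing it over the centre positions is the formulation the paper attacks with the discrete gradient method.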
Minimization of the sum of minima of convex functions and its application to clustering
- Authors: Rubinov, Alex; Soukhoroukova, Nadejda; Ugon, Julien
- Date: 2005
- Type: Text, Book chapter
- Relation: Continuous Optimization Chapter p. 409-434
- Description: We study functions that can be represented as the sum of minima of convex functions. Minimization of such functions can be used for the approximation of finite sets and their clustering. We suggest using the local discrete gradient (DG) method [Bag99] and the hybrid of the cutting angle method and the discrete gradient method (DG+CAM) [BRZ05b] for the minimization of these functions. We report and analyze the results of numerical experiments.
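The function class studied in this chapter can be sketched directly: a sum whose terms are pointwise minima of convex functions. The helper below and its toy instance are illustrative assumptions, not the chapter's notation:

```python
def sum_of_minima(groups):
    """Build f(x) = sum over groups of (min over the group's convex pieces).
    Every piece is convex, but the pointwise min destroys convexity, so f
    is in general nonsmooth and nonconvex."""
    def f(x):
        return sum(min(g(x) for g in group) for group in groups)
    return f

# toy instance: f(x) = min(|x - 1|, |x + 1|) + min(x**2, (x - 2)**2)
f = sum_of_minima([
    [lambda x: abs(x - 1), lambda x: abs(x + 1)],
    [lambda x: x ** 2, lambda x: (x - 2) ** 2],
])
print(f(1.0))
```

The clustering objective is a special case of this class, with each group holding the squared distances to the candidate centres.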
Piecewise partially separable functions and a derivative-free algorithm for large scale nonsmooth optimization
- Authors: Bagirov, Adil; Ugon, Julien
- Date: 2006
- Type: Text, Journal article
- Relation: Journal of Global Optimization Vol. 35, no. 2 (Jun 2006), p. 163-195
- Description: This paper introduces the notion of piecewise partially separable functions and studies their properties. We also consider some of the many applications of these functions. Finally, we consider the problem of minimizing piecewise partially separable functions and develop an algorithm for its solution which exploits the structure of such functions. We present the results of preliminary numerical experiments.
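A rough illustration of the function class in this record (the concrete terms below are invented for illustration, not taken from the paper): each term depends only on a small subset of the variables (partial separability), and each term is itself a pointwise maximum of smooth pieces (the piecewise, nonsmooth part):

```python
def pps_function(x):
    """Toy piecewise partially separable function: a sum of terms, where
    term i depends only on (x[i], x[i+1]) and is itself a pointwise max
    of two smooth pieces.  Evaluating term by term exploits the partially
    separable structure, which scales to large numbers of variables."""
    return sum(
        max(x[i] ** 2, abs(x[i] - x[i + 1]))
        for i in range(len(x) - 1)
    )

print(pps_function([1.0, 0.0, 2.0]))
```

Because each term touches only two variables, a structure-exploiting method can update or differentiate terms locally instead of treating the whole sum as one black box, which is the kind of saving the paper's algorithm targets.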
Nonsmooth optimization algorithm for solving clusterwise linear regression problems
- Authors: Bagirov, Adil; Ugon, Julien; Mirzayeva, Hijran
- Date: 2015
- Type: Text, Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 164, no. 3 (2015), p. 755-780
- Relation: http://purl.org/au-research/grants/arc/DP140103213
- Full Text: false
- Description: Clusterwise linear regression consists of finding a number of linear regression functions, each approximating a subset of the data. In this paper, the clusterwise linear regression problem is formulated as a nonsmooth, nonconvex optimization problem, and an algorithm based on an incremental approach and on the discrete gradient method of nonsmooth optimization is designed to solve it. This algorithm incrementally divides the whole dataset into groups which can each be well approximated by a single linear regression function. A special procedure is introduced to generate good starting points for solving global optimization problems at each iteration of the incremental algorithm. The algorithm is compared with the multi-start Späth algorithm and with incremental algorithms on several publicly available datasets for regression analysis.
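The clusterwise linear regression objective can be sketched as follows: each sample is fitted by whichever of the k affine models approximates it best, and the best squared residuals are summed. The names are illustrative assumptions; the paper's incremental construction and starting-point procedure are not shown:

```python
def clr_objective(models, data):
    """Clusterwise linear regression objective: for each sample (a, b),
    take the smallest squared residual over the k affine models, then sum
    over all samples.  `models` is a list of (slope_vector, intercept)
    pairs; the min over models makes the problem nonsmooth and nonconvex."""
    total = 0.0
    for a, b in data:
        total += min(
            (sum(wi * ai for wi, ai in zip(w, a)) + c - b) ** 2
            for w, c in models
        )
    return total

# samples drawn exactly from two lines, b = 2a and b = -a + 1
data = [((1.0,), 2.0), ((2.0,), 4.0), ((1.0,), 0.0), ((3.0,), -2.0)]
print(clr_objective([((2.0,), 0.0), ((-1.0,), 1.0)], data))  # 0.0
```

When the two models match the two generating lines exactly, every sample has a zero best residual, so the objective is zero; minimizing this sum over the model coefficients is the problem the paper's incremental discrete gradient algorithm addresses.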