Limited Memory Bundle Method for Clusterwise Linear Regression
- Authors: Karmitsa, Napsu; Bagirov, Adil; Taheri, Sona; Joki, Kaisa
- Date: 2022
- Type: Text , Book chapter
- Relation: Intelligent Systems, Control and Automation: Science and Engineering p. 109-122
- Relation: http://purl.org/au-research/grants/arc/DP190100580
- Full Text: false
- Reviewed:
- Description: A clusterwise linear regression problem consists of finding a number of linear functions each approximating a subset of the given data. In this paper, the limited memory bundle method is modified and combined with the incremental approach to solve this problem using its nonsmooth optimization formulation. The main contribution of the proposed method is to obtain a fast solution time for large-scale clusterwise linear regression problems. The proposed algorithm is tested on small and large real-world data sets and compared with other algorithms for clusterwise linear regression. Numerical results demonstrate that the proposed algorithm is especially efficient in data sets with large numbers of data points and input variables. © 2022, Springer Nature Switzerland AG.
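The nonsmooth formulation this abstract refers to charges each data point to its best-fitting linear function, giving a sum of pointwise minima. A minimal sketch of that objective (the function name and toy data are illustrative, not taken from the chapter):

```python
import numpy as np

def clr_objective(X, y, coefs, intercepts):
    """Clusterwise linear regression objective: each point pays the
    squared error of whichever linear function fits it best, so the
    objective is a sum of minima (nonsmooth and nonconvex)."""
    preds = X @ coefs.T + intercepts          # shape (n_points, n_funcs)
    residuals = (preds - y[:, None]) ** 2
    return residuals.min(axis=1).sum()

# two linear functions, four points: each pair is fit exactly by one line
X = np.array([[0.0], [1.0], [0.0], [1.0]])
y = np.array([0.0, 1.0, 5.0, 4.0])
coefs = np.array([[1.0], [-1.0]])             # slopes of the two functions
intercepts = np.array([0.0, 5.0])
print(clr_objective(X, y, coefs, intercepts))  # -> 0.0
```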
Bundle methods for nonsmooth DC optimization
- Authors: Joki, Kaisa; Bagirov, Adil
- Date: 2020
- Type: Text , Book chapter
- Relation: Numerical Nonsmooth Optimization: State of the Art Algorithms Chapter 8 p. 263-296
- Full Text: false
- Reviewed:
- Description: This chapter is devoted to algorithms for solving nonsmooth unconstrained difference of convex optimization problems. Different types of stationarity conditions are discussed and the relationship between sets of different stationary points (critical, Clarke stationary and inf-stationary) is established. Bundle methods are developed based on a nonconvex piecewise linear model of the objective function and the convergence of these methods is studied. Numerical results are presented to demonstrate the performance of the methods. © Springer Nature Switzerland AG 2020.
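Bundle methods build a piecewise linear model of the objective from collected subgradients. A minimal convex cutting-plane sketch (the chapter's DC methods use a nonconvex piecewise linear model; this simplified version only illustrates the underestimating-model idea):

```python
def f(x):
    return abs(x)                    # convex, nonsmooth at 0

def subgrad(x):
    return 1.0 if x >= 0 else -1.0   # a subgradient of |x|

def cutting_plane_model(bundle, x):
    """Piecewise linear model m(x) = max_j [ f(y_j) + g_j * (x - y_j) ]
    built from a bundle of points y_j and subgradients g_j; for a convex
    f it underestimates f everywhere."""
    return max(f(y) + g * (x - y) for y, g in bundle)

bundle = [(y, subgrad(y)) for y in (-2.0, 0.5, 1.5)]
for x in (-1.0, 0.0, 2.0):
    assert cutting_plane_model(bundle, x) <= f(x) + 1e-12
print(cutting_plane_model(bundle, 2.0))  # -> 2.0, which equals f(2.0) here
```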
Discrete gradient methods
- Authors: Bagirov, Adil; Taheri, Sona; Karmitsa, Napsu
- Date: 2020
- Type: Text , Book chapter
- Relation: Numerical Nonsmooth Optimization: State of the Art Algorithms p. 621-654
- Full Text: false
- Reviewed:
- Description: In this chapter, the notion of a discrete gradient is introduced and it is shown that the discrete gradients can be used to approximate subdifferentials of a broad class of nonsmooth functions. Two methods based on such approximations, more specifically, the discrete gradient method (DGM) and its limited memory version (LDGB), are described. These methods are semi derivative-free methods for solving nonsmooth and, in general, nonconvex optimization problems. The performance of the methods is demonstrated using some academic test problems. © Springer Nature Switzerland AG 2020.
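The derivative-free flavour of these methods can be illustrated with a coordinate-wise finite-difference approximation. This is a deliberately simplified sketch in the spirit of discrete gradients, not the full discrete-gradient construction described in the chapter:

```python
def discrete_gradient_approx(f, x, tau=1e-3):
    """Derivative-free approximation: each component is a finite
    difference of f along one coordinate, so no analytic (sub)gradient
    is ever evaluated. Simplified sketch only."""
    base = f(x)
    g = []
    for i in range(len(x)):
        xp = list(x)
        xp[i] += tau
        g.append((f(xp) - base) / tau)
    return g

# nonsmooth test function f(x) = |x1| + |x2|
f = lambda x: abs(x[0]) + abs(x[1])
print(discrete_gradient_approx(f, [1.0, -2.0]))  # ~ [1.0, -1.0]
```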
Final words
- Authors: Bagirov, Adil; Gaudioso, Manlio; Karmitsa, Napsu; Mäkelä, Marko; Taheri, Sona
- Date: 2020
- Type: Text , Book chapter
- Relation: Numerical Nonsmooth Optimization: State of the Art Algorithms p. 693-694
- Full Text: false
- Reviewed:
Introduction
- Authors: Bagirov, Adil; Gaudioso, Manlio; Karmitsa, Napsu; Mäkelä, Marko; Taheri, Sona
- Date: 2020
- Type: Text , Book chapter
- Relation: Numerical Nonsmooth Optimization: State of the Art Algorithms p. 1-16
- Full Text: false
- Reviewed:
- Description: Nonsmooth optimization (NSO) is among the most challenging tasks in the field of mathematical programming. It addresses optimization problems where the objective and/or constraint functions have discontinuous gradients. NSO problems arise in many real-life applications. Moreover, some smooth optimization techniques, such as decomposition methods, dual formulations and exact penalty methods, may lead to NSO problems that are either smaller in dimension or simpler in structure. In addition, some optimization problems may be analytically smooth but numerically nonsmooth. This is the case, for instance, with noisy input data and so-called stiff problems, which are numerically unstable and behave like nonsmooth problems. © Springer Nature Switzerland AG 2020.
An approximate ADMM for solving linearly constrained nonsmooth optimization problems with two blocks of variables
- Authors: Bagirov, Adil; Taheri, Sona; Bai, Fusheng; Wu, Zhiyou
- Date: 2019
- Type: Text , Book chapter
- Relation: Nonsmooth Optimization and Its Applications (part of the International Series of Numerical Mathematics book series) Chapter 2 p. 17-44
- Full Text: false
- Reviewed:
- Description: Nonsmooth convex optimization problems with two blocks of variables subject to linear constraints are considered. A new version of the alternating direction method of multipliers is developed for solving these problems. In this method the subproblems are solved approximately. The convergence of the method is studied. New test problems are designed and used to verify the efficiency of the proposed method and to compare it with two versions of the proximal bundle method.
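The two-block structure can be seen in a textbook ADMM iteration, where each block is updated in turn across the linear coupling constraint. A minimal sketch using a standard variant (not the chapter's approximate method) on the toy problem min 0.5*||x - b||^2 + lam*||z||_1 subject to x - z = 0:

```python
import numpy as np

def soft_threshold(v, k):
    """Proximal operator of k*||.||_1 (closed-form z-update)."""
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_lasso_denoise(b, lam, rho=1.0, iters=200):
    """Two-block ADMM sketch for
        min_x 0.5*||x - b||^2 + lam*||z||_1   s.t.  x - z = 0.
    The smooth x-subproblem and the nonsmooth z-subproblem split
    across the linear constraint; u is the scaled dual variable."""
    x = np.zeros_like(b); z = np.zeros_like(b); u = np.zeros_like(b)
    for _ in range(iters):
        x = (b + rho * (z - u)) / (1.0 + rho)   # smooth block
        z = soft_threshold(x + u, lam / rho)    # nonsmooth block
        u = u + x - z                           # dual update on x - z = 0
    return z

b = np.array([3.0, 0.2, -2.0])
print(admm_lasso_denoise(b, lam=0.5))  # ~ [2.5, 0.0, -1.5]
```

The expected limit here is the coordinate-wise soft thresholding of b, which this split problem admits in closed form.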
Nonsmooth optimization based algorithms in cluster analysis
- Authors: Bagirov, Adil; Mohebi, Ehsan
- Date: 2015
- Type: Text , Book chapter
- Relation: Partitional Clustering Algorithms p. 99-146
- Full Text: false
- Reviewed:
- Description: Cluster analysis is an important task in data mining. It deals with the problem of organizing a collection of objects into clusters based on a similarity measure. Various distance functions can be used to define the similarity measure. Cluster analysis problems with the similarity measure defined by the squared Euclidean distance, also known as minimum sum-of-squares clustering, have been studied extensively over the last five decades; the L1 and L∞ norms have attracted less attention. In this chapter, we consider a nonsmooth nonconvex optimization formulation of the cluster analysis problem. This formulation makes it easy to apply similarity measures defined using different distance functions. Moreover, an efficient incremental algorithm can be designed based on this formulation to solve clustering problems. We develop incremental algorithms for solving clustering problems where the similarity measure is defined using the L1, L2 and L∞ norms. We also consider different algorithms for solving nonsmooth nonconvex optimization problems in cluster analysis. The proposed algorithms are tested using several real-world data sets and compared with other similar algorithms.
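The clustering objective described above is a sum of pointwise minima over cluster centres, and swapping the distance function changes only the inner distance computation. A minimal sketch (names and toy data are illustrative):

```python
import numpy as np

def clustering_objective(points, centers, norm):
    """Nonsmooth clustering objective: each point is charged the distance
    to its nearest centre, so the objective is a sum of minima.
    `norm` is 1 (L1), 2 (squared L2, i.e. sum-of-squares) or np.inf."""
    total = 0.0
    for p in points:
        if norm == 2:
            dists = [np.sum((p - c) ** 2) for c in centers]  # squared Euclidean
        else:
            dists = [np.linalg.norm(p - c, ord=norm) for c in centers]
        total += min(dists)
    return total

pts = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 10.0]])
ctrs = np.array([[0.0, 0.5], [10.0, 10.0]])
print(clustering_objective(pts, ctrs, 2))       # -> 0.5 (0.25 + 0.25 + 0)
print(clustering_objective(pts, ctrs, np.inf))  # -> 1.0 (0.5 + 0.5 + 0)
```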
Piecewise linear classifiers based on nonsmooth optimization approaches
- Authors: Bagirov, Adil; Kasimbeyli, Refail; Ozturk, Gurkan; Ugon, Julien
- Date: 2014
- Type: Text , Book chapter
- Relation: Optimization in Science and Engineering p. 1-32
- Full Text: false
- Reviewed:
- Description: Nonsmooth optimization provides efficient algorithms for solving many machine learning problems. In particular, nonsmooth optimization approaches to supervised data classification lead to very efficient algorithms. In this chapter, we demonstrate how nonsmooth optimization algorithms can be applied to design efficient piecewise linear classifiers for supervised data classification problems. Such classifiers are developed using max–min and polyhedral conic separabilities as well as an incremental approach. We report results of numerical experiments and compare the piecewise linear classifiers with a number of other mainstream classifiers.
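A max–min separable classifier evaluates a piecewise linear decision function g(x) = max over groups of the min over that group's hyperplanes, and classifies by the sign of g. A minimal sketch (the data layout and toy example are illustrative, not from the chapter):

```python
import numpy as np

def maxmin_decision(x, groups):
    """Max-min piecewise linear decision function
        g(x) = max_i min_j ( w_ij . x + b_ij ),
    classifying x by the sign of g(x). `groups` is a list of min-groups,
    each a list of (w, b) hyperplane pairs."""
    return max(min(np.dot(w, x) + b for w, b in group) for group in groups)

# toy 1-D example: one min-group carving out the slab 1 < x < 3
groups = [[(np.array([1.0]), -1.0), (np.array([-1.0]), 3.0)]]
print(maxmin_decision(np.array([2.0]), groups))  # -> 1.0  (positive class)
print(maxmin_decision(np.array([5.0]), groups))  # -> -2.0 (negative class)
```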
Subgradient and bundle methods for nonsmooth optimization
- Authors: Mäkelä, Marko; Karmitsa, Napsu; Bagirov, Adil
- Date: 2013
- Type: Text , Book chapter
- Relation: Numerical methods for differential equations, optimization, and technological problems
- Full Text: false
- Reviewed:
- Description: Nonsmooth optimization methods can mainly be divided into two groups: subgradient and bundle methods. Usually, when new algorithms are developed and tested, the comparison is made between similar kinds of methods. The goal of this work is to test and compare different bundle and subgradient methods, as well as some hybrids of these two and of some other methods. The test set included a large number of different unconstrained nonsmooth minimization problems: convex and nonconvex problems, piecewise linear and quadratic problems, and problems of different sizes. Rather than promote one method over the others, our aim is to gain insight into which method is suitable for which types of problems.
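The simplest member of the families compared here is the subgradient method with diminishing steps. A minimal sketch on a convex nonsmooth test problem (step rule and test function are illustrative choices, not from the chapter):

```python
def subgradient_method(f, subgrad, x0, iters=500):
    """Basic subgradient method with diminishing steps 1/(k+1).
    Unlike gradient descent it does not decrease f monotonically,
    so the best iterate found is tracked and returned."""
    x = list(x0)
    best_x, best_f = x[:], f(x)
    for k in range(iters):
        g = subgrad(x)
        step = 1.0 / (k + 1)
        x = [xi - step * gi for xi, gi in zip(x, g)]
        if f(x) < best_f:
            best_x, best_f = x[:], f(x)
    return best_x, best_f

# convex nonsmooth test problem: f(x) = |x1| + |x2|, minimum 0 at origin
f = lambda x: abs(x[0]) + abs(x[1])
sg = lambda x: [1.0 if x[0] >= 0 else -1.0, 1.0 if x[1] >= 0 else -1.0]
x, fx = subgradient_method(f, sg, [3.0, -4.0])
print(round(fx, 3))  # close to the optimal value 0
```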
Machine learning algorithms for analysis of DNA data sets
- Authors: Yearwood, John; Bagirov, Adil; Kelarev, Andrei
- Date: 2012
- Type: Text , Book chapter
- Relation: Machine Learning Algorithms for Problem Solving in Computational Applications: Intelligent Techniques p. 47-58
- Relation: http://purl.org/au-research/grants/arc/LP0990908
- Full Text: false
- Reviewed:
- Description: Applications of machine learning algorithms to the analysis of DNA sequence data sets are very important. The present chapter is devoted to the experimental investigation of several machine learning algorithms applied to a JLA data set consisting of DNA sequences derived from non-coding segments in the junction of the large single copy region and inverted repeat A of the chloroplast genome in Eucalyptus, collected by Australian biologists. Data sets of this sort represent a new situation, where sophisticated alignment scores have to be used as a measure of similarity. The alignment scores do not satisfy the properties of the Minkowski metric, and new machine learning approaches have to be investigated. The authors' experiments show that machine learning algorithms based on local alignment scores achieve very good agreement with known biological classes for this data set. A new machine learning algorithm based on graph partitioning performed best for clustering of the JLA data set. The authors' novel k-committees algorithm produced the most accurate results for classification. Two new examples of synthetic data sets demonstrate that the k-committees algorithm can outperform both the Nearest Neighbour and k-medoids algorithms simultaneously.
Continuous approximations to subdifferentials
- Authors: Bagirov, Adil
- Date: 2009
- Type: Text , Book chapter
- Relation: Encyclopedia of Optimization Chapter p. 475-482
- Full Text: false
Derivative-free methods for non-smooth optimization
- Authors: Bagirov, Adil
- Date: 2009
- Type: Text , Book chapter
- Relation: Encyclopedia of Optimization Chapter p. 648-655
- Full Text: false
Global optimization: Cutting angle method
- Authors: Bagirov, Adil; Beliakov, Gleb
- Date: 2009
- Type: Text , Book chapter
- Relation: Encyclopedia of Optimization Chapter p. 1304-1311
- Full Text: false
Nonsmooth optimization approach to clustering
- Authors: Bagirov, Adil
- Date: 2009
- Type: Text , Book chapter
- Relation: Encyclopedia of Optimization Chapter p. 2664-2671
- Full Text: false
Supervised data classification via max-min separability
- Authors: Ugon, Julien; Bagirov, Adil
- Date: 2005
- Type: Text , Book chapter
- Relation: Continuous Optimization: Current Trends and Modern Applications Chapter p. 175-208
- Full Text:
- Reviewed: