- Bagirov, Adil, Taheri, Sona, Bai, Fusheng, Wu, Zhiyou

**Authors:** Bagirov, Adil; Taheri, Sona; Bai, Fusheng; Wu, Zhiyou
**Date:** 2019
**Type:** Text, Book chapter
**Relation:** Nonsmooth Optimization and Its Applications (International Series of Numerical Mathematics), Chapter 2, pp. 17-44
**Full Text:** false
**Description:** Nonsmooth convex optimization problems with two blocks of variables subject to linear constraints are considered. A new version of the alternating direction method of multipliers is developed for solving these problems, in which the subproblems are solved approximately. The convergence of the method is studied. New test problems are designed and used to verify the efficiency of the proposed method and to compare it with two versions of the proximal bundle method.
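
The two-block structure described in the abstract can be illustrated on a standard special case: the lasso problem min ½‖Ax−b‖² + λ‖z‖₁ subject to x − z = 0, solved with classical scaled ADMM. This is only an illustrative sketch, not the chapter's variant with approximately solved subproblems; the names `admm_lasso` and `soft_threshold` are our own.

```python
import numpy as np

def soft_threshold(v, k):
    """Proximal operator of k * ||.||_1 (elementwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_lasso(A, b, lam, rho=1.0, iters=200):
    """Scaled ADMM for min 0.5||Ax-b||^2 + lam*||z||_1  s.t.  x - z = 0."""
    n = A.shape[1]
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    AtA, Atb = A.T @ A, A.T @ b
    # Factor the x-subproblem matrix once; it is fixed across iterations.
    L = np.linalg.cholesky(AtA + rho * np.eye(n))
    for _ in range(iters):
        # x-update: smooth quadratic block, solved exactly via the factorization
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        # z-update: nonsmooth block, closed-form proximal step
        z = soft_threshold(x + u, lam / rho)
        # scaled dual update for the constraint x - z = 0
        u = u + x - z
    return z
```

With `A` equal to the identity the solver reduces to elementwise soft-thresholding of `b`, which gives a quick sanity check on the two-block splitting.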

Nonsmooth optimization based algorithms in cluster analysis

- Bagirov, Adil, Mohebi, Ehsan

**Authors:** Bagirov, Adil; Mohebi, Ehsan
**Date:** 2015
**Type:** Text, Book chapter
**Relation:** Partitional Clustering Algorithms, pp. 99-146
**Full Text:** false
**Description:** Cluster analysis is an important task in data mining. It deals with the problem of organizing a collection of objects into clusters based on a similarity measure. Various distance functions can be used to define the similarity measure. Cluster analysis problems with the similarity measure defined by the squared Euclidean distance, also known as minimum sum-of-squares clustering, have been studied extensively over the last five decades; the L1 and L∞ norms have attracted less attention. In this chapter, we consider a nonsmooth nonconvex optimization formulation of the cluster analysis problem. This formulation makes it easy to apply similarity measures defined using different distance functions, and an efficient incremental algorithm can be designed on its basis to solve clustering problems. We develop incremental algorithms for solving clustering problems where the similarity measure is defined using the L1, L2, and L∞ norms. We also consider different algorithms for solving the nonsmooth nonconvex optimization problems arising in cluster analysis. The proposed algorithms are tested using several real-world data sets and compared with other similar algorithms.
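
The nonsmooth clustering objective sketched in this abstract — the average distance from each data point to its nearest cluster centre, under a chosen norm — can be written in a few lines. This evaluates the objective only; it is not the chapter's incremental algorithm, and the function name and `norm` keys are our own labels.

```python
import numpy as np

def cluster_objective(centers, points, norm="l2sq"):
    """Nonsmooth clustering objective: mean over all points of the
    distance to the nearest center, under the chosen norm.
    norm: "l1" (sum of abs), "l2sq" (squared Euclidean), "linf" (max abs)."""
    reduce_dist = {
        "l1":   lambda d: np.abs(d).sum(axis=-1),
        "l2sq": lambda d: (d ** 2).sum(axis=-1),
        "linf": lambda d: np.abs(d).max(axis=-1),
    }[norm]
    # pairwise differences, shape (n_points, n_centers, dim)
    diff = points[:, None, :] - centers[None, :, :]
    # nearest-center distance per point (the min makes this nonsmooth), averaged
    return reduce_dist(diff).min(axis=1).mean()
```

The inner `min` over centers is what makes the objective nonsmooth and nonconvex even when each distance function is convex, which is the structure the chapter's algorithms exploit.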

Piecewise linear classifiers based on nonsmooth optimization approaches

- Bagirov, Adil, Kasimbeyli, Refail, Ozturk, Gurkan, Ugon, Julien

**Authors:** Bagirov, Adil; Kasimbeyli, Refail; Ozturk, Gurkan; Ugon, Julien
**Date:** 2014
**Type:** Text, Book chapter
**Relation:** Optimization in Science and Engineering, pp. 1-32
**Full Text:** false
**Description:** Nonsmooth optimization provides efficient algorithms for solving many machine learning problems. In particular, nonsmooth optimization approaches to supervised data classification lead to the design of very efficient solution algorithms. In this chapter, we demonstrate how nonsmooth optimization algorithms can be applied to design efficient piecewise linear classifiers for supervised data classification problems. Such classifiers are developed using max-min and polyhedral conic separabilities as well as an incremental approach. We report results of numerical experiments and compare the piecewise linear classifiers with a number of other mainstream classifiers.
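
A max-min separable classifier, as the term is used in this literature, evaluates a piecewise linear function h(x) = max_i min_j (⟨w_ij, x⟩ + b_ij) and assigns a class by the sign of h(x). A minimal evaluation sketch follows; the array shapes and names are our assumptions, not the chapter's code, and training the hyperplanes is the nonsmooth optimization problem the chapter actually addresses.

```python
import numpy as np

def maxmin_decision(x, W, b):
    """Evaluate the max-min piecewise linear function
        h(x) = max_i min_j ( <w_ij, x> + b_ij ).
    W has shape (n_groups, n_planes, dim), b has shape (n_groups, n_planes).
    Points with h(x) > 0 fall on one side of the separator, h(x) < 0 on the other."""
    vals = W @ x + b          # affine values, shape (n_groups, n_planes)
    return vals.min(axis=1).max()
```

With a single group containing a single hyperplane this reduces to an ordinary linear classifier; adding groups and planes yields nonconvex, piecewise linear decision boundaries.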

Subgradient and bundle methods for nonsmooth optimization

- Makela, Marko, Karmitsa, Napsu, Bagirov, Adil

**Authors:** Makela, Marko; Karmitsa, Napsu; Bagirov, Adil
**Date:** 2013
**Type:** Text, Book chapter
**Relation:** Numerical Methods for Differential Equations, Optimization, and Technological Problems
**Full Text:** false
**Description:** Nonsmooth optimization methods can be divided mainly into two groups: subgradient methods and bundle methods. Usually, when new algorithms are developed and tested, the comparison is made between similar kinds of methods. The goal of this work is to test and compare different bundle and subgradient methods, as well as some hybrids of these and other methods. The test set includes a large number of different unconstrained nonsmooth minimization problems: convex and nonconvex problems, piecewise linear and quadratic problems, and problems of different sizes. Rather than promoting one method over the others, our aim is to gain insight into which method is suitable for certain types of problems.
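
The simpler of the two families compared here, the basic subgradient method, steps along an arbitrary subgradient with diminishing step sizes and tracks the best point seen, since individual steps need not decrease the objective. A minimal sketch under those assumptions (the function names are ours; bundle methods, which accumulate past subgradients into a model, are substantially more involved):

```python
import numpy as np

def subgradient_method(f_and_g, x0, steps=500):
    """Basic subgradient method with diminishing steps t_k = 1/sqrt(k+1).
    f_and_g(x) must return (f(x), g) with g any subgradient of f at x.
    The best iterate is tracked because the method is not a descent method."""
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), np.inf
    for k in range(steps):
        fx, g = f_and_g(x)
        if fx < best_f:
            best_f, best_x = fx, x.copy()
        x = x - g / np.sqrt(k + 1)   # diminishing, non-summable step sizes
    return best_x, best_f
```

On a simple nonsmooth convex function such as f(x) = ‖x‖₁ (with sign(x) as a subgradient) the iterates oscillate around the minimizer with shrinking amplitude, which is exactly the slow-but-robust behaviour the chapter's comparison examines.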

Machine learning algorithms for analysis of DNA data sets

- Yearwood, John, Bagirov, Adil, Kelarev, Andrei

**Authors:** Yearwood, John; Bagirov, Adil; Kelarev, Andrei
**Date:** 2012
**Type:** Text, Book chapter
**Relation:** Machine Learning Algorithms for Problem Solving in Computational Applications: Intelligent Techniques, pp. 47-58
**Relation:** http://purl.org/au-research/grants/arc/LP0990908
**Full Text:** false
**Description:** Applications of machine learning algorithms to the analysis of DNA sequence data sets are very important. This chapter is devoted to an experimental investigation of several machine learning algorithms applied to a JLA data set consisting of DNA sequences derived from non-coding segments in the junction of the large single copy region and inverted repeat A of the chloroplast genome in Eucalyptus, collected by Australian biologists. Data sets of this sort represent a new situation, where sophisticated alignment scores have to be used as a measure of similarity. The alignment scores do not satisfy the properties of the Minkowski metric, and new machine learning approaches have to be investigated. The authors' experiments show that machine learning algorithms based on local alignment scores achieve very good agreement with known biological classes for this data set. A new machine learning algorithm based on graph partitioning performed best for clustering of the JLA data set, and the authors' novel k-committees algorithm produced the most accurate results for classification. Two new examples of synthetic data sets demonstrate that the k-committees algorithm can outperform both the nearest neighbour and k-medoids algorithms simultaneously.

Continuous approximations to subdifferentials

**Authors:** Bagirov, Adil
**Date:** 2009
**Type:** Text, Book chapter
**Relation:** Encyclopedia of Optimization, pp. 475-482
**Full Text:** false

Derivative-free methods for non-smooth optimization

**Authors:** Bagirov, Adil
**Date:** 2009
**Type:** Text, Book chapter
**Relation:** Encyclopedia of Optimization, pp. 648-655
**Full Text:** false

Global optimization: Cutting angle method

- Bagirov, Adil, Beliakov, Gleb

**Authors:** Bagirov, Adil; Beliakov, Gleb
**Date:** 2009
**Type:** Text, Book chapter
**Relation:** Encyclopedia of Optimization, pp. 1304-1311
**Full Text:** false

Nonsmooth optimization approach to clustering

**Authors:** Bagirov, Adil
**Date:** 2009
**Type:** Text, Book chapter
**Relation:** Encyclopedia of Optimization, pp. 2664-2671
**Full Text:** false

Supervised data classification via max-min separability

**Authors:** Ugon, Julien; Bagirov, Adil
**Date:** 2005
**Type:** Text, Book chapter
**Relation:** Continuous Optimization: Current Trends and Modern Applications, pp. 175-208

