Globally convergent algorithms for solving unconstrained optimization problems
- Authors: Taheri, Sona; Mammadov, Musa; Seifollahi, Sattar
- Date: 2013
- Type: Text; Journal article
- Relation: Optimization Vol. , no. (2013), p. 1-15
- Description: New algorithms for solving unconstrained optimization problems are presented, based on the idea of combining two types of descent directions: the anti-gradient direction and either the Newton or a quasi-Newton direction. The use of the latter directions improves the convergence rate. Global and superlinear convergence properties of these algorithms are established. Numerical experiments on some unconstrained test problems are reported, and the proposed algorithms are compared with several existing similar methods. This comparison demonstrates the efficiency of the proposed combined methods.
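The combined-direction idea described in this abstract can be sketched as follows. This is a minimal illustration, not the paper's method: the fixed mixing weight `lam`, the Armijo backtracking line search, and the fallback to the pure anti-gradient are all assumptions made here for the sake of a runnable example.

```python
import numpy as np

def minimize_combined(f, grad, hess, x0, lam=0.5, tol=1e-8, max_iter=100):
    """Toy sketch: the search direction mixes the anti-gradient with the
    Newton direction. The weight `lam` and the backtracking rule are
    illustrative assumptions, not the paper's exact update."""
    x = x0.astype(float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d_newton = np.linalg.solve(hess(x), -g)
        d = (1 - lam) * (-g) + lam * d_newton
        if g @ d >= 0:          # not a descent direction: fall back to anti-gradient
            d = -g
        t = 1.0                 # Armijo backtracking line search
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        x = x + t * d
    return x
```

On a convex quadratic the Newton component gives fast local convergence while the anti-gradient component keeps every step a descent step, which is the intuition behind combining the two directions.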
A simulated annealing-based maximum-margin clustering algorithm
- Authors: Seifollahi, Sattar; Bagirov, Adil; Borzeshi, Ehsan; Piccardi, Massimo
- Date: 2019
- Type: Text; Journal article
- Relation: Computational Intelligence Vol. 35, no. 1 (2019), p. 23-41
- Description: Maximum-margin clustering is an extension of the support vector machine (SVM) to clustering. It partitions a set of unlabeled data into multiple groups by finding hyperplanes with the largest margins. Although existing algorithms have shown promising results, there is no guarantee that they converge to global solutions, due to the nonconvexity of the optimization problem. In this paper, we propose a simulated annealing-based algorithm that is able to mitigate the issue of local minima in the maximum-margin clustering problem. The novelty of our algorithm is twofold, i.e., (i) it comprises a comprehensive cluster modification scheme based on simulated annealing, and (ii) it introduces a new approach based on the combination of k-means++ and SVM at each step of the annealing process. More precisely, k-means++ is initially applied to extract subsets of the data points. Then, an unsupervised SVM is applied to improve the clustering results. Experimental results on various benchmark data sets (with up to over a million points) give evidence that the proposed algorithm is more effective at solving the clustering problem than a number of popular clustering algorithms.
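The annealing loop at the heart of this approach can be sketched schematically. This skeleton is an assumption-laden simplification: the paper's energy is SVM-margin based and its cluster modification scheme is richer, whereas here `energy` is an arbitrary callable and each move flips one point's label.

```python
import numpy as np

def anneal_clusters(X, k, energy, T0=1.0, cooling=0.95, steps=500, seed=0):
    """Skeleton of a simulated-annealing cluster-modification loop.
    `energy` scores a labeling (lower is better); the paper optimizes an
    SVM-margin objective, but any callable works in this sketch."""
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, k, size=len(X))
    e = energy(X, labels)
    best, best_e = labels.copy(), e
    T = T0
    for _ in range(steps):
        cand = labels.copy()
        i = rng.integers(len(X))
        cand[i] = rng.integers(k)           # modify one point's cluster
        e_cand = energy(X, cand)
        # accept improvements always; worse moves with Metropolis probability
        if e_cand < e or rng.random() < np.exp((e - e_cand) / T):
            labels, e = cand, e_cand
            if e < best_e:
                best, best_e = labels.copy(), e
        T *= cooling                        # geometric cooling schedule
    return best
```

Accepting occasional uphill moves at high temperature is what lets the method escape the local minima that trap purely greedy maximum-margin clustering algorithms.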
Attribute weighted Naive Bayes classifier using a local optimization
- Authors: Taheri, Sona; Yearwood, John; Mammadov, Musa; Seifollahi, Sattar
- Date: 2013
- Type: Text; Journal article
- Relation: Neural Computing & Applications Vol. 24, no. 5 (2013), p. 995-1002
- Description: The Naive Bayes classifier is a popular classification technique for data mining and machine learning. It has been shown to be very effective on a variety of data classification problems. However, the strong assumption that all attributes are conditionally independent given the class is often violated in real-world applications. Numerous methods have been proposed to improve the performance of the Naive Bayes classifier by alleviating the attribute independence assumption; however, violation of this assumption can increase the expected error. An alternative is to assign weights to the attributes. In this paper, we propose a novel attribute weighted Naive Bayes classifier that applies weights to the conditional probabilities. An objective function is modeled and taken into account, which is based on the structure of the Naive Bayes classifier and the attribute weights. The optimal weights are determined by a local optimization method using the quasisecant method. In the proposed approach, the Naive Bayes classifier is taken as a starting point. We report the results of numerical experiments on several real-world binary classification data sets, which show the efficiency of the proposed method.
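The weighted scoring rule implied by the abstract can be sketched as follows: class score log P(c) + Σᵢ wᵢ · log P(xᵢ | c), so that uniform weights wᵢ = 1 recover standard Naive Bayes. This is a hedged illustration; the data layout of `cond_probs` is assumed here, and the quasisecant optimization of the weights is omitted entirely.

```python
import numpy as np

def weighted_nb_log_posterior(priors, cond_probs, x, weights):
    """Hedged sketch of attribute-weighted Naive Bayes scoring:
    score(c) = log P(c) + sum_i w_i * log P(x_i | c).
    cond_probs[c][i][v] = P(attribute i takes value v | class c) is an
    assumed layout; the paper's weight optimization is not shown."""
    scores = []
    for c, prior in enumerate(priors):
        s = np.log(prior)
        for i, v in enumerate(x):
            s += weights[i] * np.log(cond_probs[c][i][v])
        scores.append(s)
    return np.array(scores)
```

Raising each conditional probability to the power wᵢ (equivalently, scaling its log term) lets the optimizer down-weight attributes whose independence assumption is badly violated, which is the intuition behind weighting rather than restructuring the model.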