Parallelization of the discrete gradient method of non-smooth optimization and its applications
- Authors: Beliakov, Gleb; Tobon, Monsalve; Bagirov, Adil
- Date: 2003
- Type: Text; Conference paper
- Relation: Paper presented at Computational Science - ICCS 2003 Conference, Melbourne, 2 June 2003
- Full Text: false
- Reviewed:
- Description: We investigate the parallelization and performance of the discrete gradient method of nonsmooth optimization. This derivative-free method is shown to be an effective optimization tool, able to skip many shallow local minima of nonconvex, nondifferentiable objective functions. Although it is a sequential iterative method, we were able to parallelize critical steps of the algorithm, which led to a significant improvement in performance on multiprocessor computer clusters. We applied the method to a difficult polyatomic cluster problem in computational chemistry and found it to outperform other algorithms.
- Description: E1
- Description: 2003000435
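The parallelizable step the abstract alludes to is the bundle of independent function evaluations needed to build a discrete gradient: each perturbed point can be evaluated on a separate worker. The sketch below is an illustrative simplification, not the paper's algorithm; the objective `f`, the one-sided difference scheme, and the step size `h` are all assumptions chosen for demonstration.

```python
import math
from concurrent.futures import ThreadPoolExecutor

def f(x):
    # Illustrative nonsmooth, nonconvex objective (not from the paper).
    return sum(abs(xi) + 0.1 * abs(math.sin(5 * xi)) for xi in x)

def discrete_gradient(f, x, h=1e-3, pool=None):
    """Approximate a (sub)gradient of f at x by one-sided differences.

    The n + 1 function evaluations are mutually independent, so they can
    be dispatched to a worker pool -- this is the kind of critical step
    that parallelizes well on a multiprocessor cluster.
    """
    points = [list(x)]
    for i in range(len(x)):
        xp = list(x)
        xp[i] += h
        points.append(xp)
    # Evaluate all points in parallel if a pool is supplied.
    vals = list(pool.map(f, points)) if pool else [f(p) for p in points]
    f0 = vals[0]
    return [(vals[i + 1] - f0) / h for i in range(len(x))]

with ThreadPoolExecutor(max_workers=4) as pool:
    g = discrete_gradient(f, [0.5, -0.3, 1.2], pool=pool)
```

In a real cluster setting the thread pool would be replaced by MPI ranks or a process pool, since the expensive part is the objective evaluation itself (e.g. a potential-energy computation for a polyatomic cluster), not Python-level overhead.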
An application of novel clustering technique for information security
- Authors: Beliakov, Gleb; Yearwood, John; Kelarev, Andrei
- Date: 2011
- Type: Text; Conference paper
- Relation: Applications and Techniques in Information Security Workshop, p. 5-11
- Full Text: false
- Reviewed:
- Description: This article presents experimental results devoted to a new application of the novel clustering technique recently introduced by the authors. Our aim is to facilitate the application of robust and stable consensus functions in information security, where it is often necessary to process large data sets and monitor outcomes in real time, as required, for example, for intrusion detection. Here we concentrate on the particular application of profiling phishing websites. First, we apply several independent clustering algorithms to a randomized sample of data to obtain independent initial clusterings; the silhouette index is used to determine the number of clusters. Second, we use a consensus function to combine these independent clusterings into one consensus clustering; feature ranking is used to select a subset of features for the consensus function. Third, we train fast supervised classification algorithms on the resulting consensus clustering to enable them to process the whole large data set as well as new data. The precision and recall of the classifiers at the final stage of this scheme are critical for the effectiveness of the whole procedure. We investigated various combinations of three consensus functions, Cluster-Based Graph Formulation (CBGF), Hybrid Bipartite Graph Formulation (HBGF), and Instance-Based Graph Formulation (IBGF), with a variety of supervised classification algorithms. The best precision and recall were obtained by the combination of the HBGF consensus function and the SMO classifier with a polynomial kernel.
- Description: 2003009195
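The three-stage scheme in the abstract (independent clusterings on a sample, a consensus function, then a fast supervised classifier) can be sketched end to end. This is a minimal illustration under stated assumptions: the data are synthetic stand-ins for the phishing-website feature matrix, and the consensus step uses simple co-association voting rather than the paper's graph formulations (CBGF, HBGF, IBGF). The polynomial-kernel SVC plays the role of the SMO classifier with a polynomial kernel.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score
from sklearn.svm import SVC

# Toy stand-in for the phishing-website feature matrix (hypothetical data).
X, _ = make_blobs(n_samples=300, centers=3, random_state=0)

# Stage 1: independent clusterings on a randomized sample;
# the silhouette index chooses the number of clusters.
rng = np.random.default_rng(0)
sample = X[rng.choice(len(X), size=150, replace=False)]
best_k = max(
    range(2, 6),
    key=lambda k: silhouette_score(
        sample, KMeans(k, n_init=5, random_state=0).fit_predict(sample)))
clusterings = [KMeans(best_k, n_init=1, random_state=s).fit_predict(X)
               for s in range(3)]

# Stage 2: consensus via co-association voting -- a simple stand-in for
# the CBGF/HBGF/IBGF graph formulations used in the paper.  Entry (i, j)
# is the fraction of clusterings that put points i and j together.
coassoc = sum((c[:, None] == c[None, :]).astype(float)
              for c in clusterings) / len(clusterings)
consensus = KMeans(best_k, n_init=5, random_state=0).fit_predict(coassoc)

# Stage 3: train a fast supervised classifier on the consensus labels so
# it can process the whole data set and new data (polynomial-kernel SVM,
# analogous to SMO with a polynomial kernel).
clf = SVC(kernel="poly", degree=3).fit(X, consensus)
acc = clf.score(X, consensus)
```

The key design point is that only the cheap supervised classifier ever touches the full data set or new incoming data; the expensive clustering and consensus steps run once on a sample.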