Showing items 1 - 4 of 4

Your selections:

  • Nonsmooth optimization
  • 0801 Artificial Intelligence and Image Processing
Facets

Creator: Taheri, Sona (2); Asadi, Soodabeh (1); Karmitsa, Napsu (1); Kasimbeyli, Refail (1); Mohebi, Ehsan (1); Ozturk, Gurkan (1)

Subject: 0806 Information Systems (2); Cluster analysis (2); Nonconvex optimization (2); 0102 Applied Mathematics (1); 0103 Numerical and Computational Mathematics (1); 0802 Computation Theory and Mathematics (1); 0906 Electrical and Electronic Engineering (1); 1702 Cognitive Science (1); Bundle methods (1); Classification (1); Classification (of information) (1); DC optimization (1); Derivative-free methods (1); Discrete gradient method (1); Errors (1); Gradient methods (1); Incremental approach (1); Limited memory methods (1)

An algorithm for clustering using L1-norm based on hyperbolic smoothing technique

  • Authors: Bagirov, Adil, Mohebi, Ehsan
  • Date: 2016
  • Type: Text, Journal article
  • Relation: Computational Intelligence Vol. 32, no. 3 (2016), p. 439-457
  • Relation: http://purl.org/au-research/grants/arc/DP140103213
  • Full Text: false
  • Reviewed:
  • Description: Cluster analysis deals with the problem of organization of a collection of objects into clusters based on a similarity measure, which can be defined using various distance functions. The use of different similarity measures allows one to find different cluster structures in a data set. In this article, an algorithm is developed to solve clustering problems where the similarity measure is defined using the L1-norm. The algorithm is designed using the nonsmooth optimization approach to the clustering problem. Smoothing techniques are applied to smooth both the clustering function and the L1-norm. The algorithm computes clusters sequentially and finds global or near global solutions to the clustering problem. Results of numerical experiments using 12 real-world data sets are reported, and the proposed algorithm is compared with two other clustering algorithms. ©2015 Wiley Periodicals, Inc.
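As a rough illustration of the smoothing idea mentioned in the abstract (not the authors' code), the sketch below applies the standard hyperbolic smoothing |t| ≈ sqrt(t^2 + tau^2) to the usual sum-of-minimum L1 distances clustering objective; the function names, the smoothing parameter tau and the toy data are assumptions made for the example, and the inner minimum over centres, which the paper also smooths, is kept exact here.

```python
import numpy as np

def smooth_abs(t, tau=1e-3):
    # hyperbolic smoothing of |t|: sqrt(t^2 + tau^2) -> |t| as tau -> 0
    return np.sqrt(t ** 2 + tau ** 2)

def smoothed_l1_clustering_objective(centers, data, tau=1e-3):
    # smoothed form of the L1-norm clustering function
    #   f(x_1..x_k) = (1/m) * sum_i min_j ||x_j - a_i||_1,
    # with each absolute value replaced by its hyperbolic smoothing
    diffs = data[:, None, :] - centers[None, :, :]      # shape (m, k, n)
    dists = smooth_abs(diffs, tau).sum(axis=2)          # smoothed L1 distances, shape (m, k)
    return dists.min(axis=1).mean()                     # average over the m data points

# toy usage with two candidate cluster centres (illustrative data only)
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 2))
X = np.array([[-1.0, 0.0], [1.0, 0.0]])
print(smoothed_l1_clustering_objective(X, A))
```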

An incremental piecewise linear classifier based on polyhedral conic separation

  • Authors: Ozturk, Gurkan, Bagirov, Adil, Kasimbeyli, Refail
  • Date: 2015
  • Type: Text, Journal article
  • Relation: Machine Learning Vol. 101, no. 1-3 (2015), p. 397-413
  • Relation: http://purl.org/au-research/grants/arc/DP140103213
  • Full Text: false
  • Reviewed:
  • Description: In this paper, a piecewise linear classifier based on polyhedral conic separation is developed. This classifier builds nonlinear boundaries between classes using polyhedral conic functions. Since the number of polyhedral conic functions separating classes is not known a priori, an incremental approach is proposed to build separating functions. These functions are found by minimizing an error function which is nonsmooth and nonconvex. A special procedure is proposed to generate starting points to minimize the error function and this procedure is based on the incremental approach. The discrete gradient method, which is a derivative-free method for nonsmooth optimization, is applied to minimize the error function starting from those points. The proposed classifier is applied to solve classification problems on 12 publicly available data sets and compared with some mainstream and piecewise linear classifiers. © 2014, The Author(s).
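For orientation only, the sketch below evaluates a polyhedral conic function in the form commonly used in the polyhedral conic separation literature, g(x) = w·(x - a) + xi * ||x - a||_1 - gamma, and classifies a point by whether it falls in the sublevel set of any of a list of such functions. The fitting step described in the abstract (minimizing the nonsmooth, nonconvex error function with the discrete gradient method) is not shown, and all names and toy values are illustrative.

```python
import numpy as np

def polyhedral_conic(x, w, xi, gamma, a):
    # polyhedral conic function with "vertex" a:
    #   g(x) = w.(x - a) + xi * ||x - a||_1 - gamma
    # its sublevel set {x : g(x) <= 0} is a polyhedron around a and is the
    # building block of the separating surfaces described in the abstract
    d = np.asarray(x) - np.asarray(a)
    return float(w @ d + xi * np.abs(d).sum() - gamma)

def classify(x, conics):
    # assign class 0 if x lies in the sublevel set of any of the incrementally
    # built conic functions, otherwise class 1; conics is a list of
    # (w, xi, gamma, a) tuples assumed to have been fitted already
    return 0 if any(polyhedral_conic(x, w, xi, g, a) <= 0 for w, xi, g, a in conics) else 1

# toy usage with one hand-picked conic function (illustrative only)
conics = [(np.array([0.0, 0.0]), 1.0, 1.0, np.array([0.0, 0.0]))]
print(classify(np.array([0.2, 0.3]), conics), classify(np.array([2.0, 2.0]), conics))
```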

Clustering in large data sets with the limited memory bundle method

  • Authors: Karmitsa, Napsu, Bagirov, Adil, Taheri, Sona
  • Date: 2018
  • Type: Text, Journal article
  • Relation: Pattern Recognition Vol. 83 (2018), p. 245-259
  • Relation: http://purl.org/au-research/grants/arc/DP140103213
  • Full Text: false
  • Reviewed:
  • Description: The aim of this paper is to design an algorithm based on nonsmooth optimization techniques to solve the minimum sum-of-squares clustering problems in very large data sets. First, the clustering problem is formulated as a nonsmooth optimization problem. Then the limited memory bundle method [Haarala et al., 2007] is modified and combined with an incremental approach to design a new clustering algorithm. The algorithm is evaluated using real-world data sets with both a large number of attributes and a large number of data points. It is also compared with some other optimization-based clustering algorithms. The numerical results demonstrate the efficiency of the proposed algorithm for clustering in very large data sets.
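To make the nonsmooth formulation mentioned in the abstract concrete, the sketch below writes out the standard minimum sum-of-squares clustering objective and the incremental idea of adding one centre at a time as a starting point for the next problem. This is a minimal sketch of the formulation only, not the modified limited memory bundle method itself; all names are illustrative.

```python
import numpy as np

def mssc_objective(centers, data):
    # nonsmooth minimum sum-of-squares clustering objective
    #   f(x_1..x_k) = (1/m) * sum_i min_j ||x_j - a_i||^2
    # the inner min over centres is what makes f nonsmooth and motivates
    # the bundle-type method described in the abstract
    sq = ((data[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)   # shape (m, k)
    return sq.min(axis=1).mean()

def incremental_start(centers, new_center):
    # incremental idea: keep the k-1 centres already found and append one more,
    # giving a starting point for re-optimizing the k-centre problem
    return np.vstack([centers, new_center[None, :]])
```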

A difference of convex optimization algorithm for piecewise linear regression

  • Authors: Bagirov, Adil, Taheri, Sona, Asadi, Soodabeh
  • Date: 2019
  • Type: Text, Journal article
  • Relation: Journal of Industrial and Management Optimization Vol. 15, no. 2 (2019), p. 909-932
  • Relation: http://purl.org/au-research/grants/arc/DP140103213
  • Full Text: false
  • Reviewed:
  • Description: The problem of finding a continuous piecewise linear function approximating a regression function is considered. This problem is formulated as a nonconvex nonsmooth optimization problem where the objective function is represented as a difference of convex (DC) functions. Subdifferentials of DC components are computed and an algorithm is designed based on these subdifferentials to find piecewise linear functions. The algorithm is tested using some synthetic and real world data sets and compared with other regression algorithms.
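As an illustration of the DC structure mentioned in the abstract, the sketch below uses one standard DC representation of a continuous piecewise linear function, a difference of two max-of-affine (convex) terms, together with the squared fitting error. The paper's own parameterization of the piecewise linear function and its subdifferential-based algorithm may differ; all names and the toy data are assumptions for the example.

```python
import numpy as np

def pwl_dc(x, P, q, R, s):
    # continuous piecewise linear model written as a difference of two
    # convex (max-of-affine) functions:
    #   f(x) = max_i (P[i].x + q[i]) - max_j (R[j].x + s[j])
    return np.max(P @ x + q) - np.max(R @ x + s)

def fit_error(P, q, R, s, X, y):
    # squared fitting error of the DC piecewise linear model on data (X, y);
    # both model terms are convex in the parameters, so the objective admits
    # the DC treatment described in the abstract
    preds = np.array([pwl_dc(x, P, q, R, s) for x in X])
    return float(((preds - y) ** 2).sum())

# toy usage: reproduce |t| on a 1-D grid with a two-piece model (illustrative)
X = np.linspace(-1, 1, 21).reshape(-1, 1)
y = np.abs(X).ravel()
P, q = np.array([[1.0], [-1.0]]), np.array([0.0, 0.0])   # max(t, -t) = |t|
R, s = np.array([[0.0]]), np.array([0.0])                # second term is the constant 0
print(fit_error(P, q, R, s, X, y))                        # ~0 for this exact model
```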
