Showing items 1 - 5 of 5

Your selections:

  • Nonsmooth optimization
  • 0906 Electrical and Electronic Engineering
Facets

Creator
  • Karmitsa, Napsu (2)
  • Taheri, Sona (2)
  • Al Nuaimat, A. (1)
  • Cimen, Emre (1)
  • Jin, L. (1)
  • Karasozen, Bulent (1)
  • Mirzayeva, Hijran (1)
  • Sultanova, Nargiz (1)
  • Tor, Ali (1)
  • Ugon, Julien (1)

Subject
  • 0102 Applied Mathematics (4)
  • 0103 Numerical and Computational Mathematics (4)
  • Nonconvex optimization (4)
  • Clusterwise linear regression (2)
  • 0801 Artificial Intelligence and Image Processing (1)
  • 0806 Information Systems (1)
  • Bundle method (1)
  • Bundle methods (1)
  • Cluster analysis (1)
  • Clusterwise linear regressions (1)
  • Codifferential (1)
  • DC optimization (1)
  • DC programming (1)
  • Difference of convex algorithms (1)
  • Discrete gradient method (1)
  • Functions (1)
  • GL Tham (1)
  • Incremental approach (1)

Incremental DC optimization algorithm for large-scale clusterwise linear regression

  • Authors: Bagirov, Adil; Taheri, Sona; Cimen, Emre
  • Date: 2021
  • Type: Text; Journal article
  • Relation: Journal of Computational and Applied Mathematics Vol. 389 (2021), p. 1-17
  • Relation: https://purl.org/au-research/grants/arc/DP190100580
  • Full Text: false
  • Reviewed:
  • Description: The objective function in the nonsmooth optimization model of the clusterwise linear regression (CLR) problem with the squared regression error is represented as a difference of two convex functions. Then, using the difference of convex algorithm (DCA) approach, the CLR problem is replaced by a sequence of smooth unconstrained optimization subproblems. A new algorithm based on the DCA and the incremental approach is designed to solve the CLR problem, and a quasi-Newton method is applied to solve the subproblems. The proposed algorithm is evaluated on several synthetic and real-world regression data sets and compared with other CLR algorithms. Results demonstrate that the DCA-based algorithm is efficient for solving CLR problems with a large number of data points and, in particular, outperforms the other algorithms when the number of input variables is small. © 2020 Elsevier B.V.
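
A minimal sketch of the DCA idea described above, in Python. Linearizing the concave part of the DC decomposition reduces each iteration to assigning every point to its current best hyperplane, then solving the resulting smooth subproblem with a quasi-Newton method (BFGS). All names (fit_clr, k, max_iter) are illustrative, and the paper's incremental seeding of new hyperplanes is omitted.

```python
# Hedged sketch of DCA iterations for clusterwise linear regression (CLR),
# assuming the squared-error objective f(W) = sum_i min_j (x_i.w_j - y_i)^2.
import numpy as np
from scipy.optimize import minimize

def fit_clr(X, y, k, max_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])       # affine hyperplanes
    W = 0.1 * rng.standard_normal((k, d + 1))  # k regression hyperplanes

    for _ in range(max_iter):
        # Linearizing the concave part of the DC decomposition amounts to
        # assigning each point to its currently best-fitting hyperplane.
        assign = np.argmin((Xb @ W.T - y[:, None]) ** 2, axis=1)

        # Smooth unconstrained subproblem, solved with a quasi-Newton
        # method (BFGS) as the abstract describes.
        def obj(w):
            P = Xb @ w.reshape(k, d + 1).T        # (n, k) predictions
            r = P[np.arange(n), assign] - y       # assigned residuals
            return np.sum(r ** 2)

        W_new = minimize(obj, W.ravel(), method="BFGS").x.reshape(k, d + 1)
        if np.allclose(W_new, W, atol=1e-8):
            break
        W = W_new
    return W, assign
```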

Subgradient Method for Nonconvex Nonsmooth Optimization

  • Authors: Bagirov, Adil; Jin, L.; Karmitsa, Napsu; Al Nuaimat, A.; Sultanova, Nargiz
  • Date: 2012
  • Type: Text; Journal article
  • Relation: Journal of Optimization Theory and Applications Vol. 157, no. 2 (2012), p. 416-435
  • Full Text: false
  • Reviewed:
  • Description: In this paper, we introduce a new method for solving nonconvex nonsmooth optimization problems. It uses quasisecants, which are subgradients computed in some neighborhood of a point. The proposed method contains simple procedures for finding descent directions and for solving line search subproblems. The convergence of the method is studied, and preliminary results of numerical experiments are presented, including a comparison of the proposed method with the subgradient and proximal bundle methods. © 2012 Springer Science+Business Media, LLC.
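
For contrast with the quasisecant approach above, here is a minimal sketch of the classical subgradient method the paper compares against, with a diminishing 1/t step size; the objective and its subgradient are a toy example, not from the paper.

```python
# Classical subgradient method with diminishing steps: the textbook
# baseline the paper compares against. The quasisecant method instead
# builds descent directions from subgradients collected near the point.
import numpy as np

def subgradient_method(f, subgrad, x0, iters=2000):
    x = np.asarray(x0, dtype=float)
    x_best, f_best = x.copy(), f(x)
    for t in range(1, iters + 1):
        x = x - (1.0 / t) * subgrad(x)        # nonsummable, diminishing step
        if f(x) < f_best:                     # subgradient steps need not
            x_best, f_best = x.copy(), f(x)   # descend: track the best point
    return x_best, f_best

# Toy nonsmooth convex example: f(x) = |x1| + 2|x2|, minimum at the origin.
f = lambda x: abs(x[0]) + 2.0 * abs(x[1])
sg = lambda x: np.array([np.sign(x[0]), 2.0 * np.sign(x[1])])
print(subgradient_method(f, sg, [3.0, -2.0]))
```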

Clustering in large data sets with the limited memory bundle method

  • Authors: Karmitsa, Napsu; Bagirov, Adil; Taheri, Sona
  • Date: 2018
  • Type: Text; Journal article
  • Relation: Pattern Recognition Vol. 83 (2018), p. 245-259
  • Relation: http://purl.org/au-research/grants/arc/DP140103213
  • Full Text: false
  • Reviewed:
  • Description: The aim of this paper is to design an algorithm based on nonsmooth optimization techniques to solve the minimum sum-of-squares clustering problem in very large data sets. First, the clustering problem is formulated as a nonsmooth optimization problem. Then the limited memory bundle method [Haarala et al., 2007] is modified and combined with an incremental approach to design a new clustering algorithm. The algorithm is evaluated using real-world data sets with both a large number of attributes and a large number of data points, and it is compared with other optimization-based clustering algorithms. The numerical results demonstrate the efficiency of the proposed algorithm for clustering in very large data sets.
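
A hedged sketch of the incremental scheme described above: the k-cluster problem is seeded from the (k-1)-cluster solution plus one new candidate centre. Plain Lloyd refinement stands in here for the paper's modified limited memory bundle method, and all names are illustrative.

```python
# Incremental minimum sum-of-squares clustering, sketched with Lloyd
# iterations standing in for the paper's limited memory bundle solver.
import numpy as np

def refine(X, C, iters=20):
    """Stand-in local solver: plain Lloyd (k-means) refinement."""
    for _ in range(iters):
        labels = ((X[:, None, :] - C[None]) ** 2).sum(-1).argmin(1)
        for j in range(C.shape[0]):
            if np.any(labels == j):
                C[j] = X[labels == j].mean(0)
    return C

def incremental_clustering(X, k):
    C = X.mean(0, keepdims=True)               # stage 1: single centroid
    for _ in range(2, k + 1):
        # Candidate for the new centre: the point worst served so far.
        dist = ((X[:, None, :] - C[None]) ** 2).sum(-1).min(1)
        C = refine(X, np.vstack([C, X[dist.argmax()]]))
    return C
```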

Aggregate codifferential method for nonsmooth DC optimization

  • Authors: Tor, Ali; Bagirov, Adil; Karasozen, Bulent
  • Date: 2014
  • Type: Text; Journal article
  • Relation: Journal of Computational and Applied Mathematics Vol. 259, Part B (2014), p. 851-867
  • Full Text: false
  • Reviewed:
  • Description: A new algorithm is developed, based on the concept of the codifferential, for minimizing differences of convex nonsmooth functions. Since computing the whole codifferential is not always possible, a fixed number of elements from the codifferential is used to compute the search directions. The convergence of the proposed algorithm is proved, and its efficiency is demonstrated by comparing it with the subgradient, truncated codifferential, and proximal bundle methods on nonsmooth optimization test problems.
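
A minimal sketch of the direction-finding step common to bundle-type methods like the one above: given a fixed number of codifferential-style elements, the search direction is the negative of the minimum-norm point of their convex hull. The quadratic subproblem is solved here with SciPy's SLSQP; how the elements are collected and updated is the part the paper's algorithm actually specifies, and is not shown.

```python
# Direction finding from a fixed bundle G of m codifferential-style
# elements: minimize ||sum_i lam_i g_i||^2 over the simplex, then take
# the negative of the minimizing combination as the search direction.
import numpy as np
from scipy.optimize import minimize

def min_norm_direction(G):
    """G: (m, d) array; returns a descent direction (zero => stationary)."""
    m = G.shape[0]
    obj = lambda lam: 0.5 * np.sum((lam @ G) ** 2)
    res = minimize(obj, np.full(m, 1.0 / m),
                   bounds=[(0.0, 1.0)] * m,
                   constraints={"type": "eq",
                                "fun": lambda lam: lam.sum() - 1.0},
                   method="SLSQP")
    return -(res.x @ G)

# Example: two opposing elements of |x| near 0 give a near-zero direction,
# signalling approximate stationarity at the kink.
print(min_norm_direction(np.array([[1.0], [-1.0]])))
```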

Nonsmooth optimization algorithm for solving clusterwise linear regression problems

  • Authors: Bagirov, Adil; Ugon, Julien; Mirzayeva, Hijran
  • Date: 2015
  • Type: Text; Journal article
  • Relation: Journal of Optimization Theory and Applications Vol. 164, no. 3 (2015), p. 755-780
  • Relation: http://purl.org/au-research/grants/arc/DP140103213
  • Full Text: false
  • Reviewed:
  • Description: Clusterwise linear regression consists of finding a number of linear regression functions, each approximating a subset of the data. In this paper, the clusterwise linear regression problem is formulated as a nonsmooth nonconvex optimization problem, and an algorithm based on an incremental approach and on the discrete gradient method of nonsmooth optimization is designed to solve it. This algorithm incrementally divides the whole data set into groups that can each be easily approximated by one linear regression function. A special procedure is introduced to generate good starting points for the global optimization problems solved at each iteration of the incremental algorithm. The algorithm is compared with the multi-start Späth and incremental algorithms on several publicly available data sets for regression analysis.
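
The discrete gradient at the heart of the method above is derivative-free: it approximates subgradient information using only function values. Below is a much-simplified stand-in (central finite differences rather than Bagirov's full construction); it conveys the derivative-free flavor, not the exact definition from the paper.

```python
# Simplified, derivative-free stand-in for a discrete gradient: central
# finite differences using function values only. The paper's discrete
# gradient has a more careful construction suited to nonsmooth f.
import numpy as np

def finite_difference_gradient(f, x, h=1e-6):
    x = np.asarray(x, dtype=float)
    g = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)  # central difference
    return g

# Example on a nonsmooth function, evaluated away from its kink.
f = lambda x: abs(x[0] - 1.0) + x[1] ** 2
print(finite_difference_gradient(f, [3.0, 0.5]))   # approx. [1.0, 1.0]
```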
