Your selections:

- Nonsmooth optimization (4)
- 0102 Applied Mathematics (3)
- 0103 Numerical and Computational Mathematics (3)
- 0801 Artificial Intelligence and Image Processing (3)
- Nonconvex optimization (3)
- Cluster analysis (2)
- 0806 Information Systems (1)
- 0899 Other Information and Computing Sciences (1)
- 1702 Cognitive Science (1)
- Bundle method (1)
- Bundle methods (1)
- Clusterwise linear regression (1)
- Codifferential (1)
- DC programming (1)
- Discrete gradient method (1)
- GL Tham (1)
- Greedy algorithm (1)
- Incremental clustering algorithms (1)
- L-2-boosting (1)

Aggregate codifferential method for nonsmooth DC optimization

- Tor, Ali, Bagirov, Adil, Karasozen, Bulent

**Authors:** Tor, Ali; Bagirov, Adil; Karasozen, Bulent
**Date:** 2014
**Type:** Text; Journal article
**Relation:** Journal of Computational and Applied Mathematics Vol. 259, Part B (2014), p. 851-867
**Full Text:** false
**Description:** A new algorithm is developed, based on the concept of the codifferential, for minimizing differences of convex nonsmooth functions. Since the computation of the whole codifferential is not always possible, we use a fixed number of elements from the codifferential to compute the search directions. The convergence of the proposed algorithm is proved. The efficiency of the algorithm is demonstrated by comparing it with the subgradient, truncated codifferential, and proximal bundle methods on nonsmooth optimization test problems.
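
As an illustration of the DC framework this abstract describes, below is a minimal sketch of the classical DCA iteration for minimizing f = g - h, not the authors' aggregate codifferential method: a subgradient of the convex part h linearizes the model, and the resulting convex subproblem is solved exactly. The example problem f(x) = x^2 - |x| and both helper functions are illustrative assumptions.

```python
import numpy as np

def dca(argmin_g_linear, h_subgrad, x0, max_iter=100, tol=1e-8):
    """Generic DCA for f = g - h: x_{k+1} = argmin_x g(x) - <y_k, x>,
    where y_k is a subgradient of the convex part h at x_k."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        y = h_subgrad(x)                 # y in the subdifferential of h at x
        x_new = argmin_g_linear(y)       # exact minimizer of the convex model
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Illustrative problem: f(x) = x^2 - |x|, minimizers at +/- 0.5.
# Here g(x) = x^2, so argmin_x x^2 - y*x = y/2; sign(x) is a subgradient of |x|.
x_star = dca(lambda y: y / 2.0, lambda x: np.sign(x), x0=np.array([3.0]))
print(x_star)   # -> [0.5]
```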

An L-2-Boosting Algorithm for Estimation of a Regression Function

- Bagirov, Adil, Clausen, Conny, Kohler, Michael

**Authors:** Bagirov, Adil; Clausen, Conny; Kohler, Michael
**Date:** 2010
**Type:** Text; Journal article
**Relation:** IEEE Transactions on Information Theory Vol. 56, no. 3 (2010), p. 1417-1429
**Description:** An L-2-boosting algorithm for estimation of a regression function from random design is presented. It consists of repeatedly fitting a function from a fixed nonlinear function space to the residuals of the data by least squares, and defining the estimate as a linear combination of the resulting least squares estimates. Splitting of the sample is used to decide after how many iterations of residual smoothing the algorithm terminates. The rate of convergence of the algorithm is analyzed in the case of an unbounded response variable. The method is used to fit a sum of maxima of minima of linear functions to a given data set, and is compared with other nonparametric regression estimates using simulated data.
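
For orientation, here is a minimal sketch of generic L2-boosting with regression stumps as the base learner; the paper's function space of maxima of minima of linear functions and its sample-splitting stopping rule are not reproduced, and the shrinkage factor, iteration count, and data are illustrative assumptions.

```python
import numpy as np

def fit_stump(x, r):
    """Least-squares regression stump on 1-D inputs: threshold + two means."""
    order = np.argsort(x)
    xs, rs = x[order], r[order]
    best = (np.inf, xs[0], rs.mean(), rs.mean())
    for i in range(1, len(xs)):
        left, right = rs[:i], rs[i:]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best[0]:
            best = (sse, 0.5 * (xs[i - 1] + xs[i]), left.mean(), right.mean())
    _, t, cl, cr = best
    return t, cl, cr

def l2_boost(x, y, n_iter=200, nu=0.1):
    """Repeatedly fit a stump to the residuals; the estimate is the
    linear combination (here: shrunken sum) of the least squares fits."""
    fit, stumps = np.zeros_like(y), []
    for _ in range(n_iter):
        t, cl, cr = fit_stump(x, y - fit)       # fit base learner to residuals
        fit += nu * np.where(x <= t, cl, cr)    # shrunken greedy update
        stumps.append((t, cl, cr))
    return stumps

def predict(stumps, x, nu=0.1):
    return sum(nu * np.where(x <= t, cl, cr) for t, cl, cr in stumps)

# Illustrative data: noisy sine curve.
rng = np.random.default_rng(0)
x = rng.uniform(0, 6, 300)
y = np.sin(x) + 0.2 * rng.standard_normal(300)
stumps = l2_boost(x, y)
print(np.mean((predict(stumps, x) - y) ** 2))   # training MSE
```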

Clustering in large data sets with the limited memory bundle method

- Karmitsa, Napsu, Bagirov, Adil, Taheri, Sona

**Authors:** Karmitsa, Napsu; Bagirov, Adil; Taheri, Sona
**Date:** 2018
**Type:** Text; Journal article
**Relation:** Pattern Recognition Vol. 83 (2018), p. 245-259
**Relation:** http://purl.org/au-research/grants/arc/DP140103213
**Full Text:** false
**Description:** The aim of this paper is to design an algorithm based on nonsmooth optimization techniques to solve minimum sum-of-squares clustering problems in very large data sets. First, the clustering problem is formulated as a nonsmooth optimization problem. Then the limited memory bundle method [Haarala et al., 2007] is modified and combined with an incremental approach to design a new clustering algorithm. The algorithm is evaluated on real-world data sets with large numbers of both attributes and data points, and compared with other optimization-based clustering algorithms. The numerical results demonstrate the efficiency of the proposed algorithm for clustering in very large data sets.
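
A rough sketch of the incremental strategy the abstract outlines, under simplifying assumptions: centers are added one at a time, each new center seeded at the data point that most reduces the minimum sum-of-squares objective, with plain Lloyd-style refinement standing in for the modified limited memory bundle solver used in the paper.

```python
import numpy as np

def mssc(centers, X):
    """Nonsmooth MSSC objective: mean squared distance to the nearest center."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return d2.min(axis=1).mean()

def lloyd_refine(centers, X, iters=20):
    """Standard alternating refinement of a fixed number of centers."""
    for _ in range(iters):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(axis=1)
        for j in range(len(centers)):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers

def incremental_mssc(X, k):
    centers = X.mean(axis=0, keepdims=True)          # 1-cluster solution
    for _ in range(2, k + 1):
        # Seed the new center at the data point giving the largest decrease.
        cand = min(X, key=lambda a: mssc(np.vstack([centers, a]), X))
        centers = lloyd_refine(np.vstack([centers, cand]), X)
    return centers

# Illustrative data: three well-separated blobs.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(m, 0.3, (50, 2)) for m in (0.0, 2.0, 4.0)])
print(mssc(incremental_mssc(X, 3), X))
```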

Nonsmooth DC programming approach to the minimum sum-of-squares clustering problems

- Bagirov, Adil, Taheri, Sona, Ugon, Julien

**Authors:** Bagirov, Adil; Taheri, Sona; Ugon, Julien
**Date:** 2016
**Type:** Text; Journal article
**Relation:** Pattern Recognition Vol. 53 (2016), p. 12-24
**Relation:** http://purl.org/au-research/grants/arc/DP140103213
**Full Text:** false
**Description:** This paper introduces an algorithm for solving minimum sum-of-squares clustering problems using their difference of convex representations. A nonsmooth nonconvex optimization formulation of the clustering problem is used to design the algorithm. Characterizations of critical points, stationary points in the sense of generalized gradients, and inf-stationary points of the clustering problem are given. The proposed algorithm is tested and compared with other clustering algorithms using large real-world data sets.
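
The difference of convex representation mentioned in the abstract can be checked numerically: with d_ij = ||c_j - a_i||^2, one has min_j d_ij = sum_j d_ij - max_j sum_{l != j} d_il, so the clustering objective splits as f = g - h with both parts convex in the centers. The sketch below verifies this identity on random data; it is an illustration, not the paper's algorithm.

```python
import numpy as np

def cluster_f(C, X):
    """MSSC objective: f(C) = (1/m) sum_i min_j ||c_j - a_i||^2."""
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)   # d2[i, j]
    return d2.min(axis=1).mean()

def cluster_g(C, X):
    """Convex part g(C) = (1/m) sum_i sum_j ||c_j - a_i||^2 (smooth)."""
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
    return d2.sum(axis=1).mean()

def cluster_h(C, X):
    """Convex part h(C) = (1/m) sum_i max_j sum_{l != j} ||c_l - a_i||^2
    (nonsmooth: a pointwise maximum of convex quadratics)."""
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
    return (d2.sum(axis=1, keepdims=True) - d2).max(axis=1).mean()

rng = np.random.default_rng(2)
X, C = rng.normal(size=(100, 2)), rng.normal(size=(3, 2))
assert np.isclose(cluster_f(C, X), cluster_g(C, X) - cluster_h(C, X))
```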

Nonsmooth optimization algorithm for solving clusterwise linear regression problems

- Bagirov, Adil, Ugon, Julien, Mirzayeva, Hijran

**Authors:** Bagirov, Adil; Ugon, Julien; Mirzayeva, Hijran
**Date:** 2015
**Type:** Text; Journal article
**Relation:** Journal of Optimization Theory and Applications Vol. 164, no. 3 (2015), p. 755-780
**Relation:** http://purl.org/au-research/grants/arc/DP140103213
**Full Text:** false
**Description:** Clusterwise linear regression consists of finding a number of linear regression functions, each approximating a subset of the data. In this paper, the clusterwise linear regression problem is formulated as a nonsmooth nonconvex optimization problem, and an algorithm based on an incremental approach and the discrete gradient method of nonsmooth optimization is designed to solve it. This algorithm incrementally divides the whole dataset into groups that can each be well approximated by one linear regression function. A special procedure is introduced to generate good starting points for the global optimization problems solved at each iteration of the incremental algorithm. The algorithm is compared with the multi-start Späth and incremental algorithms on several publicly available regression datasets.
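
For context, a minimal sketch of the Späth-style alternating scheme that the paper uses as a comparison baseline, not the authors' incremental discrete gradient algorithm: each point is assigned to the regression function that fits it best, and each function is then refit by least squares. The data and initialization are illustrative assumptions.

```python
import numpy as np

def clusterwise_lr(X, y, k, iters=50, seed=0):
    """Alternate point-to-function assignment and per-group least squares."""
    rng = np.random.default_rng(seed)
    A = np.column_stack([X, np.ones(len(X))])     # add intercept column
    W = rng.normal(size=(k, A.shape[1]))          # k coefficient vectors
    for _ in range(iters):
        resid2 = (A @ W.T - y[:, None]) ** 2      # squared residual per function
        labels = resid2.argmin(axis=1)            # best-fitting function per point
        for j in range(k):
            m = labels == j
            if m.sum() >= A.shape[1]:             # enough points to refit
                W[j] = np.linalg.lstsq(A[m], y[m], rcond=None)[0]
    return W, labels

# Illustrative data drawn from two hidden linear models.
rng = np.random.default_rng(3)
x = rng.uniform(-1, 1, (200, 1))
y = np.where(rng.random(200) < 0.5, 2 * x[:, 0] + 1, -3 * x[:, 0]) \
    + 0.05 * rng.standard_normal(200)
W, labels = clusterwise_lr(x, y, k=2)
print(W)   # approximately rows [2, 1] and [-3, 0]
```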

Subgradient Method for Nonconvex Nonsmooth Optimization

- Bagirov, Adil, Jin, L., Karmitsa, Napsu, Al Nuaimat, A., Sultanova, Nargiz

**Authors:** Bagirov, Adil; Jin, L.; Karmitsa, Napsu; Al Nuaimat, A.; Sultanova, Nargiz
**Date:** 2012
**Type:** Text; Journal article
**Relation:** Journal of Optimization Theory and Applications Vol. 157, no. 2 (2012), p. 416-435
**Full Text:** false
**Description:** In this paper, we introduce a new method for solving nonconvex nonsmooth optimization problems. It uses quasisecants, which are subgradients computed in some neighborhood of a point. The proposed method contains simple procedures for finding descent directions and for solving line search subproblems. The convergence of the method is studied, and preliminary numerical results are presented, including a comparison with the subgradient and proximal bundle methods.
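
For reference, a minimal sketch of the classical subgradient method with diminishing step sizes, one of the baselines the paper compares against; the quasisecant machinery is not reproduced, and the piecewise-linear test function (convex, for simplicity) is an illustrative assumption.

```python
import numpy as np

def subgradient_method(f, subgrad, x0, n_steps=5000):
    """Classical subgradient method with diminishing steps, tracking the
    best iterate (the objective need not decrease monotonically)."""
    x = np.asarray(x0, dtype=float)
    x_best, f_best = x.copy(), f(x)
    for k in range(1, n_steps + 1):
        x = x - (1.0 / k) * subgrad(x)     # diminishing, non-summable steps
        if f(x) < f_best:
            x_best, f_best = x.copy(), f(x)
    return x_best, f_best

# Illustrative nonsmooth test: f(x) = max_i <b_i, x>, with minimum 0 at the origin.
B = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
f = lambda x: np.max(B @ x)
subgrad = lambda x: B[np.argmax(B @ x)]    # gradient of an active linear piece
print(subgradient_method(f, subgrad, x0=[5.0, -4.0]))
```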
