Aggregate subgradient method for nonsmooth DC optimization
- Authors: Bagirov, Adil , Taheri, Sona , Joki, Kaisa , Karmitsa, Napsu , Mäkelä, Marko
- Date: 2021
- Type: Text , Journal article
- Relation: Optimization Letters Vol. 15, no. 1 (2021), p. 83-96
- Relation: http://purl.org/au-research/grants/arc/DP190100580
- Full Text:
- Reviewed:
- Description: The aggregate subgradient method is developed for solving unconstrained nonsmooth difference of convex (DC) optimization problems. The proposed method shares some similarities with both the subgradient and the bundle methods. Aggregate subgradients are defined as a convex combination of subgradients computed at null steps between two serious steps. At each iteration search directions are found using only two subgradients: the aggregate subgradient and a subgradient computed at the current null step. It is proved that the proposed method converges to a critical point of the DC optimization problem and also that the number of null steps between two serious steps is finite. The new method is tested using some academic test problems and compared with several other nonsmooth DC optimization solvers. © 2020, Springer-Verlag GmbH Germany, part of Springer Nature.
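The aggregation step described in this abstract lends itself to a compact sketch. The snippet below is our illustration only (the function name and the minimum-norm combination rule are assumptions, not taken from the paper): it combines the aggregate subgradient with the subgradient from the current null step by choosing the convex combination of smallest Euclidean norm, whose negative can serve as a search direction.

```python
import numpy as np

def aggregate_subgradient(g_agg, g_new):
    """Return the minimum-norm convex combination
    lam * g_agg + (1 - lam) * g_new with lam in [0, 1].
    Its negative is a natural search direction; a small norm
    suggests approximate criticality (illustrative sketch only)."""
    d = g_agg - g_new
    dd = d @ d
    if dd < 1e-16:                     # subgradients (almost) coincide
        return g_new.copy()
    lam = np.clip((g_new @ (g_new - g_agg)) / dd, 0.0, 1.0)
    return lam * g_agg + (1.0 - lam) * g_new
```

Because this one-dimensional problem has a closed-form solution, the per-iteration cost stays low compared with bundle methods that minimize over many stored subgradients, which is consistent with the two-subgradient design the abstract highlights.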
An augmented subgradient method for minimizing nonsmooth DC functions
- Authors: Bagirov, Adil , Hoseini Monjezi, Najmeh , Taheri, Sona
- Date: 2021
- Type: Text , Journal article
- Relation: Computational Optimization and Applications Vol. 80, no. 2 (2021), p. 411-438
- Relation: http://purl.org/au-research/grants/arc/DP190100580
- Full Text: false
- Reviewed:
- Description: A method, called an augmented subgradient method, is developed to solve unconstrained nonsmooth difference of convex (DC) optimization problems. At each iteration of this method search directions are found by using several subgradients of the first DC component and one subgradient of the second DC component of the objective function. The developed method applies an Armijo-type line search procedure to find the next iteration point. It is proved that the sequence of points generated by the method converges to a critical point of the unconstrained DC optimization problem. The performance of the method is demonstrated using academic test problems with nonsmooth DC objective functions and its performance is compared with that of two general nonsmooth optimization solvers and five solvers specifically designed for unconstrained DC optimization. Computational results show that the developed method is efficient and robust for solving nonsmooth DC optimization problems. © 2021, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
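The Armijo-type step selection mentioned in the abstract can be sketched generically. This is a textbook backtracking rule with parameter choices of our own; the paper's actual procedure may differ in its details.

```python
import numpy as np

def armijo_line_search(f, x, d, directional_derivative,
                       sigma=1e-4, beta=0.5, t_init=1.0, max_backtracks=30):
    """Backtracking Armijo rule: accept the first step length t whose
    actual decrease matches at least a fraction sigma of the decrease
    predicted by the directional derivative along d (assumed negative)."""
    fx = f(x)
    t = t_init
    for _ in range(max_backtracks):
        if f(x + t * d) <= fx + sigma * t * directional_derivative:
            return t
        t *= beta
    return 0.0   # no acceptable step found; treat as a null step
```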
Incremental DC optimization algorithm for large-scale clusterwise linear regression
- Authors: Bagirov, Adil , Taheri, Sona , Cimen, Emre
- Date: 2021
- Type: Text , Journal article
- Relation: Journal of Computational and Applied Mathematics Vol. 389, no. (2021), p. 1-17
- Relation: https://purl.org/au-research/grants/arc/DP190100580
- Full Text: false
- Reviewed:
- Description: The objective function in the nonsmooth optimization model of the clusterwise linear regression (CLR) problem with the squared regression error is represented as a difference of two convex functions. Then, using the difference of convex algorithm (DCA) approach, the CLR problem is replaced by a sequence of smooth unconstrained optimization subproblems. A new algorithm based on the DCA and the incremental approach is designed to solve the CLR problem. We apply the quasi-Newton method to solve the subproblems. The proposed algorithm is evaluated using several synthetic and real-world data sets for regression and compared with other algorithms for CLR. Results demonstrate that the DCA-based algorithm is efficient for solving CLR problems with a large number of data points and, in particular, outperforms other algorithms when the number of input variables is small. © 2020 Elsevier B.V.
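A bare-bones DCA loop of the kind the abstract describes fits in a few lines. This is a hedged sketch of generic DCA with smooth components, using SciPy's BFGS for the subproblems; the paper's incremental, cluster-by-cluster machinery is omitted.

```python
import numpy as np
from scipy.optimize import minimize

def dca(f1, grad_f1, grad_f2, x0, tol=1e-6, max_iter=200):
    """DCA for f = f1 - f2: linearize f2 at the current iterate and
    minimize the resulting convex model f1(x) - <g2, x> with a
    quasi-Newton method (sketch, not the paper's full algorithm)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g2 = grad_f2(x)                    # (sub)gradient of f2 at x
        res = minimize(lambda y: f1(y) - g2 @ y, x,
                       jac=lambda y: grad_f1(y) - g2, method="BFGS")
        if np.linalg.norm(res.x - x) <= tol:
            return res.x
        x = res.x
    return x
```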
An incremental nonsmooth optimization algorithm for clustering using L1 and L∞ norms
- Authors: Ordin, Burak , Bagirov, Adil , Mohebi, Ehsan
- Date: 2020
- Type: Text , Journal article
- Relation: Journal of Industrial and Management Optimization Vol. 16, no. 6 (2020), p. 2757-2779
- Relation: http://purl.org/au-research/grants/arc/DP190100580
- Full Text: false
- Reviewed:
- Description: An algorithm is developed for solving clustering problems with the similarity measure defined using the L1 and L∞ norms. It is based on an incremental approach and applies nonsmooth optimization methods to find cluster centers. Computational results on 12 data sets are reported and the proposed algorithm is compared with the X-means algorithm.
A difference of convex optimization algorithm for piecewise linear regression
- Authors: Bagirov, Adil , Taheri, Sona , Asadi, Soodabeh
- Date: 2019
- Type: Text , Journal article
- Relation: Journal of Industrial and Management Optimization Vol. 15, no. 2 (2019), p. 909-932
- Relation: http://purl.org/au-research/grants/arc/DP140103213
- Full Text: false
- Reviewed:
- Description: The problem of finding a continuous piecewise linear function approximating a regression function is considered. This problem is formulated as a nonconvex nonsmooth optimization problem where the objective function is represented as a difference of convex (DC) functions. Subdifferentials of DC components are computed and an algorithm is designed based on these subdifferentials to find piecewise linear functions. The algorithm is tested using some synthetic and real world data sets and compared with other regression algorithms.
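The objects fitted here are continuous piecewise linear functions in max-min form, which are automatically DC functions. Evaluating such a model is straightforward; the sketch below fixes an array convention of our own (the paper's contribution is fitting the coefficients by a DC algorithm, which is not shown).

```python
import numpy as np

def maxmin_pwl(x, A, b):
    """Evaluate f(x) = max_i min_j (A[i, j] @ x + b[i, j]), a continuous
    piecewise linear function. A has shape (I, J, n), b has shape (I, J)."""
    pieces = np.einsum('ijn,n->ij', A, x) + b   # all affine values at x
    return pieces.min(axis=1).max()
```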
Double bundle method for finding Clarke stationary points in nonsmooth DC programming
- Authors: Joki, Kaisa , Bagirov, Adil , Karmitsa, Napsu , Mäkelä, Marko , Taheri, Sona
- Date: 2018
- Type: Text , Journal article
- Relation: SIAM Journal on Optimization Vol. 28, no. 2 (2018), p. 1892-1919
- Relation: http://purl.org/au-research/grants/arc/DP140103213
- Full Text:
- Reviewed:
- Description: The aim of this paper is to introduce a new proximal double bundle method for unconstrained nonsmooth optimization, where the objective function is presented as a difference of two convex (DC) functions. The novelty in our method is a new escape procedure which enables us to guarantee approximate Clarke stationarity for solutions by utilizing the DC components of the objective function. This optimality condition is stronger than the criticality condition typically used in DC programming. Moreover, if a candidate solution is not approximate Clarke stationary, then the escape procedure returns a descent direction. With this escape procedure, we can avoid some shortcomings encountered when criticality is used. The finite termination of the double bundle method to an approximate Clarke stationary point is proved by assuming that the subdifferentials of DC components are polytopes. Finally, some encouraging numerical results are presented.
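The two optimality notions the abstract compares can be stated side by side; these are the standard definitions rather than a quotation from the paper. For f = f1 − f2, Clarke stationarity implies criticality, but a critical point need not be Clarke stationary, which is why the escape procedure is a genuine strengthening.

```latex
% Criticality of x* for f = f_1 - f_2 (the weaker condition):
\partial f_1(x^*) \cap \partial f_2(x^*) \neq \emptyset
% Clarke stationarity of x* (the stronger condition targeted here):
0 \in \partial_C f(x^*)
```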
Nonsmooth DC programming approach to clusterwise linear regression : Optimality conditions and algorithms
- Authors: Bagirov, Adil , Ugon, Julien
- Date: 2018
- Type: Text , Journal article
- Relation: Optimization Methods and Software Vol. 33, no. 1 (2018), p. 194-219
- Relation: http://purl.org/au-research/grants/arc/DP140103213
- Full Text: false
- Reviewed:
- Description: The clusterwise linear regression problem is formulated as a nonsmooth nonconvex optimization problem using the squared regression error function. The objective function in this problem is represented as a difference of convex functions. Optimality conditions are derived, and an algorithm is designed based on such a representation. An incremental approach is proposed to generate starting solutions. The algorithm is tested on small to large data sets. © 2017 Informa UK Limited, trading as Taylor & Francis Group.
A proximal bundle method for nonsmooth DC optimization utilizing nonconvex cutting planes
- Authors: Joki, Kaisa , Bagirov, Adil , Karmitsa, Napsu , Mäkelä, Marko
- Date: 2017
- Type: Text , Journal article
- Relation: Journal of Global Optimization Vol. 68, no. 3 (2017), p. 501-535
- Relation: http://purl.org/au-research/grants/arc/DP140103213
- Full Text: false
- Reviewed:
- Description: In this paper, we develop a version of the bundle method to solve unconstrained difference of convex (DC) programming problems. It is assumed that a DC representation of the objective function is available. Our main idea is to utilize subgradients of both the first and second components in the DC representation. This subgradient information is gathered from some neighborhood of the current iteration point and it is used to build separately an approximation for each component in the DC representation. By combining these approximations we obtain a new nonconvex cutting plane model of the original objective function, which takes into account explicitly both the convex and the concave behavior of the objective function. We design the proximal bundle method for DC programming based on this new approach and prove the convergence of the method to an ε-critical point. The algorithm is tested using some academic test problems and the preliminary numerical results show the good performance of the new bundle method. An interesting fact is that the new algorithm nearly always finds the global solution in our test problems.
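The model construction can be summarized in two lines, hedged as our paraphrase of the idea: each DC component gets its own piecewise affine cutting plane approximation built from bundle points y_j and component subgradients, and their difference is the nonconvex model.

```latex
% Cutting plane models of the DC components from bundle information
% \xi_j^1 \in \partial f_1(y_j), \; \xi_j^2 \in \partial f_2(y_j):
\hat{f}_1(x) = \max_{j} \bigl\{ f_1(y_j) + (\xi_j^1)^{\top} (x - y_j) \bigr\},
\qquad
\hat{f}_2(x) = \max_{j} \bigl\{ f_2(y_j) + (\xi_j^2)^{\top} (x - y_j) \bigr\},
\qquad
\hat{f} = \hat{f}_1 - \hat{f}_2 .
```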
A heuristic algorithm for solving the minimum sum-of-squares clustering problems
- Authors: Ordin, Burak , Bagirov, Adil
- Date: 2015
- Type: Text , Journal article
- Relation: Journal of Global Optimization Vol. 61, no. 2 (2015), p. 341-361
- Relation: http://purl.org/au-research/grants/arc/DP140103213
- Full Text: false
- Reviewed:
- Description: Clustering is an important task in data mining. It can be formulated as a global optimization problem which is challenging for existing global optimization techniques even in medium-size data sets. Various heuristics have been developed to solve the clustering problem. The global k-means and modified global k-means are among the most efficient heuristics for solving the minimum sum-of-squares clustering problem. However, these algorithms are not always accurate in finding global or near global solutions to the clustering problem. In this paper, we introduce a new algorithm to improve the accuracy of the modified global k-means algorithm in finding global solutions. We use an auxiliary cluster problem to generate a set of initial points and apply the k-means algorithm starting from these points to find the global solution to the clustering problems. Numerical results on 16 real-world data sets clearly demonstrate the superiority of the proposed algorithm over the global and modified global k-means algorithms in finding global solutions to clustering problems.
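The "good starting points" idea is easy to mimic with off-the-shelf tools: fix the k−1 centers found so far, try each candidate point as the k-th center, and polish with k-means. The sketch below uses scikit-learn (our choice of library); the generation of candidates via the auxiliary cluster problem, which is the paper's actual contribution, is assumed given.

```python
import numpy as np
from sklearn.cluster import KMeans

def best_kmeans_from_candidates(X, prev_centers, candidates):
    """Run k-means once from each candidate initialization
    (prev_centers plus one new point) and keep the lowest-inertia fit."""
    best = None
    for c in candidates:
        init = np.vstack([prev_centers, c[None, :]])
        km = KMeans(n_clusters=init.shape[0], init=init, n_init=1).fit(X)
        if best is None or km.inertia_ < best.inertia_:
            best = km
    return best.cluster_centers_, best.inertia_
```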
An incremental clustering algorithm based on hyperbolic smoothing
- Authors: Bagirov, Adil , Ordin, Burak , Ozturk, Gurkan , Xavier, Adilson
- Date: 2015
- Type: Text , Journal article
- Relation: Computational Optimization and Applications Vol. 61, no. 1 (2015), p. 219-241
- Relation: http://purl.org/au-research/grants/arc/DP140103213
- Full Text: false
- Reviewed:
- Description: Clustering is an important problem in data mining. It can be formulated as a nonsmooth, nonconvex optimization problem. For most global optimization techniques this problem is challenging even in medium-size data sets. In this paper, we propose an approach that allows one to apply local methods of smooth optimization to solve the clustering problems. We apply an incremental approach to generate starting points for cluster centers which enables us to deal with the nonconvexity of the problem. The hyperbolic smoothing technique is applied to handle the nonsmoothness of the clustering problems and to make it possible to apply smooth optimization algorithms to solve them. Results of numerical experiments with eleven real-world data sets and the comparison with state-of-the-art incremental clustering algorithms demonstrate that the smooth optimization algorithms in combination with the incremental approach are a powerful alternative to existing clustering algorithms.
Nonsmooth optimization algorithm for solving clusterwise linear regression problems
- Authors: Bagirov, Adil , Ugon, Julien , Mirzayeva, Hijran
- Date: 2015
- Type: Text , Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 164, no. 3 (2015), p. 755-780
- Relation: http://purl.org/au-research/grants/arc/DP140103213
- Full Text: false
- Reviewed:
- Description: Clusterwise linear regression consists of finding a number of linear regression functions each approximating a subset of the data. In this paper, the clusterwise linear regression problem is formulated as a nonsmooth nonconvex optimization problem and an algorithm based on an incremental approach and on the discrete gradient method of nonsmooth optimization is designed to solve it. This algorithm incrementally divides the whole dataset into groups which can be easily approximated by one linear regression function. A special procedure is introduced to generate good starting points for solving global optimization problems at each iteration of the incremental algorithm. The algorithm is compared with the multi-start Späth and the incremental algorithms on several publicly available datasets for regression analysis.
Aggregate codifferential method for nonsmooth DC optimization
- Authors: Tor, Ali , Bagirov, Adil , Karasozen, Bulent
- Date: 2014
- Type: Text , Journal article
- Relation: Journal of Computational and Applied Mathematics Vol. 259, no. Part B (2014), p. 851-867
- Full Text: false
- Reviewed:
- Description: A new algorithm is developed based on the concept of codifferential for minimizing the difference of convex nonsmooth functions. Since the computation of the whole codifferential is not always possible, we use a fixed number of elements from the codifferential to compute the search directions. The convergence of the proposed algorithm is proved. The efficiency of the algorithm is demonstrated by comparing it with the subgradient, the truncated codifferential and the proximal bundle methods using nonsmooth optimization test problems.
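For readers unfamiliar with codifferentials: in Demyanov-Rubinov calculus a codifferential supplies a uniform first-order approximation of f(x + d) as f(x) plus a max over the hypodifferential plus a min over the hyperdifferential. The display below records the standard construction for a DC function built from smooth pieces (textbook material, not quoted from the paper):

```latex
% For f_1(x) = \max_j \varphi_j(x) with smooth \varphi_j, the
% hypodifferential is the polytope
\underline{d} f_1(x) = \operatorname{co}\bigl\{ \bigl(\varphi_j(x) - f_1(x),\,
\nabla \varphi_j(x)\bigr) : j \bigr\},
% and for f = f_1 - f_2 one may take the codifferential pair
Df(x) = \bigl[\, \underline{d} f_1(x),\; -\underline{d} f_2(x) \,\bigr],
\qquad
f(x+d) \approx f(x)
  + \max_{(a,v) \in \underline{d} f_1(x)} \bigl( a + v^{\top} d \bigr)
  + \min_{(b,w) \in -\underline{d} f_2(x)} \bigl( b + w^{\top} d \bigr).
```

Keeping only a fixed number of such pairs, as the abstract states, is what bounds the work per iteration when the whole codifferential is unavailable.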
An algorithm for clusterwise linear regression based on smoothing techniques
- Authors: Bagirov, Adil , Ugon, Julien , Mirzayeva, Hijran
- Date: 2014
- Type: Text , Journal article
- Relation: Optimization Letters Vol. 9, no. 2 (2014), p. 375-390
- Full Text: false
- Reviewed:
- Description: We propose an algorithm based on an incremental approach and smoothing techniques to solve clusterwise linear regression (CLR) problems. This algorithm incrementally divides the whole data set into groups which can be easily approximated by one linear regression function. A special procedure is introduced to generate an initial solution for solving global optimization problems at each iteration of the incremental algorithm. Such an approach allows one to find global or approximate global solutions to the CLR problems. The algorithm is tested using several data sets for regression analysis and compared with the multistart and incremental Späth algorithms.
Hyperbolic smoothing function method for minimax problems
- Authors: Bagirov, Adil , Al Nuaimat, Alia , Sultanova, Nargiz
- Date: 2013
- Type: Text , Journal article
- Relation: Optimization Vol. 62, no. 6 (2013), p. 759-782
- Full Text: false
- Reviewed:
- Description: In this article, an approach for solving finite minimax problems is proposed. This approach is based on the use of hyperbolic smoothing functions. In order to apply the hyperbolic smoothing we reformulate the objective function in the minimax problem and study the relationship between the original minimax and reformulated problems. We also study the main properties of the hyperbolic smoothing function. Based on these results an algorithm for solving the finite minimax problem is proposed and this algorithm is implemented in the General Algebraic Modelling System (GAMS). We present preliminary results of numerical experiments with well-known nonsmooth optimization test problems. We also compare the proposed algorithm with the algorithm that uses the exponential smoothing function as well as with the algorithm based on a nonlinear programming reformulation of the finite minimax problem. © 2013 Copyright Taylor and Francis Group, LLC.
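The hyperbolic smoothing function at the heart of this approach has a simple closed form. The sketch below shows it, together with one of several possible ways to use it for a finite max: fold pairwise via max(a, b) = a + max(0, b - a). The folding scheme is our illustration, not necessarily the paper's reformulation.

```python
import numpy as np

def hyp_plus(t, tau):
    """Hyperbolic smoothing of max(0, t): smooth and convex, and it
    overestimates max(0, t) by at most tau / 2."""
    return 0.5 * (t + np.sqrt(t * t + tau * tau))

def smooth_max(values, tau):
    """Smooth surrogate for max(values) via repeated use of
    max(a, b) = a + max(0, b - a)  (illustration only)."""
    m = values[0]
    for v in values[1:]:
        m = m + hyp_plus(v - m, tau)
    return m
```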
Subgradient Method for Nonconvex Nonsmooth Optimization
- Authors: Bagirov, Adil , Jin, L. , Karmitsa, Napsu , Al Nuaimat, A. , Sultanova, Nargiz
- Date: 2012
- Type: Text , Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 157, no. 2 (2012), p. 416-435
- Full Text: false
- Reviewed:
- Description: In this paper, we introduce a new method for solving nonconvex nonsmooth optimization problems. It uses quasisecants, which are subgradients computed in some neighborhood of a point. The proposed method contains simple procedures for finding descent directions and for solving line search subproblems. The convergence of the method is studied and preliminary results of numerical experiments are presented. The comparison of the proposed method with the subgradient and the proximal bundle methods is demonstrated using results of numerical experiments. © 2012 Springer Science+Business Media, LLC.
Codifferential method for minimizing nonsmooth DC functions
- Authors: Bagirov, Adil , Ugon, Julien
- Date: 2011
- Type: Text , Journal article
- Relation: Journal of Global Optimization Vol. 50, no. 1 (2011), p. 3-22
- Relation: http://purl.org/au-research/grants/arc/DP0666061
- Full Text: false
- Reviewed:
- Description: In this paper, a new algorithm to locally minimize nonsmooth functions represented as a difference of two convex functions (DC functions) is proposed. The algorithm is based on the concept of codifferential. It is assumed that a DC decomposition of the objective function is known a priori. We develop an algorithm to compute descent directions using a few elements of the codifferential. The convergence of the minimization algorithm is studied and its comparison with different versions of the bundle methods using results of numerical experiments is given. © 2010 Springer Science+Business Media, LLC.
A necessary optimality condition for free knots linear splines: Special cases
- Authors: Sukhorukova, Nadezda
- Date: 2010
- Type: Text , Journal article
- Relation: Pacific Journal of Optimization Vol. 6, no. 2, Suppl. 1 (2010), p. 305-317
- Full Text: false
- Description: In this paper, we study the problem of best Chebyshev approximation by linear splines. We construct linear splines as a max-min of linear functions. Then we apply nonsmooth optimisation techniques to analyse and solve the corresponding optimisation problems. This approach allows us to identify and introduce a new important property of linear spline knots (regular and irregular). Using this property, we derive a necessary optimality condition for the case of regular knots. This condition is stronger than those existing in the literature. We also present a numerical example which demonstrates the difference between the old and the new optimality conditions.
A quasisecant method for minimizing nonsmooth functions
- Authors: Bagirov, Adil , Ganjehlou, Asef Nazari
- Date: 2010
- Type: Text , Journal article
- Relation: Optimization Methods and Software Vol. 25, no. 1 (2010), p. 3-18
- Relation: http://purl.org/au-research/grants/arc/DP0666061
- Full Text: false
- Reviewed:
- Description: We present an algorithm to locally minimize nonsmooth, nonconvex functions. In order to find descent directions, the notion of quasisecants, introduced in this paper, is applied. We prove that the algorithm converges to Clarke stationary points. Numerical results are presented demonstrating the applicability of the proposed algorithm to a wide variety of nonsmooth, nonconvex optimization problems. We also compare the proposed algorithm with the bundle method using numerical results.
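A quasisecant of f at x with direction g and length h > 0 is a vector v satisfying f(x + hg) − f(x) ≤ h⟨v, g⟩. For convex f, any subgradient at the shifted point x + hg qualifies, which gives a cheap concrete example. The toy check below (our example, for f(y) = |y|, not from the paper) verifies the defining inequality.

```python
import numpy as np

f = abs                                    # f(y) = |y|, convex
x, g, h = 0.5, -1.0, 2.0                   # point, unit direction, length
y = x + h * g                              # shifted point
v = np.sign(y) if y != 0 else 1.0          # a subgradient of |.| at y
# Defining inequality of a quasisecant: f(x + h g) - f(x) <= h * <v, g>
assert f(x + h * g) - f(x) <= h * v * g + 1e-12
```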
Vallée Poussin theorem and Remez algorithm in the case of generalised degree polynomial spline approximation
- Authors: Sukhorukova, Nadezda
- Date: 2010
- Type: Text , Journal article
- Relation: Pacific Journal of Optimization Vol. 6, no. 1 (2010), p. 103-114
- Full Text: false
- Description: The classical Remez algorithm was developed for constructing the best polynomial approximations for continuous and discrete functions on an interval. In this paper the classical Remez algorithm is generalised to the problem of polynomial spline (piece-wise polynomial) approximation with the spline defect equal to the spline degree. Also, the values of the splines at the end points of the approximation interval may be fixed. Polynomial splines combine the simplicity of polynomials with flexibility, which allows one to significantly decrease the degree of the corresponding polynomials and the oscillations of deviation functions. Therefore polynomial splines are a powerful tool for function and data approximation. The generalisation of the Remez algorithm developed in this research has been tested on several approximation problems. The results of the numerical experiments are presented.
Discrete gradient method : Derivative-free method for nonsmooth optimization
- Authors: Bagirov, Adil , Karasozen, Bulent , Sezer, Monsalve
- Date: 2008
- Type: Text , Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 137, no. 2 (2008), p. 317-334
- Relation: http://purl.org/au-research/grants/arc/DP0666061
- Full Text:
- Reviewed:
- Description: A new derivative-free method is developed for solving unconstrained nonsmooth optimization problems. This method is based on the notion of a discrete gradient. It is demonstrated that the discrete gradients can be used to approximate subgradients of a broad class of nonsmooth functions. It is also shown that the discrete gradients can be applied to find descent directions of nonsmooth functions. The preliminary results of numerical experiments with unconstrained nonsmooth optimization problems as well as the comparison of the proposed method with the nonsmooth optimization solver DNLP from CONOPT-GAMS and the derivative-free optimization solver CONDOR are presented. © 2007 Springer Science+Business Media, LLC.
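The discrete gradient itself is a more elaborate finite-difference object: it perturbs the point along a chosen direction and treats one coordinate specially so that the resulting approximation supports descent. As a rough, hedged illustration of the derivative-free flavour only, a plain forward-difference estimate looks like this:

```python
import numpy as np

def forward_difference_gradient(f, x, h=1e-6):
    """Naive forward-difference gradient estimate at x. The discrete
    gradient of the paper is a more careful construction with proven
    links to subgradients of nonsmooth functions; this sketch only
    illustrates that no analytic derivatives are required."""
    fx = f(x)
    g = np.empty_like(x, dtype=float)
    for i in range(x.size):
        e = np.zeros_like(x, dtype=float)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g
```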