Solving DC programs using the cutting angle method
- Authors: Ferrer, Albert, Bagirov, Adil, Beliakov, Gleb
- Date: 2015
- Type: Text, Journal article
- Relation: Journal of Global Optimization Vol. 61, no. 1 (2015), p. 71-89
- Relation: http://purl.org/au-research/grants/arc/DP140103213
- Full Text: false
- Reviewed:
- Description: In this paper, we propose a new algorithm for the global minimization of functions represented as a difference of two convex functions. The proposed method is derivative-free and is designed by adapting the extended cutting angle method. We present preliminary results of numerical experiments on test problems with difference-of-convex objective functions and box constraints. We also compare the proposed algorithm with a classical one that uses prismatical subdivisions.
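As a hedged illustration of the problem class this paper targets (not the extended cutting angle method itself), a DC objective f = g − h with convex g and h can be minimized over a box by a naive derivative-free search; the function below and its decomposition are invented for this sketch:

```python
# Hedged sketch: a DC (difference-of-convex) objective f = g - h,
# minimized over a box by naive derivative-free grid search.
# This is NOT the extended cutting angle method from the paper,
# only an illustration of the problem class it addresses.

def g(x):          # convex component
    return x ** 4

def h(x):          # convex component subtracted from g
    return 8 * x ** 2

def f(x):          # DC objective: nonconvex, global minima at x = -2 and x = 2
    return g(x) - h(x)

def grid_min(fun, lo, hi, n=6001):
    """Derivative-free global search over the box [lo, hi] on a uniform grid."""
    best_x, best_f = lo, fun(lo)
    for k in range(1, n):
        x = lo + (hi - lo) * k / (n - 1)
        fx = fun(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

x_star, f_star = grid_min(f, -3.0, 3.0)
```

Grid search scales poorly with dimension, which is precisely why methods such as the cutting angle method, which build tighter underestimates of the objective, are of interest.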
Global optimization: Cutting angle method
- Authors: Bagirov, Adil, Beliakov, Gleb
- Date: 2009
- Type: Text, Book chapter
- Relation: Encyclopedia of Optimization, p. 1304-1311
- Full Text: false
Special issue of Optimization, dedicated to the development of the ideas of the late Prof. A. Rubinov
- Authors: Bagirov, Adil, Beliakov, Gleb
- Date: 2009
- Type: Text, Journal article
- Relation: Optimization Vol. 58, no. 5 (2009), p. 479-481
- Full Text: false
- Reviewed:
Non-smooth optimization methods for computation of the conditional value-at-risk and portfolio optimization
- Authors: Beliakov, Gleb, Bagirov, Adil
- Date: 2006
- Type: Text, Journal article
- Relation: Optimization Vol. 55, no. 5-6 (2006), p. 459-479
- Full Text:
- Reviewed:
- Description: We examine the numerical performance of various methods for calculating the Conditional Value-at-Risk (CVaR), and for portfolio optimization with respect to this risk measure. We concentrate on the method proposed by Rockafellar and Uryasev (Rockafellar, R.T. and Uryasev, S., 2000, Optimization of conditional value-at-risk. Journal of Risk, 2, 21-41), which converts this problem to one of convex optimization. We compare the use of linear programming techniques against the discrete gradient method of non-smooth optimization, and establish the superiority of the latter. We show that non-smooth optimization can be used efficiently for large portfolio optimization problems, and also examine parallel execution of this method on computer clusters.
- Description: C1
- Description: 2003002156
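As a hedged aside on the Rockafellar–Uryasev formulation cited above: CVaR_a(L) = min_t { t + E[(L − t)_+] / (1 − a) }, and for an empirical sample the minimizing t is the a-quantile (VaR), so CVaR reduces to the mean of the worst (1 − a) tail. A minimal sample-based estimator along those lines (not the paper's LP or discrete gradient solver, and using one common quantile-index convention among several) might look like:

```python
# Hedged sketch of an empirical Conditional Value-at-Risk estimator in the
# spirit of the Rockafellar-Uryasev formulation
#   CVaR_a(L) = min_t { t + E[(L - t)_+] / (1 - a) }.
# For a discrete sample the minimizing t is the a-quantile (the VaR), and
# CVaR is the average of the losses at or beyond it.

def cvar(losses, alpha):
    """Empirical CVaR: mean of the worst (1 - alpha) tail of the losses."""
    xs = sorted(losses)
    n = len(xs)
    k = min(int(alpha * n), n - 1)  # index of the alpha-quantile (VaR)
    tail = xs[k:]                   # worst (1 - alpha) fraction of the sample
    return sum(tail) / len(tail)

# Example: losses 1..100, alpha = 0.95 -> mean of the five worst losses.
tail_risk = cvar(list(range(1, 101)), 0.95)
```

Portfolio optimization then minimizes this quantity over portfolio weights, which is where the convex reformulation and the non-smooth solvers compared in the paper come in.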
Parallelization of the discrete gradient method of non-smooth optimization and its applications
- Authors: Beliakov, Gleb, Tobon, Monsalve, Bagirov, Adil
- Date: 2003
- Type: Text, Conference paper
- Relation: Paper presented at Computational Science ICCS 2003 Conference, Melbourne: 2 June 2003
- Full Text: false
- Reviewed:
- Description: We investigate the parallelization and performance of the discrete gradient method of non-smooth optimization. This derivative-free method is shown to be an effective optimization tool, able to skip many shallow local minima of non-convex, non-differentiable objective functions. Although this is a sequential iterative method, we were able to parallelize critical steps of the algorithm, and this led to a significant improvement in performance on multiprocessor computer clusters. We applied the method to a difficult polyatomic cluster problem in computational chemistry, and found it to outperform other algorithms.
- Description: E1
- Description: 2003000435
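The parallelizable structure exploited in this paper comes from the fact that the per-coordinate function evaluations inside a derivative-free gradient estimate are mutually independent. A hedged sketch of that structure (a plain forward-difference approximation, not Bagirov's discrete gradient construction itself):

```python
# Hedged sketch: the per-coordinate function evaluations of a
# finite-difference gradient estimate are independent, which is the kind
# of structure that parallelizes well on a cluster. This uses a plain
# forward difference, not Bagirov's discrete gradient construction.

from concurrent.futures import ThreadPoolExecutor

def approx_gradient(f, x, h=1e-6):
    """Forward-difference gradient; coordinate evaluations are dispatched
    concurrently to a thread pool."""
    fx = f(x)

    def partial(i):
        xh = list(x)
        xh[i] += h
        return (f(xh) - fx) / h

    with ThreadPoolExecutor() as pool:
        return list(pool.map(partial, range(len(x))))

# Gradient of sum(x_i^2) at (1, 2, 3) is approximately (2, 4, 6).
grad = approx_gradient(lambda v: sum(t * t for t in v), [1.0, 2.0, 3.0])
```

For cheap pure-Python objectives, threads mainly illustrate the dispatch pattern; real speedups of the kind reported in the paper require expensive objective evaluations or process- or cluster-level parallelism.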