Implementation of novel methods of global and nonsmooth optimization : GANSO programming library
- Authors: Beliakov, Gleb , Ugon, Julien
- Date: 2007
- Type: Text , Journal article
- Relation: Optimization Vol. 56, no. 5-6 (2007), p. 543-546
- Full Text:
- Reviewed:
- Description: We discuss the implementation of a number of modern methods of global and nonsmooth continuous optimization, based on the ideas of Rubinov, in the programming library GANSO. GANSO implements the derivative-free bundle method, the extended cutting angle method, dynamical system-based optimization and their various combinations and heuristics. We outline the main ideas behind each method and report on interfacing with the Matlab and Maple packages.
Optimization solvers and problem formulations for solving data clustering problems
- Authors: Ugon, Julien
- Date: 2007
- Type: Text , Journal article
- Relation: Pacific Journal of Optimization Vol. 3, no. 2 (2007), p. 387-397
- Full Text: false
- Reviewed:
- Description: A popular approach for solving complex optimization problems is through relaxation: some constraints are removed in order to obtain a convex problem approximating the original one. On the other hand, direct approaches for solving such problems are becoming increasingly powerful. This paper examines two cases drawn from data analysis in order to compare the two techniques.
Codifferential method for minimizing nonsmooth DC functions
- Authors: Bagirov, Adil , Ugon, Julien
- Date: 2011
- Type: Text , Journal article
- Relation: Journal of Global Optimization Vol. 50, no. 1 (2011), p. 3-22
- Relation: http://purl.org/au-research/grants/arc/DP0666061
- Full Text: false
- Reviewed:
- Description: In this paper, a new algorithm to locally minimize nonsmooth functions represented as a difference of two convex functions (DC functions) is proposed. The algorithm is based on the concept of a codifferential. It is assumed that a DC decomposition of the objective function is known a priori. We develop an algorithm to compute descent directions using only a few elements of the codifferential. The convergence of the minimization algorithm is studied, and it is compared with different versions of bundle methods using the results of numerical experiments. © 2010 Springer Science+Business Media, LLC.
An algorithm for clusterwise linear regression based on smoothing techniques
- Authors: Bagirov, Adil , Ugon, Julien , Mirzayeva, Hijran
- Date: 2014
- Type: Text , Journal article
- Relation: Optimization Letters Vol. 9, no. 2 (2014), p. 375-390
- Full Text: false
- Reviewed:
- Description: We propose an algorithm based on an incremental approach and smoothing techniques to solve clusterwise linear regression (CLR) problems. This algorithm incrementally divides the whole data set into groups which can be easily approximated by one linear regression function. A special procedure is introduced to generate an initial solution for solving global optimization problems at each iteration of the incremental algorithm. Such an approach allows one to find global or approximate global solutions to the CLR problems. The algorithm is tested using several data sets for regression analysis and compared with the multistart and incremental Spath algorithms.
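The incremental smoothing algorithm itself is not reproduced here, but the clusterwise model it targets, where each data point is approximated by the best of k regression lines, can be sketched with a naive Späth-style alternating scheme. This is an illustration under our own assumptions (function and variable names are ours), not the paper's algorithm:

```python
import numpy as np

def clr_fit(X, y, k=2, iters=20, seed=0):
    """Naive Spath-style alternating scheme for clusterwise linear
    regression: assign each point to its best-fitting line, then refit.
    Illustrative only -- not the incremental smoothing algorithm of the paper."""
    rng = np.random.default_rng(seed)
    n = len(y)
    Xb = np.column_stack([X, np.ones(n)])        # add intercept column
    labels = rng.integers(0, k, n)               # random initial partition
    coefs = np.zeros((k, Xb.shape[1]))
    for _ in range(iters):
        for j in range(k):                       # refit each group's line
            m = labels == j
            if m.sum() >= Xb.shape[1]:
                coefs[j], *_ = np.linalg.lstsq(Xb[m], y[m], rcond=None)
        resid = (Xb @ coefs.T - y[:, None]) ** 2  # squared error per line
        labels = resid.argmin(axis=1)            # reassign each point
    return coefs, labels, resid.min(axis=1).sum()
```

Like the multistart Späth scheme the paper compares against, this alternation only finds a local solution; the paper's incremental approach is designed to escape such local minima.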
Characterization theorem for best polynomial spline approximation with free knots, variable degree and fixed tails
- Authors: Crouzeix, Jean-Pierre , Sukhorukova, Nadezda , Ugon, Julien
- Date: 2017
- Type: Text , Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 172, no. 3 (2017), p. 950-964
- Full Text:
- Reviewed:
- Description: In this paper, we derive a necessary condition for best approximation by piecewise polynomial functions whose degree may vary from one interval to another. Based on these results, we obtain a characterization theorem for polynomial splines with fixed tails, that is, splines whose value is fixed at one or more (external or internal) knots. We apply nonsmooth nonconvex analysis to obtain this result, which is also a necessary and sufficient condition for inf-stationarity in the sense of Demyanov-Rubinov. This paper extends an earlier paper where similar conditions were obtained for free-tail splines. The main results of this paper are essential for the development of a Remez-type algorithm for free knot spline approximation.
Global optimality conditions and optimization methods for polynomial programming problems
- Authors: Wu, Zhiyou , Tian, Jing , Ugon, Julien
- Date: 2015
- Type: Text , Journal article
- Relation: Journal of Global Optimization Vol. 62, no. 4 (2015), p. 617-641
- Full Text: false
- Reviewed:
- Description: This paper is concerned with the general polynomial programming problem with box constraints, including global optimality conditions and optimization methods. First, a necessary global optimality condition for a general polynomial programming problem with box constraints is given. Then we design a local optimization method that uses the necessary global optimality condition to obtain strongly or ε-strongly local minimizers which substantially improve some KKT points. Finally, a global optimization method is designed by combining the new local optimization method with an auxiliary function. Numerical examples show that our methods are efficient and stable.
On the reconstruction of polytopes
- Authors: Doolittle, Joseph , Nevo, Eran , Pineda-Villavicencio, Guillermo , Ugon, Julien , Yost, David
- Date: 2019
- Type: Text , Journal article
- Relation: Discrete and Computational Geometry Vol. 61, no. 2 (2019), p. 285-302
- Relation: http://purl.org/au-research/grants/arc/DP180100602
- Full Text:
- Reviewed:
- Description: Blind and Mani, and later Kalai, showed that the face lattice of a simple polytope is determined by its graph, namely its 1-skeleton. Call a vertex of a d-polytope nonsimple if the number of edges incident to it is more than d. We show that (1) the face lattice of any d-polytope with at most two nonsimple vertices is determined by its 1-skeleton; (2) the face lattice of any d-polytope with at most d-2 nonsimple vertices is determined by its 2-skeleton; and (3) for any d > 3 there are two d-polytopes with d-1 nonsimple vertices, isomorphic (d-3)-skeleta and nonisomorphic face lattices. In particular, the result (1) is best possible for 4-polytopes. © 2018, Springer Science+Business Media, LLC, part of Springer Nature.
Global optimality conditions and optimization methods for constrained polynomial programming problems
- Authors: Wu, Zhiyou , Tian, Jing , Ugon, Julien , Zhang, Liang
- Date: 2015
- Type: Text , Journal article
- Relation: Applied Mathematics and Computation Vol. 262 (2015), p. 312-325
- Full Text:
- Reviewed:
- Description: The general constrained polynomial programming problem (GPP) is considered in this paper. Problem (GPP) has a broad range of applications and is proved to be NP-hard. Necessary global optimality conditions for problem (GPP) are established. Then, a new local optimization method for this problem is proposed by exploiting these necessary global optimality conditions. A global optimization method is proposed for this problem by combining this local optimization method together with an auxiliary function. Some numerical examples are also given to illustrate that these approaches are very efficient. (C) 2015 Elsevier Inc. All rights reserved.
Optimality conditions and optimization methods for quartic polynomial optimization
- Authors: Wu, Zhiyou , Tian, Jing , Quan, Jing , Ugon, Julien
- Date: 2014
- Type: Text , Journal article
- Relation: Applied Mathematics and Computation Vol. 232 (2014), p. 968-982
- Full Text: false
- Reviewed:
- Description: In this paper, the multivariate quartic polynomial optimization problem (QPOP) is considered. Quartic optimization problems arise in various practical applications and are proved to be NP-hard. We discuss necessary global optimality conditions for problem (QPOP). We then present a new (strongly or ε-strongly) local optimization method based on the necessary global optimality conditions, which may escape and improve some KKT points. Finally, we design a global optimization method for problem (QPOP) by combining the new (strongly or ε-strongly) local optimization method and an auxiliary function. Numerical examples show that our algorithms are efficient and stable.
Chebyshev approximation by linear combinations of fixed knot polynomial splines with weighting functions
- Authors: Sukhorukova, Nadezda , Ugon, Julien
- Date: 2016
- Type: Text , Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 171, no. 2 (2016), p. 536-549
- Full Text: false
- Reviewed:
- Description: In this paper, we derive conditions for best uniform approximation by fixed knot polynomial splines with weighting functions. The theory of Chebyshev approximation for fixed knot polynomial splines is very elegant and complete. Necessary and sufficient optimality conditions have been developed, leading to efficient algorithms for constructing optimal spline approximations. The optimality conditions are based on the notion of alternance (maximal deviation points with alternating deviation signs). In this paper, we extend these results to the case when the model function is a product of a fixed knot polynomial spline (whose parameters are subject to optimization) and other functions (whose parameters are predefined). This problem is nonsmooth, and therefore we make use of convex and nonsmooth analysis to solve it.
Nonsmooth DC programming approach to clusterwise linear regression : Optimality conditions and algorithms
- Authors: Bagirov, Adil , Ugon, Julien
- Date: 2018
- Type: Text , Journal article
- Relation: Optimization Methods and Software Vol. 33, no. 1 (2018), p. 194-219
- Relation: http://purl.org/au-research/grants/arc/DP140103213
- Full Text: false
- Reviewed:
- Description: The clusterwise linear regression problem is formulated as a nonsmooth nonconvex optimization problem using the squared regression error function. The objective function in this problem is represented as a difference of convex functions. Optimality conditions are derived, and an algorithm is designed based on such a representation. An incremental approach is proposed to generate starting solutions. The algorithm is tested on small to large data sets. © 2017 Informa UK Limited, trading as Taylor & Francis Group.
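The difference-of-convex structure in such formulations comes from a standard identity: a pointwise minimum of convex functions h_j satisfies min_j h_j = Σ_j h_j − max_j Σ_{l≠j} h_l, where both terms on the right are convex. A minimal numeric check of that identity (our own illustration, not code from the paper):

```python
import numpy as np

def min_as_dc(a):
    """The pointwise minimum of convex functions h_j is a DC function:
    min_j h_j = sum_j h_j - max_j sum_{l != j} h_l,
    and both terms on the right-hand side are convex.
    Evaluated here on plain numbers to check the identity."""
    a = np.asarray(a, dtype=float)
    f1 = a.sum()                                   # convex part
    f2 = max(a.sum() - a[j] for j in range(len(a)))  # convex part subtracted
    return f1 - f2
```

With the per-cluster squared regression errors in the role of the h_j, this identity yields a DC representation of the clusterwise objective on which optimality conditions can be stated.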
Nonsmooth optimization algorithm for solving clusterwise linear regression problems
- Authors: Bagirov, Adil , Ugon, Julien , Mirzayeva, Hijran
- Date: 2015
- Type: Text , Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 164, no. 3 (2015), p. 755-780
- Relation: http://purl.org/au-research/grants/arc/DP140103213
- Full Text: false
- Reviewed:
- Description: Clusterwise linear regression consists of finding a number of linear regression functions each approximating a subset of the data. In this paper, the clusterwise linear regression problem is formulated as a nonsmooth nonconvex optimization problem and an algorithm based on an incremental approach and on the discrete gradient method of nonsmooth optimization is designed to solve it. This algorithm incrementally divides the whole dataset into groups which can be easily approximated by one linear regression function. A special procedure is introduced to generate good starting points for solving global optimization problems at each iteration of the incremental algorithm. The algorithm is compared with the multi-start Spath and the incremental algorithms on several publicly available datasets for regression analysis.
Chebyshev multivariate polynomial approximation and point reduction procedure
- Authors: Sukhorukova, Nadezda , Ugon, Julien , Yost, David
- Date: 2021
- Type: Text , Journal article
- Relation: Constructive Approximation Vol. 53, no. 3 (2021), p. 529-544
- Relation: http://purl.org/au-research/grants/arc/DP180100602
- Full Text:
- Reviewed:
- Description: We apply the methods of nonsmooth and convex analysis to extend the study of Chebyshev (uniform) approximation for univariate polynomial functions to the case of general multivariate functions (not just polynomials). First of all, we give new necessary and sufficient optimality conditions for multivariate approximation, and a geometrical interpretation of them which reduces to the classical alternating sequence condition in the univariate case. Then, we present a procedure for verification of necessary and sufficient optimality conditions that is based on our generalization of the notion of alternating sequence to the case of multivariate polynomials. Finally, we develop an algorithm for fast verification of necessary optimality conditions in the multivariate polynomial case. © 2019, Springer Science+Business Media, LLC, part of Springer Nature.
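On a finite grid, the best uniform polynomial approximation problem reduces to a linear program, which is a convenient way to experiment with the alternance behaviour discussed above. This discretised sketch (monomial basis and scipy's linprog are our choices; it is not the paper's verification procedure):

```python
import numpy as np
from scipy.optimize import linprog

def chebyshev_fit(x, f, degree):
    """Best uniform polynomial approximation on a grid, posed as a linear
    program: minimise z subject to |f(x_i) - p(x_i)| <= z for all grid
    points x_i. A discretised sketch, not the paper's procedure."""
    V = np.vander(x, degree + 1, increasing=True)  # monomial basis values
    n = V.shape[1]
    # variables: polynomial coefficients c (free) and error level z >= 0
    A = np.vstack([np.hstack([V, -np.ones((len(x), 1))]),    # p - f <= z
                   np.hstack([-V, -np.ones((len(x), 1))])])  # f - p <= z
    b = np.concatenate([f, -f])
    c = np.zeros(n + 1)
    c[-1] = 1.0                                    # minimise z only
    res = linprog(c, A_ub=A, b_ub=b,
                  bounds=[(None, None)] * n + [(0, None)])
    return res.x[:n], res.x[-1]
```

For f(x) = x² on [-1, 1] with a linear approximant, this recovers p(x) = 1/2 with error 1/2, and the deviation alternates in sign at x = -1, 0, 1: the classical univariate alternating sequence to which the paper's multivariate conditions reduce.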
Polytopes close to being simple
- Authors: Pineda-Villavicencio, Guillermo , Ugon, Julien , Yost, David
- Date: 2020
- Type: Text , Journal article
- Relation: Discrete and Computational Geometry Vol. 64, no. 1 (2020), p. 200-215
- Relation: http://purl.org/au-research/grants/arc/DP180100602
- Full Text:
- Reviewed:
- Description: It is known that polytopes with at most two nonsimple vertices are reconstructible from their graphs, and that d-polytopes with at most d-2 nonsimple vertices are reconstructible from their 2-skeletons. Here we close the gap between 2 and d-2, showing that certain polytopes with more than two nonsimple vertices are reconstructible from their graphs. In particular, we prove that reconstructibility from graphs also holds for d-polytopes with d+k vertices and at most d-k+3 nonsimple vertices, provided k
Generalised rational approximation and its application to improve deep learning classifiers
- Authors: Peiris, V , Sharon, Nir , Sukhorukova, Nadezda , Ugon, Julien
- Date: 2021
- Type: Text , Journal article
- Relation: Applied Mathematics and Computation Vol. 389 (2021)
- Relation: https://purl.org/au-research/grants/arc/DP180100602
- Full Text: false
- Reviewed:
- Description: A rational approximation (that is, approximation by a ratio of two polynomials) is a flexible alternative to polynomial approximation. In particular, rational functions provide accurate approximations of nonsmooth and non-Lipschitz functions, where polynomial approximations are not efficient. We prove that the optimisation problems appearing in the best uniform rational approximation and its generalisation to a ratio of linear combinations of basis functions are quasiconvex, even when the basis functions are not restricted to monomials. Then we show how this fact can be used in the development of computational methods. This paper presents a theoretical study of the arising optimisation problems and provides the results of several numerical experiments. We apply our approximation as a preprocessing step to deep learning classifiers and demonstrate that the classification accuracy is significantly improved compared to the classification of the raw signals. © 2020
- Description: This research was supported by the Australian Research Council (ARC), Solving hard Chebyshev approximation problems through nonsmooth analysis (Discovery Project DP180100602 ). This research was partially sponsored by Tel Aviv-Swinburne Research Collaboration Grant (2019).
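The quasiconvexity result above suggests a bisection scheme: for a fixed error level z, the constraint |f − p/q| ≤ z with q > 0 is linear in the coefficients of p and q, so each level reduces to a linear feasibility problem. A discretised sketch of that idea (monomial bases, the normalisation q ≥ 1, and scipy's linprog are our assumptions, not the paper's implementation):

```python
import numpy as np
from scipy.optimize import linprog

def best_rational(f_vals, x, n=2, m=2, tol=1e-6):
    """Bisection on the uniform error level z for rational approximation.
    For fixed z, |f(x) - p(x)/q(x)| <= z with q(x) > 0 becomes linear in
    the coefficients:  -z*q(x) <= f(x)*q(x) - p(x) <= z*q(x).
    Sketch of the quasiconvexity-based idea on a discrete grid."""
    P = np.vander(x, n + 1, increasing=True)   # basis for numerator p
    Q = np.vander(x, m + 1, increasing=True)   # basis for denominator q

    def feasible(z):
        # variables: concat(p_coeffs, q_coeffs)
        A1 = np.hstack([-P, (f_vals[:, None] - z) * Q])   # f*q - p <= z*q
        A2 = np.hstack([P, (-f_vals[:, None] - z) * Q])   # p - f*q <= z*q
        A3 = np.hstack([np.zeros_like(P), -Q])            # q(x) >= 1
        A = np.vstack([A1, A2, A3])
        b = np.concatenate([np.zeros(2 * len(x)), -np.ones(len(x))])
        res = linprog(np.zeros(A.shape[1]), A_ub=A, b_ub=b,
                      bounds=[(None, None)] * A.shape[1])
        return res.success

    lo, hi = 0.0, float(np.abs(f_vals).max())  # z = max|f| is always feasible
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (lo, mid) if feasible(mid) else (mid, hi)
    return hi
```

Each bisection step halves the interval on the error level, so the number of linear feasibility problems solved grows only logarithmically in the required accuracy.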