Characterization theorem for best polynomial spline approximation with free knots, variable degree and fixed tails
- Authors: Crouzeix, Jean-Pierre, Sukhorukova, Nadezda, Ugon, Julien
- Date: 2017
- Type: Text, Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 172, no. 3 (2017), p. 950-964
- Description: In this paper, we derive a necessary condition for best approximation by piecewise polynomial functions whose degree may vary from one interval to another. Based on these results, we obtain a characterization theorem for polynomial splines with fixed tails, that is, splines whose value is fixed at one or more (external or internal) knots. We apply nonsmooth nonconvex analysis to obtain this result, which is also a necessary and sufficient condition for inf-stationarity in the sense of Demyanov-Rubinov. This paper extends an earlier paper in which similar conditions were obtained for free-tail splines. The main results are essential for the development of a Remez-type algorithm for free-knot spline approximation. (A schematic statement of the problem is sketched below.)
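The abstract above does not restate the underlying optimization problem, so here is a minimal schematic of best uniform approximation by a free-knot, variable-degree spline with fixed tails. The notation (f, the knots t_i, the degrees d_i, the fixed values c_j and the index set J) is assumed for illustration and is not taken from the paper.

```latex
% Schematic only (notation assumed): approximate f on [a,b] in the uniform norm
% by a spline s whose knots t_1 < ... < t_{n-1} are free, whose piece p_i on
% [t_{i-1}, t_i] may have its own degree d_i, and whose value is prescribed
% ("fixed tails") at the knots indexed by J.
\[
  \min_{\substack{t_1 < \dots < t_{n-1} \\ p_1, \dots, p_n}}\;
  \max_{x \in [a,b]} \bigl| f(x) - s(x) \bigr|,
  \qquad
  s(x) = p_i(x) \ \text{for } x \in [t_{i-1}, t_i], \quad \deg p_i \le d_i,
\]
\[
  \text{subject to } s(t_j) = c_j \ \text{for } j \in J \ \text{(fixed tails)}.
\]
```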
Nonsmooth DC programming approach to the minimum sum-of-squares clustering problems
- Authors: Bagirov, Adil, Taheri, Sona, Ugon, Julien
- Date: 2016
- Type: Text, Journal article
- Relation: Pattern Recognition Vol. 53 (2016), p. 12-24
- Relation: http://purl.org/au-research/grants/arc/DP140103213
- Full Text: false
- Description: This paper introduces an algorithm for solving minimum sum-of-squares clustering problems using their difference-of-convex representations. A nonsmooth nonconvex optimization formulation of the clustering problem is used to design the algorithm. Characterizations of critical points, stationary points in the sense of generalized gradients, and inf-stationary points of the clustering problem are given. The proposed algorithm is tested and compared with other clustering algorithms on large real-world data sets. (A sketch of a standard DC split of this objective follows.)
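As a reading aid, the sketch below shows one standard difference-of-convex split of the minimum sum-of-squares clustering objective, which is the kind of representation the paper builds on: min_i d_i = sum_i d_i - max_i sum_{l != i} d_l. The function names, the toy data, and the assumption that this matches the paper's exact representation are mine.

```python
import numpy as np

def mssc_objective(centers, data):
    """Minimum sum-of-squares clustering objective: average over the data
    points of the squared distance to the nearest cluster center."""
    d2 = ((data[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)  # (m, k)
    return d2.min(axis=1).mean()

def dc_components(centers, data):
    """One standard DC (difference-of-convex) split of the MSSC objective:
    f = f1 - f2 with
      f1 = mean_j sum_i ||x_i - a_j||^2               (convex, smooth)
      f2 = mean_j max_i sum_{l != i} ||x_l - a_j||^2  (convex, nonsmooth),
    using min_i d_ij = sum_i d_ij - max_i sum_{l != i} d_lj."""
    d2 = ((data[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)  # (m, k)
    f1 = d2.sum(axis=1).mean()
    f2 = (d2.sum(axis=1, keepdims=True) - d2).max(axis=1).mean()
    return f1, f2

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.normal(size=(100, 2))    # m = 100 toy points in R^2
    centers = rng.normal(size=(3, 2))   # k = 3 candidate cluster centers
    f1, f2 = dc_components(centers, data)
    # the split reproduces the original nonsmooth objective exactly
    assert np.isclose(mssc_objective(centers, data), f1 - f2)
```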
Chebyshev approximation by linear combinations of fixed knot polynomial splines with weighting functions
- Authors: Sukhorukova, Nadezda, Ugon, Julien
- Date: 2016
- Type: Text, Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 171, no. 2 (2016), p. 536-549
- Full Text: false
- Description: In this paper, we derive conditions for best uniform approximation by fixed-knot polynomial splines with weighting functions. The theory of Chebyshev approximation by fixed-knot polynomial splines is elegant and complete: necessary and sufficient optimality conditions have been developed, leading to efficient algorithms for constructing optimal spline approximations. These optimality conditions are based on the notion of alternance (maximal deviation points with alternating deviation signs). In this paper, we extend these results to the case where the model function is a product of a fixed-knot polynomial spline (whose parameters are subject to optimization) and other functions (whose parameters are predefined). This problem is nonsmooth, and therefore we make use of convex and nonsmooth analysis to solve it. (A schematic problem statement is sketched below.)
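As a reading aid, the sketch below states the weighted approximation problem schematically. The symbols (f, the predefined weight \omega, the fixed knots t_i and the spline space S) are my own notation, not the paper's.

```latex
% Schematic only (notation assumed): the knots t_0 < ... < t_n and the weight
% \omega are predefined; only the spline coefficients are optimized.
\[
  \min_{s \in S(t_0, \dots, t_n)}\;
  \max_{x \in [a,b]} \bigl| f(x) - \omega(x)\, s(x) \bigr|
\]
% Optimality is characterized via alternance: sufficiently many points of
% maximal deviation at which the sign of f - \omega s alternates.
```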
Nonsmooth optimization algorithm for solving clusterwise linear regression problems
- Authors: Bagirov, Adil, Ugon, Julien, Mirzayeva, Hijran
- Date: 2015
- Type: Text, Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 164, no. 3 (2015), p. 755-780
- Relation: http://purl.org/au-research/grants/arc/DP140103213
- Full Text: false
- Description: Clusterwise linear regression consists of finding a number of linear regression functions, each approximating a subset of the data. In this paper, the clusterwise linear regression problem is formulated as a nonsmooth nonconvex optimization problem, and an algorithm based on an incremental approach and the discrete gradient method of nonsmooth optimization is designed to solve it. The algorithm incrementally divides the whole dataset into groups, each of which can be well approximated by a single linear regression function. A special procedure is introduced to generate good starting points for the global optimization problems solved at each iteration of the incremental algorithm. The algorithm is compared with the multi-start Späth algorithm and with incremental algorithms on several publicly available regression datasets. (The objective being minimized is sketched below.)
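The nonsmooth objective minimized in clusterwise linear regression can be written compactly: each observation is charged the squared error of whichever of the k regression functions fits it best. The Python sketch below illustrates this on toy data; the function names and the toy setup are mine, and the paper's incremental algorithm and discrete gradient method are not reproduced here.

```python
import numpy as np

def clr_objective(coeffs, intercepts, X, y):
    """Clusterwise linear regression objective (nonsmooth, nonconvex):
    sum over observations of the squared error of the best-fitting one of
    the k linear regression functions."""
    # predictions of all k regression functions for every observation: (m, k)
    preds = X @ coeffs.T + intercepts[None, :]
    residuals2 = (preds - y[:, None]) ** 2
    return residuals2.min(axis=1).sum()

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 3))
    # toy data generated from two underlying linear models
    y = np.where(rng.random(200) < 0.5,
                 X @ np.array([1.0, -2.0, 0.5]),
                 X @ np.array([-1.5, 0.0, 2.0]))
    coeffs = rng.normal(size=(2, 3))   # k = 2 candidate regression functions
    intercepts = np.zeros(2)
    print(clr_objective(coeffs, intercepts, X, y))
```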