Chebyshev approximation by linear combinations of fixed knot polynomial splines with weighting functions
- Authors: Sukhorukova, Nadezda , Ugon, Julien
- Date: 2016
- Type: Text , Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 171, no. 2 (2016), p. 536-549
- Full Text: false
- Reviewed:
- Description: In this paper, we derive conditions for best uniform approximation by fixed-knot polynomial splines with weighting functions. The theory of Chebyshev approximation by fixed-knot polynomial splines is elegant and complete: necessary and sufficient optimality conditions have been developed, leading to efficient algorithms for constructing optimal spline approximations. These optimality conditions are based on the notion of alternance (maximal-deviation points with alternating deviation signs). In this paper, we extend these results to the case when the model function is a product of a fixed-knot polynomial spline (whose parameters are subject to optimization) and other functions (whose parameters are predefined). This problem is nonsmooth, and therefore we make use of convex and nonsmooth analysis to solve it.
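For orientation, the classical alternance (equioscillation) condition for polynomial approximation, which the weighted-spline conditions of this paper generalize, can be sketched as follows: a polynomial p* of degree at most n is a best uniform approximation to a continuous f on [a, b] if and only if the error attains its maximal absolute value at n + 2 points with alternating signs.

```latex
% Chebyshev equioscillation: p^* of degree <= n is optimal iff there exist
% points a <= x_0 < x_1 < ... < x_{n+1} <= b and a sign \sigma \in \{-1,+1\} with
f(x_i) - p^*(x_i) = \sigma \, (-1)^i \, \| f - p^* \|_\infty ,
\qquad i = 0, 1, \dots, n+1 .
```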
Nonsmooth optimization algorithm for solving clusterwise linear regression problems
- Authors: Bagirov, Adil , Ugon, Julien , Mirzayeva, Hijran
- Date: 2015
- Type: Text , Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 164, no. 3 (2015), p. 755-780
- Relation: http://purl.org/au-research/grants/arc/DP140103213
- Full Text: false
- Reviewed:
- Description: Clusterwise linear regression consists of finding a number of linear regression functions, each approximating a subset of the data. In this paper, the clusterwise linear regression problem is formulated as a nonsmooth nonconvex optimization problem, and an algorithm based on an incremental approach and on the discrete gradient method of nonsmooth optimization is designed to solve it. This algorithm incrementally divides the whole dataset into groups, each of which can be approximated well by one linear regression function. A special procedure is introduced to generate good starting points for the global optimization problems solved at each iteration of the incremental algorithm. The algorithm is compared with the multi-start Späth and incremental algorithms on several publicly available datasets for regression analysis.
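The Späth-style alternating heuristic mentioned as a comparator can be sketched in a few lines. This is a minimal illustration of the clusterwise regression idea, not the authors' incremental discrete-gradient algorithm; the function name, random initialization, and iteration cap are illustrative choices.

```python
import numpy as np

def spath_clusterwise(X, y, k, iters=50, seed=0):
    """Späth-style alternating heuristic for clusterwise linear regression.

    Alternates between (1) assigning each point to the regression function
    with the smallest squared residual and (2) refitting each function by
    ordinary least squares on its assigned points.
    """
    rng = np.random.default_rng(seed)
    n = len(y)
    A = np.column_stack([X, np.ones(n)])     # design matrix with intercept
    labels = rng.integers(0, k, size=n)      # random initial partition
    coefs = np.zeros((k, A.shape[1]))
    for _ in range(iters):
        for j in range(k):                   # refit each cluster's line
            mask = labels == j
            if mask.sum() >= A.shape[1]:     # skip (near-)empty clusters
                coefs[j], *_ = np.linalg.lstsq(A[mask], y[mask], rcond=None)
        resid = (A @ coefs.T - y[:, None]) ** 2   # n-by-k squared residuals
        new_labels = resid.argmin(axis=1)    # reassign points to best line
        if np.array_equal(new_labels, labels):
            break                            # partition stabilized
        labels = new_labels
    return labels, coefs
```

Because the objective is nonconvex, this alternating scheme only finds a local solution depending on the start, which is precisely why the paper's starting-point procedure and multi-start variants matter.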
On local coincidence of a convex set and its tangent cone
- Authors: Meng, Kaiwen , Roshchina, Vera , Yang, Xiaoqi
- Date: 2015
- Type: Text , Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 164, no. 1 (2015), p. 123-137
- Full Text: false
- Reviewed:
- Description: In this paper, we introduce the exact tangent approximation property for a convex set and provide its characterizations, including the nonzero extent of a convex set. We obtain necessary and sufficient conditions for the closedness of the positive hull of a convex set via a limit set defined by truncated upper level sets of the gauge function. We also apply the exact tangent approximation property to study the existence of a global error bound for a proper, lower semicontinuous and positively homogeneous function.
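As background for the title notion, the tangent cone of a convex set C at a point in C is the closure of the cone of feasible directions; the local coincidence studied here concerns when the set and (a translate of) this cone agree near the point.

```latex
% Tangent cone of a convex set C at \bar{x} \in C:
T_C(\bar{x}) = \operatorname{cl} \{ \lambda (c - \bar{x}) : c \in C, \ \lambda \ge 0 \}.
```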
Subgradient Method for Nonconvex Nonsmooth Optimization
- Authors: Bagirov, Adil , Jin, L. , Karmitsa, Napsu , Al Nuaimat, A. , Sultanova, Nargiz
- Date: 2012
- Type: Text , Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 157, no. 2 (2012), p. 416-435
- Full Text: false
- Reviewed:
- Description: In this paper, we introduce a new method for solving nonconvex nonsmooth optimization problems. It uses quasisecants, which are subgradients computed in some neighborhood of a point. The proposed method contains simple procedures for finding descent directions and for solving line search subproblems. The convergence of the method is studied, and preliminary numerical results are presented, including a comparison of the proposed method with the subgradient and proximal bundle methods. © 2012 Springer Science+Business Media, LLC.
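For contrast with the quasisecant approach, the plain subgradient method used as a baseline in the comparison can be sketched as follows. The diminishing step rule and the test function are illustrative assumptions, not details from the paper.

```python
import numpy as np

def subgradient_descent(f, subgrad, x0, steps=500):
    """Plain subgradient method with diminishing steps t_k = 1/(k+1).

    Tracks the best point found, since the subgradient method is not a
    descent method: individual iterations may increase f.
    """
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(steps):
        g = subgrad(x)
        x = x - g / (k + 1)          # diminishing, nonsummable step sizes
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x.copy(), fx
    return best_x, best_f

# Nonsmooth convex test function f(x) = |x_1| + 2|x_2|, minimized at the origin.
f = lambda x: abs(x[0]) + 2 * abs(x[1])
sg = lambda x: np.array([np.sign(x[0]), 2 * np.sign(x[1])])
x_best, f_best = subgradient_descent(f, sg, [3.0, -2.0])
```

A single subgradient carries no local shape information at a kink, which is the motivation for methods, like the quasisecant method above, that aggregate subgradient information from a neighborhood of the current point.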
Global Optimality Conditions and Optimization Methods for Quadratic Knapsack Problems
- Authors: Wu, Zhiyou , Yang, Y. J. , Bai, Fusheng , Mammadov, Musa
- Date: 2011
- Type: Text , Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 151, no. 2 (2011), p. 241-259
- Full Text: false
- Reviewed:
- Description: The quadratic knapsack problem (QKP) maximizes a quadratic objective function over binary variables subject to a linear capacity constraint. Owing to its simple structure yet challenging difficulty, it has been studied intensively during the last two decades. This paper first presents some global optimality conditions for (QKP), including both necessary conditions and sufficient conditions. Then a local optimization method for (QKP) is developed using the necessary global optimality condition. Finally, a global optimization method for (QKP) is proposed based on the sufficient global optimality condition, the local optimization method, and an auxiliary function. Several numerical examples are given to illustrate the efficiency of the presented optimization methods. © 2011 Springer Science+Business Media, LLC.
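To make the (QKP) formulation concrete, here is a tiny brute-force solver for the model maximize x'Qx + c'x subject to w'x <= C, x in {0,1}^n. It is purely illustrative (usable only for a handful of variables) and is not one of the paper's optimization methods.

```python
import itertools
import numpy as np

def qkp_bruteforce(Q, c, w, C):
    """Enumerate all binary vectors to solve a small quadratic knapsack:
    maximize x'Qx + c'x subject to w'x <= C, x in {0,1}^n."""
    n = len(c)
    best_x, best_val = None, -np.inf
    for bits in itertools.product((0, 1), repeat=n):
        x = np.array(bits)
        if w @ x <= C:                  # capacity (linear knapsack) constraint
            val = x @ Q @ x + c @ x     # quadratic objective
            if val > best_val:
                best_x, best_val = x, val
    return best_x, best_val
```

The 2^n enumeration is exactly what the global optimality conditions and the auxiliary-function method in the paper are designed to avoid.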
Necessary Optimality Conditions and New Optimization Methods for Cubic Polynomial Optimization Problems with Mixed Variables
- Authors: Wu, Zhiyou , Quan, Jing , Li, G. Q. , Tian, Jing
- Date: 2011
- Type: Text , Journal article
- Relation: Journal of Optimization Theory and Applications (2011), p. 1-28
- Full Text: false
- Reviewed:
- Description: Multivariate cubic polynomial optimization problems, as a special case of general polynomial optimization, have many practical applications. In this paper, necessary local optimality conditions and necessary global optimality conditions for cubic polynomial optimization problems with mixed variables are established. Exploiting these conditions, some local optimization methods are proposed, including weakly local optimization methods for general problems with mixed variables and strongly local optimization methods for cubic polynomial optimization problems with mixed variables. A global optimization method is then proposed for cubic polynomial optimization problems by combining these local optimization methods with some auxiliary functions. Numerical examples are given to illustrate the efficiency of these approaches. © 2011 Springer Science+Business Media, LLC.
Uniform approximation by the highest defect continuous polynomial splines: Necessary and sufficient optimality conditions and their generalisations
- Authors: Sukhorukova, Nadezda
- Date: 2010
- Type: Text , Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 147, no. 2 (2010), p. 378-394
- Full Text: false
- Reviewed:
- Description: In this paper, necessary and sufficient optimality conditions for uniform approximation of continuous functions by polynomial splines with fixed knots are derived. The obtained results generalise existing results for polynomial approximation and polynomial spline approximation. The main result is twofold. First, the existing results are generalised to the case when the degree of the polynomials composing the spline can vary from one subinterval to another. Second, necessary and sufficient optimality conditions are constructed for polynomial spline approximation with fixed values of the spline at one or both borders of the corresponding approximation interval. © 2010 Springer Science+Business Media, LLC.
Global optimality conditions for some classes of optimization problems
- Authors: Wu, Zhiyou , Rubinov, Alex
- Date: 2009
- Type: Text , Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 145, no. 1 (2009), p. 164-185
- Full Text: false
- Reviewed:
- Description: We establish new necessary and sufficient optimality conditions for global optimization problems. In particular, we establish tractable optimality conditions for the problems of minimizing a weakly convex or concave function subject to standard constraints, such as box constraints, binary constraints, and simplex constraints. We also derive some new necessary and sufficient optimality conditions for quadratic optimization. Our main theoretical tool for establishing these optimality conditions is abstract convexity. © 2009 Springer Science+Business Media, LLC.
Generalized Fenchel's conjugation formulas and duality for abstract convex functions
- Authors: Jeyakumar, Vaithilingam , Rubinov, Alex , Wu, Zhiyou
- Date: 2007
- Type: Text , Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 132, no. 3 (Mar 2007), p. 441-458
- Full Text: false
- Reviewed:
- Description: In this paper, we present a generalization of Fenchel's conjugation and derive infimal convolution formulas, duality and subdifferential (and epsilon-subdifferential) sum formulas for abstract convex functions. The class of abstract convex functions covers very broad classes of nonconvex functions. A nonaffine global support function technique and an extended sum-epiconjugate technique of convex functions play a crucial role in deriving the results for abstract convex functions. An additivity condition involving global support sets serves as a constraint qualification for the duality.
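For orientation, the classical Fenchel conjugate that this construction generalizes, together with the Fenchel-Young inequality, reads:

```latex
% Classical Fenchel conjugate and the Fenchel--Young inequality:
f^*(y) = \sup_{x \in X} \{ \langle x, y \rangle - f(x) \}, \qquad
f(x) + f^*(y) \ge \langle x, y \rangle .
% Abstract convexity replaces the affine supports x \mapsto \langle x, y \rangle - c
% by a prescribed set of elementary (possibly nonaffine) support functions.
```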
- Description: C1
Sufficient conditions for global optimality of bivalent nonconvex quadratic programs with inequality constraints
- Authors: Wu, Zhiyou , Jeyakumar, Vaithilingam , Rubinov, Alex
- Date: 2007
- Type: Text , Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 133, no. 1 (2007), p. 123-130
- Full Text: false
- Reviewed:
- Description: We present sufficient conditions for the global optimality of bivalent nonconvex quadratic programs involving quadratic inequality constraints as well as equality constraints. By employing the Lagrangian function, we extend the global subdifferential approach, developed recently in Jeyakumar et al. (J. Glob. Optim., 2007, to appear; Math. Program. Ser. A, 2007, to appear) for studying bivalent quadratic programs without quadratic constraints, and derive global optimality conditions. © 2007 Springer Science+Business Media, LLC.
- Description: C1
On global optimality conditions via separation functions
- Authors: Rubinov, Alex , Uderzo, A.
- Date: 2001
- Type: Text , Journal article
- Relation: Journal of Optimization Theory and Applications Vol. 109, no. 2 (May 2001), p. 345-370
- Full Text: false
- Reviewed:
- Description: The paper examines some axiomatic definitions of separation functions that can be employed fruitfully in the analysis of side-constrained extremum problems. A study of their general properties points out connections with abstract convex analysis and with recent generalizations of Lagrangian approaches to duality and exact penalty methods. Many concrete examples are presented.