Application of nonsmooth optimisation to data analysis

**Authors:** Ugon, Julien **Date:** 2005 **Type:** Text, Thesis, PhD
**Description:** The research presented in this thesis is two-fold: on the one hand, major data mining problems are reformulated as mathematical programming problems. These problems must be formulated carefully, since the efficiency, and sometimes the very existence, of solvers depends on their formulation. On the other hand, optimisation methods are adapted to solve these problems, most of which are nonsmooth and nonconvex. This part is delicate, as the solution is often required to be both good and obtained quickly. Numerical experiments on real-world datasets are presented and analysed.
**Description:** Doctor of Philosophy

Conditions for global minimum through abstract convexity

**Authors:** Sharikov, Evgenii **Date:** 2008 **Type:** Text, Thesis, PhD
**Description:** The theory of abstract convexity generalizes ideas of convex analysis by using the notion of global supports and the global definition of subdifferential. In order to apply this theory to optimization, we need to extend subdifferential calculus and separation properties into the area of abstract convexity.
**Description:** Doctor of Philosophy
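For orientation, the central notion in this record can be stated concisely; the formulation below is the standard textbook definition, assumed rather than quoted from the thesis:

```latex
% Given a set H of "elementary" functions, f is abstract convex w.r.t. H if it
% is the upper envelope of its minorants taken from H:
f(x) \;=\; \sup\{\, h(x) \;:\; h \in H,\ h \le f \,\} \qquad \text{for all } x.
% An h \in H with h \le f and h(x_0) = f(x_0) is a global support of f at x_0.
% Classical convexity (of lsc functions) is recovered when H is the set of
% affine functions.
```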

G-coupling functions and properties of strongly star-shaped cones

**Authors:** Morales-Silva, Daniel **Date:** 2009 **Type:** Text, Thesis, PhD
**Description:** The main part of this thesis presents a new approach to the topic of conjugation, with applications to various optimization problems. It does so by introducing (what we call) G-coupling functions.
**Description:** Doctor of Philosophy

A class of Increasing Positively Homogeneous functions for which global optimization problem is NP-hard

**Authors:** Sultanova, Nargiz **Date:** 2009 **Type:** Text, Thesis, Masters
**Description:** It is well known that global optimization problems are, generally speaking, computationally infeasible; that is, solving them would require an unreasonably large amount of time and/or space. In certain cases, for example when objective functions and constraints are convex, it is possible to construct a feasible algorithm for solving the global optimization problem successfully. Convexity, however, is not a phenomenon to be often expected in applications. Nonconvex problems frequently arise in many industrial and scientific areas. Therefore, it is only natural to try to replace convexity with some other structure, at least for some classes of nonconvex optimization problems, to render the global optimization problem feasible. The theory of abstract convexity has been developed as a result of the above considerations. Monotonic analysis, a branch of abstract convex analysis, is analogous in many ways to convex analysis, and is sometimes even simpler. It turns out that many problems of nonconvex optimization encountered in applications can be described in terms of monotonic functions. The analogies with convex analysis were considered to aid in solving some classes of nonconvex optimization problems. In this thesis we focus on one of the elements of monotonic analysis: Increasing Positively Homogeneous functions of degree one, or IPH functions for short. The aim of the present research is to show that finding the solution, and an ε-approximation to the solution, of the global optimization problem for IPH functions restricted to the unit simplex is an NP-hard problem. These results can be further extended to positively homogeneous functions of degree η, η > 0.
**Description:** Master of Mathematical Sciences (Research)
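For reference, the function class and the feasible set named in this abstract can be written down compactly; these are the standard definitions, not text quoted from the thesis:

```latex
% f : \mathbb{R}^n_+ \to \mathbb{R} is IPH (of degree one) if it is increasing
% and positively homogeneous:
x \ge y \;\Rightarrow\; f(x) \ge f(y), \qquad
f(\lambda x) = \lambda f(x) \quad \forall\, \lambda > 0.
% The hardness result concerns minimising such f over the unit simplex
S \;=\; \Bigl\{\, x \in \mathbb{R}^n_+ \;:\; \textstyle\sum_{i=1}^{n} x_i = 1 \,\Bigr\},
% and extends to positive homogeneity of degree \eta > 0, i.e.
% f(\lambda x) = \lambda^{\eta} f(x).
```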

Derivative-free hybrid methods in global optimization and their applications

**Authors:** Zhang, Jiapu **Date:** 2005 **Type:** Text, Thesis, PhD
**Description:** In recent years, large-scale global optimization (GO) problems have drawn considerable attention. These problems have many applications, in particular in data mining and biochemistry. Numerical methods for GO are often very time consuming and cannot be applied to high-dimensional non-convex and/or non-smooth optimization problems. The thesis explores reasons why we need to develop and study new algorithms for solving large-scale GO problems .... The thesis presents several derivative-free hybrid methods for large-scale GO problems. These methods do not guarantee the calculation of a global solution; however, results of numerical experiments presented in this thesis demonstrate that they, as a rule, calculate a solution which is a global one or close to it. Their applications to data mining problems and the protein folding problem are demonstrated.
**Description:** Doctor of Philosophy

Derivative free algorithms for nonsmooth and global optimization with application in cluster analysis

**Authors:** Ganjehlou, Asef Nazari **Date:** 2009 **Type:** Text, Thesis, PhD
**Description:** This thesis is devoted to the development of algorithms for solving nonsmooth nonconvex problems. Some of these algorithms are derivative free methods.
**Description:** Doctor of Philosophy

Global optimality conditions and optimization methods for polynomial programming problems and their applications

**Authors:** Tian, Jing **Date:** 2014 **Type:** Text, Thesis, PhD
**Description:** The polynomial programming problem, which has a polynomial objective function with either no constraints or polynomial constraints, occurs frequently in engineering design, investment science, control theory, network distribution, signal processing and location-allocation contexts. Moreover, the polynomial programming problem is known to be Nondeterministic Polynomial-time hard (NP-hard). The polynomial programming problem has attracted a lot of attention, including quadratic, cubic, homogeneous or normal quartic programming problems as special cases. Existing methods for solving polynomial programming problems include algebraic methods and various convex relaxation methods. Among these methods, semidefinite programming (SDP) and sum of squares (SOS) relaxations are especially popular. Theoretically, SDP and SOS relaxation methods are very powerful and successful in solving the general polynomial programming problem with a compact feasible region. However, their solvability in practice depends on the size or the degree of the polynomial programming problem and the required accuracy. Hence, solving large-scale SDP problems still remains a computational challenge. It is well known that traditional local optimization methods are designed based on necessary local optimality conditions, i.e., Karush-Kuhn-Tucker (KKT) conditions. Motivated by this, some researchers proposed a necessary global optimality condition for a quadratic programming problem and designed a new local optimization method according to that necessary global optimality condition. In this thesis, we apply this idea to cubic and quartic programming problems, and further to general unconstrained and constrained polynomial programming problems.
For these polynomial programming problems, we investigate necessary global optimality conditions and design new local optimization methods according to these conditions. These necessary global optimality conditions are generally stronger than KKT conditions. Hence, the new local minimizers obtained by the new local optimization methods may improve on some KKT points. Our ultimate aim is to design global optimization methods for these polynomial programming problems. We note that the filled function method is one of the well-known and practical auxiliary function methods used to reach a global minimizer. In this thesis, we design global optimization methods by combining the newly proposed local optimization methods with some auxiliary functions. The numerical examples illustrate the efficiency and stability of the optimization methods. Finally, we discuss some applications to sensor network localization problems and systems of polynomial equations. It is worth mentioning that we apply the idea and the results for polynomial programming problems to nonlinear programming problems (NLP). We provide an optimality condition, design new local optimization methods according to it, and design global optimization methods for the problem (NLP) by combining the new local optimization methods with an auxiliary function. In order to test the performance of the global optimization methods, we compare them with two other heuristic methods. The results demonstrate that our methods outperform the two other algorithms.
**Description:** Doctor of Philosophy
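The two-phase strategy this abstract describes, a local solver combined with an auxiliary ("filled") function that helps escape the current basin, can be sketched in miniature. The toy objective, the parameters, and the particular auxiliary function below are illustrative simplifications in the filled-function spirit, not the constructions actually used in the thesis:

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    # Toy multimodal objective (illustrative only): local minima near the integers.
    x = np.asarray(x, dtype=float)
    return float(x[0] ** 2 + 10.0 * (1.0 - np.cos(2.0 * np.pi * x[0])))

def filled(x, x_star, f_star, rho=1.0, mu=0.1):
    # Simplified auxiliary function: penalise staying near the current local
    # minimiser x_star, and penalise points whose value exceeds f_star, so that
    # minimising it drives the iterate into a basin with a lower objective value.
    x = np.asarray(x, dtype=float)
    dist2 = float(np.sum((x - x_star) ** 2))
    return float(np.exp(-dist2 / rho) + mu * max(f(x) - f_star, 0.0) ** 2)

# Phase 1: ordinary local minimisation of f from an arbitrary start.
x0 = np.array([2.8])
res1 = minimize(f, x0, method="Nelder-Mead")
x_star, f_star = res1.x, res1.fun

# Phase 2: minimise the auxiliary function from perturbed points to leave the
# current basin, then restart the local search on f from where it lands.
best_x, best_f = x_star, f_star
for delta in (0.6, -0.6):
    res2 = minimize(filled, x_star + delta, args=(x_star, f_star),
                    method="Nelder-Mead")
    res3 = minimize(f, res2.x, method="Nelder-Mead")
    if res3.fun < best_f:
        best_x, best_f = res3.x, res3.fun
```

By construction the scheme never returns a point worse than the first local minimiser; whether it actually reaches the global minimiser depends on the auxiliary function and its parameters, which is precisely the design question the thesis studies.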

Global minimization of some classes of generalized convex functions

**Authors:** Andramonov, Mikhail **Date:** 2001 **Type:** Text, Thesis, PhD **Full Text:** false
**Description:** "A number of methods of global optimization are proposed and their convergence is proved"
**Description:** Doctor of Philosophy

Nonsmooth and derivative-free optimization based hybrid methods and applications

**Authors:** Long, Qiang **Date:** 2014 **Type:** Text, Thesis, PhD
**Description:** "In this thesis, we develop hybrid methods for solving global and in particular, nonsmooth optimization problems. Hybrid methods are becoming more popular in global optimization since they allow to apply powerful smooth optimization techniques to solve global optimization problems. Such methods are able to efficiently solve global optimization problems with large number of variables. To date global search algorithms have been mainly applied to improve global search properties of the local search methods (including smooth optimization algorithms). In this thesis we apply rather different strategy to design hybrid methods. We use local search algorithms to improve the efficiency of global search methods. The thesis consists of two parts. In the first part we describe hybrid algorithms and in the second part we consider their various applications." -- taken from Abstract.
**Description:** Operational Research and Cybernetics
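The strategy this abstract describes, a local solver used inside a global method, can be illustrated with one simple instance: a coarse global sampler proposes candidate points, and a smooth local solver refines the most promising ones, so the global phase only has to land in the right basin. The objective, box, and parameters are illustrative, not taken from the thesis:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def f(x):
    # Toy multimodal objective (illustrative only).
    x = np.asarray(x, dtype=float)
    return float(np.sum(x ** 2) + 10.0 * np.sum(1.0 - np.cos(2.0 * np.pi * x)))

# Global phase: coarse uniform sampling over a box, keeping the best few points.
dim, n_samples, n_refine = 2, 200, 5
samples = rng.uniform(-5.0, 5.0, size=(n_samples, dim))
promising = samples[np.argsort([f(s) for s in samples])[:n_refine]]

# Local phase: refine each promising sample with a local solver; the final
# answer is the best refined point.
best_x, best_f = None, np.inf
for s in promising:
    res = minimize(f, s, method="Nelder-Mead")
    if res.fun < best_f:
        best_x, best_f = res.x, res.fun
```

The division of labour is the point: the global sampler never needs the accuracy of the local solver, and the local solver is only started from a handful of points rather than everywhere.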

The cutting angle method and its applications

**Authors:** Bagirov, Adil **Date:** 2001 **Type:** Text, Thesis, PhD
**Description:** "The main objective of this thesis is to develop and study new techniques for solving global optimization problems and to apply them to solving data classification problems."
