A hybrid neural learning algorithm using evolutionary learning and derivative free local search method
- Authors: Ghosh, Ranadhir, Yearwood, John, Ghosh, Moumita, Bagirov, Adil
- Date: 2006
- Type: Text, Journal article
- Relation: International Journal of Neural Systems Vol. 16, no. 3 (2006), p. 201-213
- Full Text: false
- Reviewed:
- Description: In this paper we investigate a hybrid model based on the Discrete Gradient method and an evolutionary strategy for determining the weights of a feed-forward artificial neural network, and we discuss several variants of such hybrid models. The Discrete Gradient method has the advantage of being able to jump over many shallow local minima and find very deep local minima. However, earlier research has shown that a good starting point can further improve the quality of the solution point it reaches. Evolutionary algorithms are well suited to global optimisation problems; nevertheless, they suffer from long training times and are often unsuitable for real-world applications. For optimisation problems such as weight optimisation for ANNs in real-world applications, the dimension is large and time complexity is critical, so a hybrid model is a suitable option. In this paper we propose different fusion strategies for hybrid models that combine the evolutionary strategy with the Discrete Gradient method to obtain an optimal solution much more quickly. Three fusion strategies are discussed: a linear hybrid model, an iterative hybrid model and a restricted local search hybrid model. Comparative results for the different fusion hybrid models are provided on a range of standard datasets. © World Scientific Publishing Company.
- Description: C1
- Description: 2003001712
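The linear fusion strategy summarised in the abstract above (evolutionary search to find a good starting point, followed by a derivative-free local refinement) can be sketched roughly as follows. This is an illustrative toy, not the paper's method: a simple coordinate search stands in for the Discrete Gradient stage, and a Rastrigin-style function stands in for an ANN's multimodal training error; all function names here are hypothetical.

```python
import math
import random

def loss(w):
    # Toy multimodal surrogate for an ANN's training error (Rastrigin-like,
    # nonnegative, with many local minima and a global minimum of 0 at the origin).
    return sum(x * x - 10 * math.cos(2 * math.pi * x) + 10 for x in w)

def evolutionary_stage(dim, pop_size=20, generations=50, seed=0):
    """Simple truncation-selection evolutionary strategy: returns the best
    individual found, to serve as a starting point for the local stage."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=loss)
        parents = pop[: pop_size // 2]
        # Each child is a Gaussian mutation of a randomly chosen parent.
        children = [[w + rng.gauss(0, 0.3) for w in rng.choice(parents)]
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return min(pop, key=loss)

def coordinate_search(w, step=0.5, tol=1e-6):
    """Derivative-free local refinement (a crude stand-in for the Discrete
    Gradient method): probe each coordinate in both directions, halving the
    step whenever no move improves the loss."""
    w = list(w)
    best = loss(w)
    while step > tol:
        improved = False
        for i in range(len(w)):
            for delta in (step, -step):
                cand = list(w)
                cand[i] += delta
                f = loss(cand)
                if f < best:
                    w, best, improved = cand, f, True
        if not improved:
            step *= 0.5
    return w, best

# Linear hybrid: one evolutionary pass, then one local refinement pass.
start = evolutionary_stage(dim=3)
w_opt, f_opt = coordinate_search(start)
```

An iterative hybrid, in this sketch, would simply alternate the two stages, re-seeding the population around the refined point each round.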
A fully automated breast cancer recognition system using discrete-gradient based clustering and multi category feature selection
- Authors: Ghosh, Ranadhir, Ghosh, Moumita, Yearwood, John
- Date: 2005
- Type: Text, Journal article
- Relation: Journal of Advanced Computational Intelligence and Intelligent Informatics Vol. 9, no. 3 (2005), p. 244-256
- Full Text: false
- Reviewed:
- Description: Advances in machine intelligence have opened a whole new window of opportunities in medical research. Building a fully automated computer-aided diagnostic (CAD) system for digital mammograms is one of them. Given the earlier success of semi-automated systems, a fully automated CAD system is the natural next step. Combining a feature selection model with a classifier for those areas of a mammogram marked by radiologists has been very successful. However, a fully automated system with only these two modules is time-consuming, because the suspicious areas in a mammogram can be quite small compared to the whole image; an additional clustering stage can therefore reduce the time complexity of the overall process. In this paper we propose a fast clustering process to identify suspicious areas. Another novelty of this paper is a multi-category feature selection approach. The choice of features used to represent the patterns affects several aspects of a pattern recognition problem, such as accuracy, required learning time and the required number of samples. We propose a hybrid canonical-based feature extraction technique that combines an evolutionary-algorithm-based classifier with a feed-forward MLP model.
- Description: C1
- Description: 2003001358
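The idea of a cheap clustering pass that narrows the search to suspicious regions before the expensive classification stage can be illustrated with a toy sketch. This is not the paper's discrete-gradient-based clustering: a plain 1-D k-means on pixel intensities stands in for it, and the "image" is a synthetic intensity list; all names here are hypothetical.

```python
def kmeans_1d(values, k=2, iters=20):
    """Plain 1-D k-means: initialise centers evenly across the value range,
    then alternate assignment and mean-update steps."""
    lo_v, hi_v = min(values), max(values)
    centers = [lo_v + i * (hi_v - lo_v) / (k - 1) for i in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda c: abs(v - centers[c]))
            groups[nearest].append(v)
        # Keep the old center if a cluster ends up empty.
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers

# Toy 1-D "mammogram": mostly dark background with one small bright patch.
pixels = [0.1] * 90 + [0.9] * 10
lo, hi = sorted(kmeans_1d(pixels))
# Only pixels closer to the bright cluster are passed on as "suspicious",
# so the downstream feature-selection/classifier stages see far fewer pixels.
suspicious = [i for i, v in enumerate(pixels) if abs(v - hi) < abs(v - lo)]
```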
Comparative analysis of genetic algorithm, simulated annealing and cutting angle method for artificial neural networks
- Authors: Ghosh, Ranadhir, Ghosh, Moumita, Yearwood, John, Bagirov, Adil
- Date: 2005
- Type: Text, Journal article
- Relation: Machine Learning and Data Mining in Pattern Recognition, Proceedings Vol. 3587 (2005), p. 62-70
- Full Text: false
- Reviewed:
- Description: Learning is the essence of an artificial neural network (ANN), and many problems arise from the multiple local minima of its error surface. Global optimisation methods are capable of finding the globally optimal solution. In this paper we present a comparative study of the effects of probabilistic and deterministic global search methods for artificial neural networks with a fully connected feed-forward multi-layer perceptron architecture. We investigate two probabilistic global search methods, namely the genetic algorithm and simulated annealing, and a deterministic cutting angle method for finding the weights of a neural network. Experiments were carried out on UCI benchmark datasets.
- Description: C1
- Description: 2003003398
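Of the methods compared in the record above, simulated annealing is the simplest to sketch as a weight-search procedure. The following is an illustrative toy, not the paper's experimental setup: a Rastrigin-style function stands in for an ANN's training error, and all parameter values and names are hypothetical.

```python
import math
import random

def loss(w):
    # Toy multimodal surrogate for an ANN's training error (Rastrigin-like).
    return sum(x * x - 10 * math.cos(2 * math.pi * x) + 10 for x in w)

def simulated_annealing(dim, t0=10.0, cooling=0.99, steps=5000, seed=1):
    """Minimise `loss` over `dim` weights: always accept downhill moves,
    accept uphill moves with Boltzmann probability exp(-delta/T), and
    geometrically cool the temperature T each step."""
    rng = random.Random(seed)
    w = [rng.uniform(-5, 5) for _ in range(dim)]
    f, t = loss(w), t0
    best_w, best_f = list(w), f
    for _ in range(steps):
        cand = [x + rng.gauss(0, 0.5) for x in w]
        fc = loss(cand)
        # Downhill: accept. Uphill: accept with probability exp((f - fc) / t).
        if fc < f or rng.random() < math.exp((f - fc) / t):
            w, f = cand, fc
            if f < best_f:
                best_w, best_f = list(w), f
        t *= cooling
    return best_w, best_f

best_w, best_f = simulated_annealing(dim=3)
```

A genetic algorithm would replace the single-point proposal loop with a population, selection and crossover; the cutting angle method, being deterministic, has no direct analogue in this sketch.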
Hybridization of neural learning algorithms using evolutionary and discrete gradient approaches
- Authors: Ghosh, Ranadhir, Yearwood, John, Ghosh, Moumita, Bagirov, Adil
- Date: 2005
- Type: Text, Journal article
- Relation: Journal of Computer Science Vol. 1, no. 3 (2005), p. 387-394
- Full Text: false
- Reviewed:
- Description: In this study we investigated a hybrid model based on the Discrete Gradient method and an evolutionary strategy for determining the weights of a feed-forward artificial neural network, and we discuss several variants of such hybrid models. The Discrete Gradient method has the advantage of being able to jump over many shallow local minima and find very deep local minima. However, earlier research has shown that a good starting point can further improve the quality of the solution point it reaches. Evolutionary algorithms are well suited to global optimisation problems; nevertheless, they suffer from long training times and are often unsuitable for real-world applications. For optimisation problems such as weight optimisation for ANNs in real-world applications, the dimension is large and time complexity is critical, so a hybrid model is a suitable option. In this study we propose different fusion strategies for hybrid models that combine the evolutionary strategy with the Discrete Gradient method to obtain an optimal solution much more quickly. Three fusion strategies are discussed: a linear hybrid model, an iterative hybrid model and a restricted local search hybrid model. Comparative results for the different fusion hybrid models are provided on a range of standard datasets.
- Description: C1
- Description: 2003001357