Then the new population is generated; set P = NewP and G = G + 1, and return to Step 4.

Step 10. Obtain the optimal neural network structure; the iteration of the genetic algorithm terminates, which means the optimization has stopped.

Step 11. Because the new neural network's weight learning is not yet sufficient, use the LMS method to further learn the weights (a minimal sketch of this refinement step follows). End of the algorithm.

The significance of establishing the new model is to optimize the neural network structure, determine the number of hidden-layer neurons and the centers of the basis functions, and optimize the connection weights and thresholds, so as to improve training speed and convergence, save network running time, and thus improve the operating efficiency of the network and its ability to deal with problems.
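For illustration, here is a minimal C++ sketch of the Step 11 refinement, assuming a Gaussian basis function in the hidden layer; the names basis and lmsEpoch are illustrative, not from the paper. Each epoch applies the LMS update w_j += eta * (d - y) * h_j to the hidden-to-output weights.

#include <cmath>
#include <vector>

// Gaussian basis function centred at c with width sigma.
double basis(const std::vector<double>& x, const std::vector<double>& c, double sigma) {
    double d2 = 0.0;
    for (std::size_t i = 0; i < x.size(); ++i)
        d2 += (x[i] - c[i]) * (x[i] - c[i]);
    return std::exp(-d2 / (2.0 * sigma * sigma));
}

// One LMS epoch over the training set: for each sample, compute the network
// output y = sum_j w_j * h_j and nudge each weight along the error gradient.
void lmsEpoch(const std::vector<std::vector<double>>& X, const std::vector<double>& D,
              const std::vector<std::vector<double>>& centers, double sigma,
              double eta, std::vector<double>& w) {
    for (std::size_t n = 0; n < X.size(); ++n) {
        std::vector<double> h(centers.size());
        double y = 0.0;
        for (std::size_t j = 0; j < centers.size(); ++j) {
            h[j] = basis(X[n], centers[j], sigma);
            y += w[j] * h[j];
        }
        const double err = D[n] - y;
        for (std::size_t j = 0; j < centers.size(); ++j)
            w[j] += eta * err * h[j];  // LMS update; the paper uses eta = 0.1
    }
}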

4. Experiment

In order to verify the validity of the new algorithm, we compare several algorithms, labeled as follows. The classical RBF algorithm, which uses the least mean square (LMS) method to solve the weights from the hidden layer to the output layer, is denoted RBF. The algorithm that uses the GA to optimize the network structure and weights of the RBF network simultaneously is denoted GA-RBF. The algorithm that then applies the LMS method for further weight learning is denoted GA-RBF-L. Each algorithm is trained on the training samples and tested on the simulation samples, yielding six measurement indexes: training success rate, training error, test error, classification accuracy rate, number of hidden neurons, and operation time, by which we measure the merits of the algorithms. The relationship among the three variants is sketched below.
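As a rough illustration only, the three pipelines can be related as follows; Network, initRBF, gaOptimise, and lmsTrain are hypothetical stand-ins for the stages described above, not code from the paper.

#include <cstdio>
#include <string>

// Stub network type; a real implementation would hold centres, widths, and weights.
struct Network { std::string stages; };

Network initRBF()             { return {"init"}; }
Network lmsTrain(Network n)   { n.stages += "+LMS"; return n; }  // solve/refine output weights
Network gaOptimise(Network n) { n.stages += "+GA";  return n; }  // evolve structure and weights

enum class Variant { RBF, GA_RBF, GA_RBF_L };

// Dispatch mirroring the three compared variants.
Network train(Variant v) {
    Network net = initRBF();
    if (v == Variant::RBF) return lmsTrain(net);      // RBF: LMS only
    net = gaOptimise(net);                            // GA-RBF: GA on structure and weights
    if (v == Variant::GA_RBF_L) net = lmsTrain(net);  // GA-RBF-L: LMS refinement after GA
    return net;
}

int main() { std::printf("%s\n", train(Variant::GA_RBF_L).stages.c_str()); }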

4.1. Test Preparation

When using the LMS method to further learn the weights, the maximum number of iterations is 3,000 and the learning rate is 0.1; the maximum size of the neural network is 90 hidden neurons. The maximum number of GA iterations is 600, the population size is 50, the crossover rate is 0.9, and the mutation rate is 0.01 (a minimal sketch of a GA loop with these settings appears at the end of this section). We use C++ and Matlab for hybrid programming. In order to better illustrate the validity of the new algorithm, we use two UCI data sets for testing: one is the waveform database generator (V2) data set [17], and the other is the wine data set [18]. The experiments are run on an Intel Core 2 Duo CPU E7300 at 2.66 GHz with 1.99 GB of RAM.

4.2. Test 1

The waveform database generator (V2) data set has 5,000 samples, each with 40 features, and is used for waveform classification. In this paper, we select the first 600 samples for testing: 500 as training samples and the remaining 100 as simulation samples. Every algorithm repeats the test 50 times, and the best result is recorded. The results of each algorithm are listed in Table 1.

Table 1: The comparison of the performance of each algorithm for the waveform database.

4.3. Test 2

In order to further verify the validity of the new algorithm, we use another UCI standard data set to test it and also to verify its generalization ability. The wine data set has 178 samples, 13 features, and 3 classes.
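As referenced in Section 4.1, here is a minimal, compilable sketch of a generational GA loop using the stated settings. The bit-per-neuron encoding and the evaluate function are simplified placeholders for the paper's chromosome (which also carries weights and thresholds) and its error-based fitness.

#include <cstdlib>
#include <ctime>
#include <vector>

struct Individual {
    std::vector<int> genes;   // one on/off flag per candidate hidden neuron
    double fitness = 0.0;
};

// Placeholder fitness: the paper's fitness depends on network error (and size);
// here we simply favour smaller networks so the skeleton runs end to end.
double evaluate(const Individual& ind) {
    int on = 0;
    for (int g : ind.genes) on += g;
    return 1.0 / (1.0 + on);
}

double rnd() { return std::rand() / (RAND_MAX + 1.0); }

int main() {
    const int popSize = 50, maxGen = 600, maxNeurons = 90;  // settings from Section 4.1
    const double pc = 0.9, pm = 0.01;                       // crossover / mutation rates
    std::srand(static_cast<unsigned>(std::time(nullptr)));

    std::vector<Individual> pop(popSize);
    for (auto& ind : pop) {
        ind.genes.resize(maxNeurons);
        for (int& g : ind.genes) g = std::rand() % 2;
        ind.fitness = evaluate(ind);
    }

    for (int gen = 0; gen < maxGen; ++gen) {        // G = G + 1 each pass
        std::vector<Individual> newPop;
        while ((int)newPop.size() < popSize) {
            // Binary tournament selection.
            const Individual& a = pop[std::rand() % popSize];
            const Individual& b = pop[std::rand() % popSize];
            Individual child = (a.fitness > b.fitness) ? a : b;
            // Single-point crossover with probability pc.
            if (rnd() < pc) {
                const Individual& mate = pop[std::rand() % popSize];
                for (int i = std::rand() % maxNeurons; i < maxNeurons; ++i)
                    child.genes[i] = mate.genes[i];
            }
            // Bit-flip mutation with probability pm per gene.
            for (int& g : child.genes)
                if (rnd() < pm) g = 1 - g;
            child.fitness = evaluate(child);
            newPop.push_back(child);
        }
        pop = newPop;                               // P = NewP (Step 9)
    }
    return 0;
}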
