Using Genetic Algorithms with Neural Networks

This essay is a very basic introduction to using genetic algorithms to help train neural networks. I have also written a simple accompanying program that uses a genetic algorithm to evolve an XOR network.

Evolving Weights

This is the most common use of a genetic algorithm in conjunction with a neural network. Since genetic algorithms are excellent at searching a state-space, searching for neural network weights is an ideal application.

Simply set up the genetic algorithm to evolve a string of floating-point numbers (within a range that you specify) that can be used as weights for the network. The biggest trouble with using a GA is specifying the range of the weights. Since we generally don't know the range, we have to estimate it and use trial and error to correct/optimize it. Generally, network weights should not be too big - for example, the XOR net weights never exceed 3 in absolute value.
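
For illustration, here is a minimal sketch of such a genome in C++ - a flat string of floats drawn from a user-chosen range. The function and variable names are assumptions for this example, not taken from the XORGA source:

#include <cstdlib>
#include <vector>

// Weight genome sketch: a flat string of floating-point numbers, each
// initialised within a user-specified range (e.g. [-3, 3]).
std::vector<float> randomGenome(int numWeights, float minWeight, float maxWeight)
{
    std::vector<float> genome(numWeights);
    for (int i = 0; i < numWeights; ++i)
        genome[i] = minWeight + (maxWeight - minWeight) * (std::rand() / (float)RAND_MAX);
    return genome;
}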

Another problem with genetic algorithms and neural networks is figuring out an appropriate method of reproducing and crossing over the weights. It all depends on how you have set up your weights. Going back to our XOR net example, the network is small and the weights are easily represented by a 3x3 array:

m_fWeights[3][3]

0	0	0
0	0	0
0	0	0

For this kind of setup, I swap over groups of weights. In the XORGA program, I select two population members, one from the lower-error (LE) half and another from the higher-error (HE) half. I then give the HE member the LE member's weights for the final layer, and put it back into the population.
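
A rough sketch of that swap follows (the struct layout, and the assumption that each row of the 3x3 array holds one layer's weights, are mine for illustration only):

struct Individual
{
    float m_fWeights[3][3];   // weights laid out as in the XOR example
    float m_fError;           // network error used to rank the population
};

// Group-wise crossover: the higher-error (HE) individual receives the
// lower-error (LE) individual's final-layer weights.
void crossFinalLayer(const Individual& lowError, Individual& highError)
{
    const int finalLayer = 2;   // last row taken as the output layer (an assumption)
    for (int i = 0; i < 3; ++i)
        highError.m_fWeights[finalLayer][i] = lowError.m_fWeights[finalLayer][i];
}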

Yet if your neural network is a lot more complicated, your representation of the weights will not be as simple. However you decide to do it, it is recommended that you keep weights grouped together; otherwise the resultant weights are as good as random.

Mutation is also a genetic operator you should consider. In the example program there is a rather high (10%) chance of mutation, and a mutated weight is altered by a random amount between -1 and 1.
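
A sketch of that mutation step might look like the following (whether the 10% chance applies per weight or per individual is not stated, so per-weight is assumed here):

#include <cstdlib>

// Mutation sketch: with a 10% chance, nudge a weight by a random amount
// in [-1, 1]. Applying the chance per weight is an assumption.
void mutate(float weights[3][3], float mutationRate = 0.10f)
{
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            if (std::rand() / (float)RAND_MAX < mutationRate)
                weights[i][j] += 2.0f * (std::rand() / (float)RAND_MAX) - 1.0f;
}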

When to Use

Genetic algorithms are an option, but they are by no means always the best option. Applied to the XOR problem, the genetic algorithm is a lot slower than back-propagation, although it gives more accurate results than the back-propagation example used:

Back-Propagation:
0,0 = 0.0494681
0,1 = 0.955633
1,0 = 0.942529
1,1 = 0.0433488 

Genetic Algorithm:
0 xor 0 = 2.47602e-005
0 xor 1 = 0.997028
1 xor 0 = 0.999292
1 xor 1 = 0.010474

The genetic algorithm finds more accurate results, but back-propagation is close to instantaneous, whereas the genetic algorithm takes anywhere between 5 and 20 seconds (on a 233 MHz test computer).

Architecture

Since the overall architecture of the network is crucial to its operation, a lot of research has focused on using evolutionary techniques to evolve the best architecture (much like the evolution of our own brains). One simple method is to use a boolean NxN matrix, where N is the number of neurons in the network. Any given place in the matrix refers to a connection between neuron X and neuron Y. For example, for the XOR net:

   1  2  3  4  5  
1  0  0  1  1  0
2  0  0  1  1  0
3  0  0  0  0  1
4  0  0  0  0  1
5  0  0  0  0  0
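
In code, the same encoding could be stored as a boolean array and evolved bit by bit; the layout below mirrors the table above (the variable names are illustrative):

// Connectivity-matrix encoding: conn[x][y] == 1 means a connection runs
// from neuron x+1 to neuron y+1, matching the 5x5 table above. A GA would
// evolve these N*N bits directly as the architecture genome.
const int N = 5;
bool conn[N][N] = {
    {0, 0, 1, 1, 0},   // neuron 1 feeds neurons 3 and 4
    {0, 0, 1, 1, 0},   // neuron 2 feeds neurons 3 and 4
    {0, 0, 0, 0, 1},   // neuron 3 feeds neuron 5
    {0, 0, 0, 0, 1},   // neuron 4 feeds neuron 5
    {0, 0, 0, 0, 0}    // neuron 5 (the output) feeds nothing
};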

This is a very simple method, and it becomes inefficient for the large networks that architectural optimization is often applied to. For more information, see Melanie Mitchell's book An Introduction to Genetic Algorithms.

Conclusion

These are only two applications of GAs to neural networks - other areas include function minimization, local-minima avoidance and other "tweaking" techniques. Remember that any parameter can be evolved by a genetic algorithm, but how much it affects the overall performance of the network varies from parameter to parameter - the weights are obviously the most important. Only use a genetic algorithm when other training methods are inefficient or impractical, or when you feel that a genetic algorithm will provide an advantage over them.

Submitted: 31/05/2000

Article content copyright © James Matthews, 2000.