Thanks for the input, guys.

I'll check out those sites. I've started implementing an XML-driven backpropagation network. The network is defined by a specific XML schema, and the training data will also be provided in XML. I'll be posting all the code on the Cerebral Bicycle site once I've got it working. The schema I have is this:

Code:
<NeuralNet>
  <ActivationFunction></ActivationFunction>
  <Layers>
    <Layer>
      <Neuron>
        <Inputs>
          <Input layer="" neuron="" weight=""/>
        </Inputs>
      </Neuron>
    </Layer>
  </Layers>
</NeuralNet>
In the Input element, layer is the zero-based index of the layer we are taking input from, and neuron is the zero-based index of the neuron within that layer. This will allow me to define quite complex multi-layer networks.
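
To give an idea of how that gets consumed, here's a rough Python sketch of loading the schema (the element and attribute names come from the schema above; the surrounding structure is just a placeholder, and the real code may well look different):

Code:
import xml.etree.ElementTree as ET

def load_network(path):
    """Parse the NeuralNet XML into (activation name, layers).

    Each layer is a list of neurons, and each neuron is a list of
    (layer index, neuron index, weight) tuples taken from its <Input> tags.
    """
    root = ET.parse(path).getroot()                     # <NeuralNet>
    activation_name = root.findtext("ActivationFunction", default="sigmoid")
    layers = []
    for layer_el in root.find("Layers").findall("Layer"):
        neurons = []
        for neuron_el in layer_el.findall("Neuron"):
            inputs_el = neuron_el.find("Inputs")
            # The input layer's neurons may have no <Inputs> at all.
            inputs = [] if inputs_el is None else [
                (int(i.get("layer")), int(i.get("neuron")), float(i.get("weight")))
                for i in inputs_el.findall("Input")
            ]
            neurons.append(inputs)
        layers.append(neurons)
    return activation_name, layers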

The ActivationFunction might be a piece of script or just the name of a predefined function.
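
For the predefined case, I'm picturing a simple name-to-function lookup, something like this (the names here are just examples, not a final list; scripted activations would need an embedded script engine instead):

Code:
import math

# Example table of predefined activation functions, keyed by the name
# that appears in <ActivationFunction>.
ACTIVATIONS = {
    "sigmoid": lambda x: 1.0 / (1.0 + math.exp(-x)),
    "tanh":    math.tanh,
    "relu":    lambda x: max(0.0, x),
}

def resolve_activation(name):
    return ACTIVATIONS[name.strip().lower()]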

I have the feed-forward mode written; I just need to do the training part.
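
For reference, the feed-forward pass over the structure loaded above boils down to something like this (again just a sketch, assuming layer 0 is the input layer and its neurons have no <Inputs>):

Code:
def feed_forward(layers, activation, input_values):
    # outputs[l][n] holds the output of neuron n in layer l; layer 0 just
    # echoes the raw input values.
    outputs = [list(input_values)]
    for layer in layers[1:]:
        layer_out = []
        for neuron_inputs in layer:
            # Weighted sum over whichever (layer, neuron) pairs this
            # neuron's <Input> tags point at.
            total = sum(outputs[l][n] * w for (l, n, w) in neuron_inputs)
            layer_out.append(activation(total))
        outputs.append(layer_out)
    return outputs[-1]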

As for GAs breeding NNs, this is something I played with back at uni (I did my dissertation on GAs and PBILs). It's a fantastic technique that lets you breed the layout of the network rather than defining the paths manually. If I get a chance I'll implement that as well and release all the code with it. However, that might have to wait for now :cry:

Dean