I'm learning about neural networks using the NEAT algorithm. To understand the results, I'm trying to reproduce the calculations for a few values by hand in Excel.
Running the XOR demo of NEAT-Python, I get the following results and the corresponding graph:
```
Fitness: 3.93262784315
Nodes:
    0    DefaultNodeGene(key=0, bias=-4.28711170619, response=1.0, activation=sigmoid, aggregation=sum)
    1197 DefaultNodeGene(key=1197, bias=-1.20497802851, response=1.0, activation=sigmoid, aggregation=sum)
Connections:
    DefaultConnectionGene(key=(-2, 0), weight=-1.83815166978, enabled=True)
    DefaultConnectionGene(key=(-2, 1197), weight=1.7368731859, enabled=True)
    DefaultConnectionGene(key=(-1, 0), weight=5.72212927441, enabled=True)
    DefaultConnectionGene(key=(-1, 1197), weight=-1.30171417401, enabled=True)
    DefaultConnectionGene(key=(1197, 0), weight=9.27762332932, enabled=True)
Output:
input (0.0, 0.0), expected output (0.0,), got [5.485892844789737e-10]
input (0.0, 1.0), expected output (1.0,), got [0.9999970321654681]
input (1.0, 0.0), expected output (1.0,), got [0.9992352946960754]
input (1.0, 1.0), expected output (0.0,), got [0.2595603437828634]
```
Although I understand the basics of neural networks, I can't reproduce the results shown on the command line. First, I calculate the value of node 1197 for the input (0, 0):
```
-1.3 * 0 + 1.7 * 0 - 1.2 (bias) = -1.2
sigmoid(-1.2) = 0.231475...   (result of node 1197)
```
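The same hidden-node step can be checked with the full-precision genome values. This is a minimal sketch assuming the plain logistic function 1/(1+e^-z); the activation NEAT-Python actually applies may be scaled or clamped differently:

```python
import math

def sigmoid(z):
    # plain logistic function -- an assumption, not necessarily
    # the exact activation NEAT-Python uses internally
    return 1.0 / (1.0 + math.exp(-z))

# values copied from the printed genome
bias_1197 = -1.20497802851
w_from_in1 = -1.30171417401  # connection (-1, 1197)
w_from_in2 = 1.7368731859    # connection (-2, 1197)

x1, x2 = 0.0, 0.0            # input (0, 0)
h = sigmoid(w_from_in1 * x1 + w_from_in2 * x2 + bias_1197)
print(h)                     # ~0.2306, close to the 0.231475 from the rounded hand calculation
```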
And then the final result:
```
5.7 * 0 + 9.3 * 0.231475 (result node 1197) - 1.8 * 0 - 4.3 (bias) = -2.147282...
sigmoid(-2.147282) = 0.104585...
```
This is definitely not equal to the expected 5.485892844789737e-10. Can anyone find an error in my calculations?
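For completeness, here is the whole two-step forward pass with the full-precision weights, under the same plain-logistic assumption as in my hand calculation (if the library applies a scaled or clamped sigmoid, its output will of course differ):

```python
import math

def sigmoid(z):
    # plain logistic function -- an assumption about the activation used
    return 1.0 / (1.0 + math.exp(-z))

def forward(x1, x2):
    # hidden node 1197: fed by inputs -1 and -2
    h = sigmoid(-1.30171417401 * x1 + 1.7368731859 * x2 - 1.20497802851)
    # output node 0: fed by inputs -1, -2 and node 1197
    return sigmoid(5.72212927441 * x1 - 1.83815166978 * x2
                   + 9.27762332932 * h - 4.28711170619)

print(forward(0.0, 0.0))  # ~0.1045, matching my hand result, not the reported 5.49e-10
```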