XOR multi-layer network on Ancient Brain
A multi-layer network can implement XOR. A single-layer network cannot.
Here is a multi-layer network that starts random and learns XOR from exemplars.
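Before looking at the learning version, it may help to see why two layers are enough. The following is a minimal hand-wired sketch (my own illustration, not the Coding Train code): a 2-2-1 network with step activations whose hidden units compute OR and AND, so the output fires exactly when OR is on and AND is off, i.e. XOR.

```javascript
// Hand-wired 2-2-1 network with step activations that implements XOR.
// (Illustrative sketch; the actual World learns these weights instead.)
const step = (v) => (v > 0 ? 1 : 0);

function xorNet(x1, x2) {
  const h1 = step(x1 + x2 - 0.5);   // hidden unit 1: OR(x1, x2)
  const h2 = step(x1 + x2 - 1.5);   // hidden unit 2: AND(x1, x2)
  return step(h1 - h2 - 0.5);       // output: OR AND NOT AND = XOR
}

// Check all four exemplars:
for (const [a, b] of [[0, 0], [0, 1], [1, 0], [1, 1]]) {
  console.log(a, b, "->", xorNet(a, b));
}
```

No single-layer (no-hidden-unit) choice of weights can reproduce this truth table, because XOR is not linearly separable.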
Click to run World: XOR multi-layer network at Ancient Brain.
What we are looking at in a run
- Recall XOR.
- Input is 2-dimensional, taking values 0 or 1.
- Output is 1-dimensional, 0 or 1.
- We draw the 2 input dimensions as a plane and colour-code the output from black (0) to white (1).
See the code:
fill ( y * 255 );
- Coding Train draws x and y slightly oddly: the origin (0,0) is at the top LHS and y increases downwards, so (1,1) is at the bottom RHS.
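This is the standard canvas convention. A tiny sketch (my own, with assumed canvas dimensions) of mapping a logical point in [0,1] x [0,1] to pixels makes the orientation concrete:

```javascript
// Map a logical point (x, y) in [0,1]^2 to canvas pixels.
// Canvas origin is top-left and py grows downward, so no flip is applied:
// y = 1 lands at the bottom of the canvas.
function toPixel(x, y, width, height) {
  return { px: x * width, py: y * height };
}

// (1,1) lands at the bottom right-hand corner of a 400x400 canvas:
console.log(toPixel(1, 1, 400, 400));  // { px: 400, py: 400 }
```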
- Coding Train also shows the entire function that the network implements: a function of real numbers from 0 to 1.
My port has a "mask" that you can turn on to just show the corner points:
var showall = false;
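The mask might be implemented along these lines (a sketch of the idea, not the actual World code; the helper names are my own):

```javascript
var showall = false;   // tweaker's box flag: false = show corner points only

// A point is a "corner" when both coordinates are exactly 0 or 1,
// i.e. one of the four XOR exemplars.
function isCorner(x, y) {
  return (x === 0 || x === 1) && (y === 0 || y === 1);
}

// Gate the drawing loop: plot everything when showall is on,
// otherwise only the four exemplar corners.
function shouldDraw(x, y) {
  return showall || isCorner(x, y);
}
```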
Changes to code
- I made quite a few changes in the port.
- See the "tweaker's box" at the top.
- I have changed the code to allow different experiments with initial weights:
- Note: in nn.js there are calls to Matrix.randomize().
- I edited Matrix.randomize() in matrix.js so it calls a function randomWeight().
- We then define randomWeight() in the tweaker's box, and we can experiment with different weight initialisations.
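For example, randomWeight() might be defined in any of the following ways (a sketch of the kind of experiments meant; only randomWeight itself comes from the source, the variant names are my own):

```javascript
// Uniform in [-1, 1) -- a common default initialisation.
function randomWeightUniform() {
  return Math.random() * 2 - 1;
}

// Small Gaussian-distributed weights via the Box-Muller transform.
function randomWeightGaussian() {
  const u = 1 - Math.random();        // shift to (0,1] to avoid log(0)
  const v = Math.random();
  return 0.1 * Math.sqrt(-2 * Math.log(u)) * Math.cos(2 * Math.PI * v);
}

// All weights start at zero -- learning typically stalls,
// since symmetric hidden units receive identical updates.
function randomWeightZero() {
  return 0;
}
```

Swapping which definition randomWeight() delegates to lets you compare how each initialisation affects learning.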