XOR multi-layer network on Ancient Brain
A multi-layer network can implement XOR. A single-layer network cannot.
Here is a multi-layer network that starts random and learns XOR from exemplars.
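As a sanity check on that claim, here is a minimal sketch of a two-layer network of step-threshold units that computes XOR. The weights are hand-picked for illustration, not the learned weights from this World:

```javascript
// Hand-picked weights for a tiny 2-layer network of threshold units.
// Hidden unit h1 fires on (x OR y); hidden unit h2 fires on (x AND y);
// the output fires on (h1 AND NOT h2), which is exactly XOR.
function step(s) { return s >= 0 ? 1 : 0; }

function xorNet(x, y) {
  const h1 = step(x + y - 0.5);   // OR:  fires when x + y >= 0.5
  const h2 = step(x + y - 1.5);   // AND: fires when x + y >= 1.5
  return step(h1 - h2 - 0.5);     // h1 AND NOT h2
}
```

No single threshold unit can do this, because no one line separates {(0,1),(1,0)} from {(0,0),(1,1)} in the plane.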
Click to run World: XOR multi-layer network at Ancient Brain.
- Recall XOR.
- Input is 2-dimensional, taking values 0 or 1.
- Output is 1-dimensional, 0 or 1.
- Credit:
  - This is a modified port of a neural network to do XOR by the Coding Train.
  - Code from here.
  - Uses two libraries from here.
  - See video.
  - See the long playlist of Coding Train videos, "Neural Networks", explaining the entire program and supporting libraries.
  - See the Coding Train live demo.
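For reference, the four XOR exemplars can be written as a list of input/output pairs. This is a sketch in the style of the Coding Train code; the variable name is illustrative:

```javascript
// The four XOR exemplars: 2-dimensional input (values 0 or 1),
// 1-dimensional output (0 or 1).
const trainingData = [
  { inputs: [0, 0], outputs: [0] },
  { inputs: [0, 1], outputs: [1] },
  { inputs: [1, 0], outputs: [1] },
  { inputs: [1, 1], outputs: [0] }
];
```

At each training step, one exemplar is typically picked at random and fed to the network.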
JS libraries
Colour coding
- We draw the two dimensions and show the output with colour coding between black (0) and white (1).
See the code:
fill(y * 255);
- Coding Train's x and y conventions are slightly odd: the origin (0,0) is the top LHS and y increases downwards, so (1,1) is the bottom RHS.
- Coding Train shows the entire function that the network implements - a function of real numbers from 0 to 1.
- My port has a "mask" that you can turn on to show just the corner points. Change this to show all points:
var showall = true;
- Coding Train has a different colour for every point.
- I divide the 2D area into squares. I take one point to represent each square. The square gets the colour of that point.
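The squares-and-mask scheme above can be sketched as follows. This is a standalone illustration, not the World's actual draw loop: the grid resolution, the corner test, and the stand-in netOutput() (exact XOR rather than the trained network) are all assumptions:

```javascript
// Divide the unit square into a grid of cells; each cell is coloured by
// the network's output at one representative point, mapped to greyscale
// as in fill(y * 255): black = 0, white = 1.
const resolution = 10;     // assumed grid size
const showall = false;     // the mask: false = show corner points only

// Stand-in for the trained network: exact XOR on rounded inputs.
function netOutput(x, y) {
  return Math.round(x) ^ Math.round(y);
}

// The "corner points" are the four exemplar inputs (0,0),(0,1),(1,0),(1,1).
function isCorner(x, y) {
  return (x === 0 || x === 1) && (y === 0 || y === 1);
}

const shades = [];
for (let i = 0; i < resolution; i++) {
  for (let j = 0; j < resolution; j++) {
    // One representative point per square.
    const x = i / (resolution - 1);
    const y = j / (resolution - 1);
    if (!showall && !isCorner(x, y)) continue;   // mask off non-corners
    shades.push(Math.floor(netOutput(x, y) * 255));
  }
}
// With the mask on, only the 4 corners get a shade: [0, 255, 255, 0].
```

In the real World the shade would be passed to fill() and a rect() drawn per square; here we just collect the greyscale values.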
Initial weights
- I have changed the code to allow different experiments with initial weights.
- Note that in nn.js there are calls to Matrix.randomize().
- I edited Matrix.randomize() in matrix.js so that it calls a function randomWeight().
- We then define randomWeight() in the tweaker's box, so we can experiment with different weight initialisations.
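As a sketch, the randomWeight() definition in the tweaker's box might look like this. The default uniform range is an assumption, and constantWeight() is a hypothetical alternative to illustrate the kind of experiment possible:

```javascript
// Weight initialisation, called by the edited Matrix.randomize().
// Default experiment (assumed): uniform in [-1, 1).
function randomWeight() {
  return Math.random() * 2 - 1;
}

// Hypothetical alternative for comparison: every weight starts equal.
// (A classic failed experiment: identical hidden units stay identical,
// so the symmetry never breaks and the network cannot learn XOR.)
function constantWeight() {
  return 0.5;
}
```

Swapping one definition for the other changes every initial weight in the network without touching nn.js or matrix.js again.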