Dr. Mark Humphrys

School of Computing. Dublin City University.


Online coding site: Ancient Brain




Deep learning

There has been massive growth in neural network research since around 2005.

New approaches

A series of new approaches, addressing memory, credit assignment, weight initialisation, activation functions that output more zeros, training on GPUs as parallel hardware, and other modifications, led to "deep neural networks".

Breakthrough paper for deep neural networks:
  1. "A fast learning algorithm for deep belief nets". Hinton, G. E., Osindero, S., & Teh, Y. W. (2006). Neural computation, 18(7), 1527-1554.
Weights are initialised by a form of layer-by-layer pre-training rather than randomly.

Rectifier function: Zero-output nodes

Deep learning research uncovered problems with using the Sigmoid function and other smooth, saturating functions as the activation function.

It was discovered that a much simpler activation function, the Rectifier function, has important properties for deep neural networks.
The Rectifier function is:

f(x) = max(0,x)

The "Rectifier" activation function (blue).


Impact on neural network learning: the Rectifier does not saturate for positive inputs (its gradient there is 1 rather than near zero), which helps gradients propagate through many layers, and its exact zero output for negative inputs gives sparse, cheap-to-compute activations.
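A minimal sketch of this in JavaScript, comparing the Rectifier with the Sigmoid. The function names are my own, not from any library:

```javascript
// Sigmoid and its derivative: the derivative is near zero for large |x|
// ("saturation"), which slows learning in deep networks.
function sigmoid(x) { return 1 / (1 + Math.exp(-x)); }
function sigmoidDeriv(x) { const s = sigmoid(x); return s * (1 - s); }

// Rectifier and its derivative: gradient is exactly 1 for any positive input,
// and the output is exactly 0 for any negative input (sparse activations).
function relu(x) { return Math.max(0, x); }
function reluDeriv(x) { return x > 0 ? 1 : 0; }

console.log(sigmoidDeriv(10));  // tiny: about 0.0000454
console.log(reluDeriv(10));     // 1
console.log(relu(-3));          // 0
```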


Parallel Hardware: Neural networks on GPUs

Clearly a neural network maps naturally onto parallel hardware. It consists almost entirely of simple calculations that could be done in parallel, in principle with a simple processor at each node.

It is very wasteful to implement a neural network on serial hardware.

Modern computers already have massively parallel systems for doing simple calculations: GPUs.
So implementing neural networks on GPUs became an important part of Deep Learning.
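To see why the mapping to parallel hardware is so natural, here is a sketch (in plain serial JavaScript, with made-up illustrative weights) of one layer's forward pass. Each output neuron's sum depends only on the inputs, not on the other outputs, so on a GPU every output could be computed by its own thread:

```javascript
// One layer's forward pass: weighted sums plus Rectifier activation.
// Each iteration of the outer loop is independent of the others,
// which is exactly what a GPU exploits (one thread per output neuron).
function layerForward(W, x) {
  return W.map(row => {
    let sum = 0;
    for (let i = 0; i < x.length; i++) sum += row[i] * x[i];
    return Math.max(0, sum);   // Rectifier activation
  });
}

const W = [[0.5, -1.0],        // weights into neuron 0 (illustrative numbers)
           [2.0,  1.0]];       // weights into neuron 1
const x = [1.0, 2.0];          // inputs
console.log(layerForward(W, x));  // [ 0, 4 ]
```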


Other hardware

Neural networks in JS on GPU


Click to run World: canvas webgl at Ancient Brain.


Sample applications of Neural networks

Image recognition on Ancient Brain

Click to run World: Recognise any image at Ancient Brain.
Opens in new window.

The brain

The brain has about 100 billion neurons, each with up to 15,000 connections to other neurons. (Actually these figures include the entire nervous system, distributed over the body, which can be seen as an extension of the brain.)

The adult brain of H. sapiens is the most complex known object in the universe, perhaps the most complex object that has ever existed. One brain is far more complex than the entire world telephone system / Internet (which has a smaller number of nodes, and much less connectivity).

If we considered each neuron as roughly equivalent to a simple CPU with 100 KB of memory, then we have 100 billion CPUs with 10,000 terabytes of memory in total, all working in parallel and massively interconnected with hundreds of trillions of connections.
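The arithmetic behind those totals can be checked directly:

```javascript
// 100 billion neurons, each treated as a CPU with 100 KB of memory.
const neurons = 100e9;           // 100 billion
const bytesPerNeuron = 100e3;    // 100 KB
const totalBytes = neurons * bytesPerNeuron;   // 1e16 bytes
const terabytes = totalBytes / 1e12;
console.log(terabytes);          // 10000, i.e. 10,000 terabytes

// At up to 15,000 connections per neuron, the connection count is
// on the order of a quadrillion (hundreds of trillions and up).
const connections = neurons * 15000;
console.log(connections);        // 1500000000000000
```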

It is not surprising that the brain is so complex, and at the same time that consciousness and intelligence are mysterious. What would be surprising would be if the brain were a simple object.


ancientbrain.com      w2mind.org      humphrysfamilytree.com

On the Internet since 1987.      New 200 G VPS server.

Note: Links on this site to user-generated content like Wikipedia are highlighted in red as possibly unreliable. My view is that such links are highly useful but flawed.