There are many ways to consider the question: What is Artificial Intelligence (AI)?
Is it something that does not exist yet, because machines are not actually intelligent?
Is it something that does exist already, but is not really intelligent?
Is it a description of many of the programs we use today?
Does it describe a particular type of program? Are there AI-type programs and non-AI programs?
How would they differ?
To get started thinking about what AI is, I propose we think bigger.
Engineering
How about this for a definition:
AI is arguably the hardest problem in the history of engineering:
How can we build an intelligent machine?
This raises the question: Is it possible to build an intelligent machine?
If it is possible, what is the path to doing it?
This places AI within Engineering (Computer Science, Robotics).
Science
Consider another view:
AI is arguably about the hardest problem in the history of science:
How does the brain work?
This raises the question: Is there a way the brain works?
If the brain works a certain way, can we work it out and replicate that in a machine?
The problem (or the related set of problems) of
how the brain, intelligence, consciousness and "free will" work
is one of the great unsolved problems of science.
This places AI as playing an important role in Science (Cognitive Science, Neuroscience, Psychology, Ethology).
For an introduction to how science
addresses these questions, here is some reading:
The Mind's I,
Douglas R. Hofstadter and Daniel C. Dennett, 1981.
DCU Library, 155.2.
A mind-bending collection of essays exploring the possibilities
of Strong AI.
If Strong AI were true, could you be immortal?
Could you copy brains?
See chapters.
Brief talk by Marvin Minsky, Artificial Life V, 1996.
Minsky says AI is the first time we have a proper language to talk about the brain.
"Computer Science is not about computers.
It's the first time in 5000 years that we've begun to have ways to describe the kinds of machinery
that we are."
Science fiction likes to portray AIs as a threat to humanity.
Certainly you could construct such an argument.
Though it is also true that a hostile AI makes a better story, or a better joke.
Humour: "Voting Machines Elect One Of Their Own As President",
from The Onion.
Another view is that current AIs are crippled and helpless compared to living things.
They have very limited skills and limited knowledge of the world compared to any complex animal or human.
One of the issues in AI is how to give an AI anything as rich as an entire human childhood
(or even monkey childhood).
Coding an AI by hand is too hard. Machines must evolve and learn for themselves.
Robots v. Software
Software-only AIs live in an imaginary world.
Real intelligence may need the real world.
But robots are hard, while software-only AI is easy.
Long childhoods
Typical AI learning and evolution experiments are designed to run in a few hours, or a week at most.
But in humans, learning in childhood lasts 18 years.
And evolution had over 3 billion years.
1 mind v. multiple minds
Perhaps AIs should not have a single thread of control.
Instead, there could be large numbers of parallel brains struggling for control,
competing with their ideas.
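As a rough illustration of this idea (a sketch of my own, not from these notes), one could imagine many sub-minds each proposing an action with some strength, with a winner-take-all competition deciding which one controls behaviour at each moment. The names `SubMind` and `select_action` here are illustrative inventions:

```python
# A minimal sketch of winner-take-all action selection among
# competing parallel sub-minds. Each sub-mind proposes an action
# with a strength; the strongest proposal wins control.

class SubMind:
    """One of many parallel processes, each proposing an action."""
    def __init__(self, name, propose):
        self.name = name
        self.propose = propose  # function: state -> (action, strength)

def select_action(subminds, state):
    """Every sub-mind proposes in parallel; the strongest wins."""
    proposals = [(m.name, *m.propose(state)) for m in subminds]
    return max(proposals, key=lambda p: p[2])  # (name, action, strength)

# Example: hunger and fear compete for control of behaviour.
subminds = [
    SubMind("hunger", lambda s: ("seek food", s["hunger"])),
    SubMind("fear",   lambda s: ("flee",      s["danger"] * 2)),
]

state = {"hunger": 0.6, "danger": 0.4}
print(select_action(subminds, state))  # fear wins: 0.8 > 0.6
```

Real proposals along these lines (e.g. Minsky's Society of Mind) are far richer, but even this toy version shows that no single thread "decides"; control emerges from the competition.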
Discussion: Do you exist?
A question for discussion that you may not have thought about:
Do you exist?
Where is the centre of "you", of your intelligence, your feeling of being alive?
People often propose that parts of the brain do certain jobs, and then there is some HQ inside the brain
where it all comes together.