Dr. Mark Humphrys

School of Computing. Dublin City University.



History of Operating Systems




(1) In the 1940s-50s, Program = Computer

At the start, no Operating Systems.
Computers are like the abstract model of a machine in Automata theory: running one program and nothing else. The programmer operates the machine alone:

Interactive (great if you're the lone programmer).
But CPU idle for long periods (e.g. while the program is being revised, or when the program halts/crashes while the programmer is not watching).
Long wait for other programmers to get a turn on the machine.

Driving forces for change:
Lots of programmers wanting to use machine.
Computers expensive (any CPU idle time bad).



(2) In the 1950s-60s, Operator hired to run the computer.

Programmers have to submit jobs, receive results later.
Operator schedules jobs.
Jobs still stored on sequential-access media (tape); no random-access medium yet.
The operator prepares sequential tapes of jobs for the CPU to run through once.

Resident monitor: a small program kept permanently in memory. It loads each job in turn, runs it, and regains control when the job ends. This is the primitive ancestor of the OS.
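As a rough illustration, the resident monitor's main loop might be pictured like this (a minimal, runnable sketch; the job names and the toy "tape" structure are invented for illustration):

```python
# A minimal sketch of a resident monitor's main loop (names are illustrative).
# The monitor stays permanently in memory; jobs are read from a sequential
# tape, loaded, and run one after another with no human intervention per job.

def resident_monitor(tape):
    """Run every job on the input tape, strictly in order."""
    for job in tape:          # sequential access only: no skipping ahead
        print(f"loading {job['name']}")
        job["run"]()          # transfer control; control returns here on exit

# Toy "tape": an ordered list of jobs, each just a function to call.
tape = [
    {"name": "payroll",   "run": lambda: print("  payroll done")},
    {"name": "inventory", "run": lambda: print("  inventory done")},
]
resident_monitor(tape)
```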

Programming with an Operator controlling what runs:
Long queues, not interactive.
If there is an error in your program, you may have to wait days to retry it. You have to think things through in advance!

  

A program written in the language PL/I on punch cards around 1969.


Driving force for change:
Random access permanent storage medium (disk) becomes available.




(3) Pool of jobs on disk, random access.

The OS can now implement the human operator's algorithms for deciding the sequence of jobs to run.
Scheduling of jobs is now totally automated (a true OS).
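Since the pool is on a random-access disk, the OS can pick any waiting job next, according to whatever policy it likes. A minimal sketch (the job names, time estimates, and the shortest-job-first policy are all illustrative assumptions):

```python
# Automated job scheduling from a random-access pool (illustrative sketch).
# With jobs on disk, the OS can pick ANY job to run next, not just the
# next one on the tape.

jobs = [                        # the pool of jobs waiting on disk
    {"name": "simulation", "est_minutes": 120},
    {"name": "payroll",    "est_minutes": 5},
    {"name": "report",     "est_minutes": 20},
]

def pick_next_job(pool):
    """One possible operator-style policy: shortest job first."""
    return min(pool, key=lambda j: j["est_minutes"])

while jobs:
    job = pick_next_job(jobs)
    jobs.remove(job)
    print(f"running {job['name']} ({job['est_minutes']} min)")
```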

Driving force for change:
Programs read from and write to disk and other devices.
I/O device speed is much slower than CPU speed, so the CPU is still often idle while I/O is going on.



(4) Some parallelisation of I/O and computation.

A device controller is a piece of hardware that can work in parallel with the CPU.
This is not a parallel computer - the controller is just a specialised device for data transfer, not a general-purpose computer.

Example: Spooling - the print device controller is still copying data from disk to the printer while the CPU has already begun working on the next job.
The next job can begin while the previous one is still printing out.
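A minimal sketch of the spooling idea, using a thread to play the role of the print device controller (the queue-as-spool-area and the timings are illustrative):

```python
# Spooling (illustrative sketch): output is queued in a spool area and the
# "device controller" (here, a thread) drains it to the printer while the
# CPU gets on with the next job instead of waiting.

import threading, queue, time

spool = queue.Queue()               # stands in for the spool area on disk

def print_controller():
    """Device controller: slowly copies spooled data to the printer."""
    while True:
        line = spool.get()
        if line is None:            # sentinel: nothing left to print
            break
        time.sleep(0.5)             # printers are slow
        print(f"[printer] {line}")

controller = threading.Thread(target=print_controller)
controller.start()

spool.put("job 1 output, page 1")   # job 1's output is spooled...
spool.put("job 1 output, page 2")
print("[cpu] job 2 starts while job 1 is still printing")  # ...CPU moves on

spool.put(None)
controller.join()
```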

Driving forces for change:
Wait times still too long.
A long job delays everything else; it could delay other jobs by hours.
A program might do I/O half-way through its execution (rather than only at the start/end). When the program stops for this I/O, the CPU is idle.




(5) Multi-programming.

A series of breakthroughs: the beginning of modern computers.

Scheduling now has two types:

Long-term (job) scheduling: which jobs to load from disk into memory.
Short-term (CPU) scheduling: which of the jobs in memory gets the CPU next. When the running job has to wait for I/O, the CPU is switched to another job instead of going idle. (See the sketch below.)

Cartoon: CPU scheduling, explained by Julia Evans.
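A minimal sketch of the short-term scheduling idea (an illustrative simulation; the jobs and their step lists are invented):

```python
# Multi-programming (illustrative simulation): when the running job blocks
# for I/O, the CPU is immediately handed to another ready job rather than
# sitting idle until the I/O completes.

jobs = [
    {"name": "A", "steps": ["compute", "io", "compute"]},
    {"name": "B", "steps": ["compute", "compute"]},
]
ready = list(jobs)                     # the ready queue

while ready:
    job = ready.pop(0)                 # short-term scheduler: pick a ready job
    while job["steps"]:
        step = job["steps"].pop(0)
        if step == "io":
            print(f"{job['name']}: blocks on I/O, CPU switches away")
            ready.append(job)          # (simplified: rejoins the queue at once)
            break
        print(f"{job['name']}: computing on CPU")
```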


  
Driving forces for change:
The above is still a batch model (program runs, does lots of I/O with devices, exits), not interactive programs (no "user interfaces" invented yet).
But now a program could afford to wait on user input. That could be a wait of hours, but it no longer matters: the CPU simply serves other programs in the meantime.
Users could now interact with programs while they are running.

Computers cheaper - cheap dumb terminals available.
Humans' time is expensive - don't want them to wait.




(6) 1970s-80s. Interactive time-sharing on mainframes.

Multi-programming where the program may be waiting on a user.
OS will in the meantime service another program, which may be interacting with another user.
Result: Multiple users share the CPU.
If the time-slicing is quick enough, they all feel as if they have their own dedicated machine!

CPU kept busy.
Programmers happy. They can now revise their code and quickly run it again without waiting.

Above all: User interaction at run-time allows a whole world of programs that were never possible before.
"ls" and all the rest.

 


DEC VT100 terminal (1978).
This kind of computer terminal would be used to run programs on a mainframe shared with other users.
Users can now interact with programs at run-time!
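The time-slicing described above can be pictured as round-robin scheduling (a minimal, illustrative simulation; the user programs and tick counts are invented):

```python
# Time-sharing via round-robin (illustrative sketch): each user's program
# gets a short time slice in turn. Done fast enough, every user feels they
# have a dedicated machine.

from collections import deque

QUANTUM = 2                                  # time slice, in ticks
ready = deque([("alice_shell", 3), ("bob_editor", 5)])  # (name, ticks left)

while ready:
    name, remaining = ready.popleft()
    ran = min(QUANTUM, remaining)            # run for at most one quantum
    print(f"{name} runs for {ran} tick(s)")
    remaining -= ran
    if remaining > 0:
        ready.append((name, remaining))      # back of the queue: next turn
```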


Driving forces for change:
Real computers (not dumb terminals) get cheap.




(7) 1980s. Standalone PCs.

The dream at last of a real computer for every user.

An odd era, by modern standards, because there was no network.
Internet not yet important.
The user alone with their PC.

Users no longer logging in to mainframes.
Few clashes with other users.
Multi-programming mainly used so a single user can run multiple programs at the same time.

 


The standalone PC world of DOS:
First generation IBM PC (1981).



The standalone PC world of early Windows:
Clip shows Solitaire, bundled with Windows from 1990 - how office workers wasted time on PCs before the Internet.


  
Driving forces for change:
Internet becomes usable and useful.



(8) 1990s. Internet.

The Web is the killer app for the Internet, 1993.
A return to multiple users sharing one machine - this time a shared remote web server rather than a mainframe.
The web server has to time-slice many clients' requests, overlapping in time, again using multi-programming concepts.
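A minimal sketch of this in a modern setting, using Python's standard library (the port and response body are illustrative): each request is handled in its own thread, so one slow client does not stall the rest.

```python
# A web server overlapping many clients' requests (illustrative sketch).
# ThreadingHTTPServer handles each request in its own thread: the same
# multi-programming idea, applied to a shared remote server.

from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"hello from the shared server\n"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# Serve on port 8000; many clients can be in flight at once.
ThreadingHTTPServer(("", 8000), Handler).serve_forever()
```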



The shared mainframe returns:
The Web explodes in 1993, and has changed everything by the end of the 1990s.


  
Driving forces for change:
Broadband.
Smartphones.



(9) 2000s

Broadband at home: New technologies enable broadband at home (replacing dialup).
This enables the multimedia Internet.

  
Smartphones: Growth in mobile network capacity enables Internet on smartphones.
After many experiments, multi-touch explodes to take over smartphone Internet access.



The decline of dialup, 2006-08 (in Ireland).



The mobile world before 2005:
No or limited Internet.



The greatest product launch ever?
Steve Jobs introduces the iPhone at MacWorld 2007.
The modern mobile world begins.
At 5:20 of the video he introduces the modern concept of multi-touch, which had a history but went mainstream with the iPhone.


  
