A Brief History of Computing
Origins and Development of Computer Systems
Early Computing Machines
The early computing machines were primarily
devices for performing arithmetic calculations such as addition and multiplication.
- 3000 BCE: the Abacus
- 1642: Blaise Pascal and the Pascaline
- 1804: Joseph-Marie Jacquard and the programmable loom
- 1822: Charles Babbage, Augusta Ada Lovelace, and the Difference and Analytical Engines
- 1889: Herman Hollerith and Punch Cards
- 1936: Alan Turing and the Turing Machine
The First Generation of Electronic Computers
In the 1940's the first electronic computers were designed and implemented.
These machines would typically fill an entire room and were constructed using
thousands of vacuum tubes.
- 1940: John V. Atanasoff
- 1941: Konrad Zuse
- 1943: the Colossus
- 1944: the ENIAC
- 1945: John von Neumann, the EDVAC, and the Stored Program Computer
- 1951: the ERA 1103 (designed partly by Seymour Cray)
The Second Generation: Transistor-Based Computers
In the late 50's, the transistor replaced the vacuum tube, and computers
became smaller, faster, and more reliable. The 60's saw the development of
integrated circuits; later, Very Large Scale Integration (VLSI) chips would
contain hundreds of thousands of transistors.
- 1956: the Stretch and LARC computers, built using transistors
- 1956: Univac 1103A
- 1958: IBM 1401 and IBM 7094
The Third Generation: Integrated Circuits, Mainframes, Minicomputers, and Operating Systems
- Mainframes (leading to supercomputers)
- 1960: CDC 1604 mainframe
- 1964: IBM System/360 mainframe computer
- 1976: Cray-1 supercomputer (about 10 million multiplications/sec)
- 1985: Cray X-MP (about 1 billion multiplications/sec)
- Minicomputers (leading to workstations)
- 1961: DEC PDP-1, minicomputer, 4K 18-bit words of memory, $120,000
- 1969: UNIX developed by Kenneth Thompson on PDP-7
- 1973: UNIX almost completely rewritten in the new language C for the PDP-11
- 1975: UNIX (Sixth Edition) licensed to universities
The Fourth Generation: Personal Computers and Workstations
During the 80's, the personal computer market developed rapidly, and personal
computers became one of the fastest growing segments of the computer industry.
- PCs
- 1972: HP introduces HP-35 handheld scientific calculator (slide rules are obsolete)
- 1975: Altair Personal Computer, 256 bytes of memory, no disk, screen, or keyboard: $400.
Bill Gates and Paul Allen wrote a BASIC interpreter for the Altair.
- 1981: IBM PC introduced with MS-DOS software (64K bytes of memory,
160K byte floppy, screen, and keyboard; flat file system).
- 1982: IBM PC/XT: with a hard disk and hierarchical file system.
- 1984: IBM PC/AT: up to 16M bytes of memory, 1.2M byte floppy, 10M byte hard disk.
- 1987: IBM PS/2: with MS-DOS or OS/2
- 1991: Microsoft drops OS/2, IBM drops Microsoft
- 1995: Windows 95
- 1998: Windows 98
- 2000: Windows ME
- 2001: Windows XP
- Apples
- 1976: Apple I introduced
- 1977: Apple II debuted at trade show
- 1980: Apple III released
- 1984: Macintosh introduced during Super Bowl
- 1987: Mac II
- 1991: Powerbook
- 2000: G4 cube
- 2001: iBook, iPod, ...
- Workstations (Here we follow just a few companies.)
- 1982: Sun Microsystems introduces Sun-1 workstation (running UNIX)
- 1982: Hewlett-Packard introduces the HP-9000 desktop mainframe
- 1984: Silicon Graphics ships first workstations (running UNIX)
- 1989: Sun introduces SPARCstation 1, 12.5 MIPS
- 1995: Sun releases the Java language
The Fifth Generation: Supercomputers, Parallel Computers, Meta-computers
In the 80's, the
supercomputer model was challenged by the parallel computation model, and this
trend has continued into the 90's.
A parallel machine consists of several moderately fast computers that are
connected in some manner that allows them to work cooperatively on a single
problem.
The most common model of parallel computer these days consists
of off-the-shelf workstations connected by an extremely fast network.
- Cray - vector machines
The high end of the computing market was the domain of the supercomputers.
Cray championed the monolithic supercomputer and often had the fastest
and most powerful central processing units in the world.
- Sequent --
Sequent was an early pioneer in this commercial market and
sold machines with around 16 processors that had a single shared memory.
- BBN butterfly--
The BBN butterfly was a more powerful machine with up to 128 computers connected
by a very fast, highly interconnected network.
- Thinking Machines "Connection Machine" - massive parallelism
The Connection Machine continued
this trend and contained 65,536 custom-made computers connected using a 16 dimensional
hypercube pattern!
- IBM SP --
This consists of up to a few hundred off-the-shelf IBM workstations connected via a fast
network.
- ASCI Red --
This consists of about 10,000 Pentium Pro processors connected in a
pair of 2D grids. This machine is part of a U.S. Department of Energy program
to develop computers that can simulate nuclear warhead explosions from first
principles, and thereby diminish the need for testing in the development and
maintenance of the nuclear weapon stockpile.
ASCI White (12 Teraflops, 6 Terabytes, 8K processors) and ASCI Blue
are two other more recent massively parallel computers.
- The internet as a supercomputer -- recently large numbers of computers communicating
over the internet have been used to solve extremely complex problems.
The late 90's: Embedded Computing and the World Wide Web
In the late 90's we are seeing the spread of computers into almost all
objects we interact with. These smart devices have the potential to
dramatically change the way we interact with our environment, for better
or worse.
- appliances
- answering machines
- pagers
- credit cards
- cell phones
The 90's also saw the rise of the World Wide Web, and in the early 00's
we are seeing the web extend to mobile devices such as cell phones, as well
as high speed links into the home.
Origins and Development of Programming Languages
Origins of Programming Languages
The very first programming languages were machine languages. These
are the "native tongues" of the Central Processing Units (CPUs). The words
of this language are called machine instructions and consist of
one or more bytes which represent a coded form of some primitive operation
that the CPU can carry out.
The next programming languages were the so called
"assembly languages." These languages allowed users to write programs using
symbolic representations of the machine instructions and these assembly programs
could then be translated into machine language (i.e. coded bytes) either by
hand or, better yet, by a program. Each assembly instruction corresponded
precisely to one particular operation in the CPU's repertoire.
The next group of languages allowed the programmer to work at a more
conceptual level. Programs written in these higher level languages
could be mechanically translated (either by hand or by a program)
into assembly language, which could then be translated into machine language.
Many of the first high level languages are still around today (albeit in
a much evolved state). A few of the most influential high level languages are listed below:
- 1957 Fortran (used for scientific calculation)
- 1958 Algol (used in computer science)
- 1960 LISP (used for early Artificial Intelligence programs)
- 1960 Cobol (used for business applications)
- 1962 APL (used for mathematical/scientific applications)
- 1962 Simula (used to write simulation programs)
- 1964 Basic (used by hobbyists)
- 1970 Pascal (a language for teaching computer science)
- 1972 C (the main programming language of UNIX machines)
- 1972 Prolog (used for logic programming)
- 1975 Scheme (a simpler dialect of LISP)
- 1985 C++ (an object-oriented version of C)
- 1995 Java (used for applet and other web programming)
- 1998 Jscheme (used in CS2a, implementation of Scheme in Java)
There are currently over 2000 computer languages. (A searchable list is
maintained at http://cuiwww.unige.ch/langlist; another interesting list,
kept by an enthusiast, is at http://www.heuse.com/coding.htm.)
In the next several weeks we will study a dialect
of Scheme which can be used to write applets.
Compilers and Interpreters
All of these high level languages must first be either translated
into machine language (by another program called a compiler), or
interpreted (by a program appropriately called an interpreter).
For example, Fortran, C, and C++ are usually compiled into assembly
language. Scheme is usually interpreted (but is sometimes compiled),
while Java is compiled into a simpler intermediate language (called Java byte code)
which is then interpreted (by a Java virtual machine).
For example, let's see how to write, compile, and run a simple C program.
The first step is to create a file, say "test.c" containing the following
lines:
#include <stdio.h>

int main(void) {
    int i = 100;                     /* loop counter */
    while (i > 0) {                  /* repeat 100 times */
        printf(" HELLO WORLD!\n");
        i = i - 1;
    }
    return 0;
}
Next, one compiles the file (in Linux or Unix) by giving the following
command
% cc test.c -o test
The C compiler is stored in the file named "cc" and it reads the file "test.c"
and creates a new file "test" (as specified using the "-o test" flag). This
file contains machine language. We can run the program by giving the
command "test" to the Linux shell:
% ./test
HELLO WORLD!
HELLO WORLD!
HELLO WORLD!
...
This program writes HELLO WORLD! one hundred times and then stops.
Let's look more closely at what is happening here.
First of all, your command "test" is read by the shell.
Recall that the shell is the part of the operating system
that prints the percent sign prompt (%)
and then reads and executes the commands you type in. When you type in a
command, the shell looks for a file with that name, loads that file into
memory, gives it temporary control of the CPU, and sets a timer to go off in a
few milliseconds. When the timer goes off, control is taken away from your
program and given back to the operating system, which then lets another
program run for a few milliseconds. When your program stops, the operating
system takes it out of its process queue, which is the list of "programs
waiting to run." Also, whenever your program needs to write to the screen, it
gives the data to be displayed to the operating system, which then takes
control of the CPU and displays the data.
An interpreter takes a different approach. It is a program which reads your program,
analyzes it, "simulates its execution," gets an answer, and displays that answer.
In this class we will be writing Scheme programs which are run by a Scheme interpreter
written in Java.
Compilers and interpreters are usually themselves written in a
high level language. If a compiler for the language already exists,
it is easy to write a second one in that language.
Compiler writers often write a first compiler for a new language L
(in some existing language say C). They then write their second
and later compilers for L in L itself.
This process is called bootstrapping.
Related Sites
- The ENIAC website at UPenn
- Charles Babbage Institute Center for the History of Computing
- The Virtual Museum of Computing at Oxford University
- Computers: From the Past to the Present, by Michelle A. Hoyle, CS Grad Student at U. of Sussex
- Computer Programming Languages, in the WWW Virtual Library
- Apple history and other Apple history links