History of Computing

1. Earliest Traditions

People have always needed to keep track of things - the number of objects, a distance, a weight, or the passage of time. The digits of the hands served as the first counting device, and from this was born our decimal number system, based on 10.

2. Stonehenge

One of the earliest surviving devices for keeping track of days is Stonehenge, 13 km north of Salisbury, England. The stone structure erected there around 2800 B.C. was aligned so that it captured the light of the rising sun at the summer solstice in a specific fashion.

3. Abacus

The abacus was one of the first adding machines. It consists of beads strung on several wires, and the position of a bead determines its value, so only a few beads are needed to represent large numbers. Contrast this with the Roman system of counting, in which different symbols were used to represent larger and larger numbers.
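To make the contrast concrete, here is a small Python sketch (not from the original text) comparing how many symbols each system needs for the same number:

```python
# Positional notation, as on an abacus: the number of digit positions
# grows only with the logarithm of the number.
def positional_digits(n):
    """Number of bead columns (decimal digits) needed on an abacus."""
    return len(str(n))

# The additive Roman system: symbol count grows much faster.
def to_roman(n):
    """Convert n to Roman numerals using the standard subtractive forms."""
    values = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
              (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
              (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]
    out = []
    for value, symbol in values:
        while n >= value:
            out.append(symbol)
            n -= value
    return "".join(out)

print(positional_digits(1888))  # 4 digit positions
print(to_roman(1888))           # MDCCCLXXXVIII -- 13 symbols
```

Four bead columns suffice where the Roman system needs thirteen symbols.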

4. Forefathers of Computing Science

For a long time little progress was made in the development of machines for computing: mathematicians developed theorems, but calculations were still done by hand. Three individuals whose visions inspired computing machines are Blaise Pascal (1623-1662), Gottfried Wilhelm von Leibniz (1646-1716), and Charles Babbage (1791-1871).

5. Pascal's Pascaline Calculator

Pascal invented a machine built around a system of gears. A one-tooth gear engages its single tooth with a ten-tooth gear once every time it revolves; it must make ten revolutions to rotate the ten-tooth gear once. Numbers could be entered and cumulative sums obtained by cranking a handle.
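The gear arrangement is, in effect, a decimal counter with mechanical carry. The following Python sketch is a simplified model (an assumption, not a detailed account of Pascal's mechanism): each wheel counts 0-9, and completing a full revolution advances the next wheel by one.

```python
# Simplified model of the Pascaline: wheels count 0-9; a full revolution
# of one wheel advances the next wheel by one -- a mechanical carry.
class Pascaline:
    def __init__(self, wheels=6):
        self.wheels = [0] * wheels   # least-significant wheel first

    def add(self, n):
        """Add n by turning the units wheel n times."""
        for _ in range(n):
            self._turn(0)

    def _turn(self, i):
        if i >= len(self.wheels):
            return                   # overflow: the result simply wraps
        self.wheels[i] = (self.wheels[i] + 1) % 10
        if self.wheels[i] == 0:      # full revolution -> carry one place
            self._turn(i + 1)

    def reading(self):
        return int("".join(str(d) for d in reversed(self.wheels)))

p = Pascaline()
p.add(97)
p.add(8)
print(p.reading())  # 105
```

Note how adding 8 to 97 ripples a carry through two wheels, just as the one-tooth gears would.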

6. Charles Babbage: Difference Engine

Charles Babbage realized that long computations consisted of operations that were regularly repeated. He set out to build a Difference Engine that would be fully automatic, steam driven, and able to print tables of numbers. When that project failed, Babbage turned to an Analytical Engine - in effect a parallel decimal computer that would operate on words of 50 decimal digits and store 1,000 such numbers.
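The regularly repeated operation the Difference Engine mechanized is the method of finite differences: once the first few values of a polynomial and their differences are set, every further table entry needs only additions, never multiplication. A sketch in Python:

```python
# Method of finite differences: after degree+1 direct evaluations of a
# polynomial, every subsequent table value is produced by additions alone.
def tabulate(f, degree, x0, count):
    # Seed the difference columns from the first degree+1 direct values.
    row = [f(x0 + i) for i in range(degree + 1)]
    diffs = []
    for _ in range(degree + 1):
        diffs.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]
    # diffs now holds [f(x0), delta f(x0), delta^2 f(x0), ...]
    table = []
    for _ in range(count):
        table.append(diffs[0])
        for i in range(degree):      # each new value: pure addition
            diffs[i] += diffs[i + 1]
    return table

# Euler's polynomial x^2 + x + 41, a classic difference-engine demo.
print(tabulate(lambda x: x * x + x + 41, 2, 0, 5))  # [41, 43, 47, 53, 61]
```

Only the three seed values involve multiplication; the rest of the table is cranked out by repeated addition, which gears can do.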

7. Herman Hollerith

In 1890 Herman Hollerith developed a device that read census information punched into cards. Stacks of punched cards could serve as an accessible memory store of almost unlimited capacity. Hollerith started his own company to market the device; through later mergers and renaming it became International Business Machines (IBM).

8. Binary Representation

In 1941, Konrad Zuse completed a calculating machine designed to solve complex engineering equations. The machine, called the Z3, was controlled by perforated strips of movie film and used a binary representation of numbers.

9. Harvard Mark I

Howard Aiken and engineers at IBM developed an electromechanical digital computer, the Harvard Mark I, that read instructions from pre-punched paper tape. It handled numbers of 23 decimal places and could perform all four arithmetic operations.

10. Alan Turing

In 1936 Alan Turing wrote a paper, "On Computable Numbers", in which he described a hypothetical device - a Turing machine. The Turing machine was envisioned to perform logical operations and could read, write, or erase symbols written on the squares of an infinite paper tape. Its control is a finite-state machine: at each step of the computation, the machine's next action is determined by its current state and the symbol under the head, drawn from a finite list of possible states. Turing's purpose was not to invent a computer, but to describe which problems are logically possible to solve. Even so, the Turing machine shares some characteristics with modern computers - the infinite tape can be seen as the internal memory of the computer, which one can read, write, or erase. In another landmark paper, "Computing Machinery and Intelligence", Turing explores the question "Can machines think?"
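A Turing machine is simple enough to simulate in a few lines. The sketch below is an illustrative modern formulation, not Turing's original notation: a transition table maps (state, symbol) to (new state, symbol to write, head move), and this particular table flips every bit of its input.

```python
# Minimal Turing machine simulator: the tape is unbounded (a sparse dict),
# the control is a finite table of (state, symbol) -> (state, write, move).
def run(tape, transitions, state="start", blank=" "):
    cells = dict(enumerate(tape))
    head = 0
    while state != "halt":
        symbol = cells.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip()

# A one-state machine that inverts every bit, halting at the first blank.
flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", " "): ("halt", " ", "L"),
}
print(run("1011", flip))  # 0100
```

The finite transition table is the machine's "program"; the dict plays the role of the infinite tape.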

11. ENIAC

At the University of Pennsylvania, John W. Mauchly and J. Presper Eckert developed ENIAC (Electronic Numerical Integrator and Computer), which used words of 10 decimal digits rather than binary digits. It contained nearly 18,000 vacuum tubes, took its input and output on punched cards, and was "programmable" only by rewiring the machine.

12. John von Neumann

John von Neumann showed that a computer could have a simple, fixed structure and yet execute any kind of computation, given properly programmed control, without the need for hardware modification. A special type of machine instruction, the conditional control transfer, permitted the program sequence to be interrupted and reinitiated at any point. Program instructions were stored with the data in the same memory unit, so instructions could be modified in the same way as data.

As a result of these techniques, computing and programming became faster, more flexible, and more efficient. Subroutines did much of the work, and frequently used subroutines were kept in libraries. The first generation of modern programmed electronic computers to take advantage of these improvements appeared in 1947.

These computers used random-access memory, took their input and output on punched cards, and were programmed in machine language. This group of machines included EDVAC and UNIVAC, the latter the first commercially available computer.
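The stored-program idea and the conditional control transfer can be sketched with a toy interpreter. The instruction set below is invented purely for illustration (it is not any historical machine's): instructions and data occupy the same memory, and a conditional jump lets the program alter its own flow.

```python
# Toy stored-program machine: one accumulator, one program counter,
# instructions and data in the same memory list.
def execute(memory):
    acc, pc = 0, 0                       # accumulator and program counter
    while True:
        op, arg = memory[pc]
        pc += 1
        if op == "LOAD":                 # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":                # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":              # memory[arg] <- acc
            memory[arg] = acc
        elif op == "JGZ" and acc > 0:    # the conditional control transfer
            pc = arg
        elif op == "HALT":
            return memory

# Program (cells 0-10) and data (cells 11-13) share one memory.
# It computes 4 + 3 + 2 + 1, leaving the total in cell 12.
mem = [
    ("LOAD", 11),   # 0: acc = n
    ("JGZ", 3),     # 1: if n > 0, enter the loop
    ("HALT", 0),    # 2: n was 0 from the start
    ("LOAD", 12),   # 3: acc = total
    ("ADD", 11),    # 4: acc += n
    ("STORE", 12),  # 5: total = acc
    ("LOAD", 11),   # 6: acc = n
    ("ADD", 13),    # 7: acc += -1
    ("STORE", 11),  # 8: n = n - 1
    ("JGZ", 3),     # 9: loop while n > 0
    ("HALT", 0),    # 10: done
    4,              # 11: n
    0,              # 12: total
    -1,             # 13: the constant -1
]
final = execute(mem)
print(final[12])  # 10
```

Because the program sits in ordinary memory cells, a program could in principle overwrite its own instructions - exactly the property von Neumann exploited.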

13. EDVAC Computer

EDVAC (Electronic Discrete Variable Automatic Computer) was a vast improvement over ENIAC. Its program was stored inside the computer, and it had more internal memory, provided by mercury delay lines. EDVAC used binary numbers rather than decimal, simplifying the construction of the arithmetic unit.
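Why binary simplifies the arithmetic unit can be seen in a sketch: adding two bits takes only two logic operations (XOR for the sum, AND/OR for the carry), and a whole adder is just that cell repeated. This is a generic ripple-carry adder for illustration, not a description of EDVAC's actual circuitry.

```python
# One full-adder cell: sum bit from XOR, carry bit from AND/OR.
def full_adder(a, b, carry_in):
    s = a ^ b ^ carry_in                          # sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))    # carry bit
    return s, carry_out

# Chain the cell bit by bit: a ripple-carry adder.
def add_binary(x, y, width=8):
    carry, result = 0, 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(add_binary(0b1011, 0b0110))  # 17
```

A decimal adder, by contrast, must handle ten symbols per digit position, which is far harder to realize in simple on/off circuit elements.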

14. Technology Advances

Two devices invented in the late 1940s and 1950s would revolutionize the field of computer engineering. The first was the transistor, invented in 1947 by John Bardeen, Walter Brattain, and William Shockley of Bell Labs. Previously, computers used vacuum tubes, which rely on a heated filament as the source of electrons; they generated a great deal of heat and burnt out quite frequently - and the ENIAC alone had nearly 18,000 of them.

But transistors had their problems too. Like other electronic components they had to be soldered, and the more complex the circuit became, the more complicated the connections between individual transistors grew and the greater the likelihood of faulty wiring. In 1958, Jack St. Clair Kilby of Texas Instruments manufactured the first integrated circuit, or chip: a collection of tiny transistors that are connected together as the chip is manufactured. This obviated the need to solder large numbers of transistors, which not only saved space but increased the speed of the machine.

15. Altair

In 1971, Intel released the first microprocessor, a special integrated circuit able to process four bits of data at a time. A company called Micro Instrumentation and Telemetry Systems (MITS) began marketing a kit called the Altair 8800 for $397. Consumers assembled the machine themselves and programmed it by manually flipping switches on its front panel. There was no software; users had to write their own.

16. Creation of Microsoft

BASIC (Beginner's All-purpose Symbolic Instruction Code) was developed in 1964 by John Kemeny and Thomas Kurtz, two mathematicians at Dartmouth. Two programmers decided to write a BASIC interpreter for the Altair and contacted Ed Roberts, the owner of MITS, who agreed to pay for it. The two programmers were William Gates and Paul Allen. They went on to form Microsoft and to produce BASIC and operating systems for various machines.

17. BASIC & Other Languages

BASIC was not the only computer language around. FORTRAN, developed in the 1950s by IBM programmers, was a high-level language that allowed one to perform scientific computation easily. Another language developed at this time was ALGOL (Algorithmic Language), intended to be a universal, machine-independent language, though it was not as successful. ALGOL's descendants, by way of BCPL and B, include C, the language of choice for systems programmers.

In the 1960s COBOL (Common Business Oriented Language) was developed to produce applications for the business world. Around 1970, Niklaus Wirth, a Swiss computer scientist, released Pascal as a teaching language for beginning computer students; it forced programmers to develop a structured approach to programming. Wirth later followed Pascal with Modula-2, which was similar to Pascal in structure and syntax.

18. PC Explosion

There was an explosion of personal computers after the introduction of the Altair. Steve Jobs and Steve Wozniak introduced the Apple II, which offered built-in BASIC, color graphics, and 4 KB of memory for only $1298.

Tandy Radio Shack put the TRS-80 on the market in 1977, and later followed with the TRS-80 Model II, which had 64 KB of memory and a disk drive for storing programs and data. Personal computer applications took off, since the floppy disk was a convenient medium for distributing software.

IBM, which had been producing mainframes and minicomputers, came out with the IBM PC, a small computer for the home market. It was modular and built largely from parts manufactured outside IBM.

In 1984, Apple released the first-generation Macintosh, a computer that came with a graphical user interface and a mouse. It was easy to use and became a favorite with home users. IBM released the 286-based PC/AT, which, with applications like the Lotus 1-2-3 spreadsheet and Microsoft Word, quickly captured the small-business market.