There were several systems for representing numbers. The Romans, for example, used letters to stand for certain values: I = 1, V = 5, X = 10, L = 50, C = 100, D = 500, and M = 1,000.
John Napier (1550-1617), a Scottish mathematician, created logarithm tables to facilitate calculations. He also created a device made of rods, known as Napier's bones, to perform arithmetical calculations. These rods were widely used by accountants and bookkeepers.
Several people used the concept of logarithms to develop the slide rule. In particular, mention must be made of a French artillery officer, Amédée Mannheim (1831-1906), who introduced the movable double-sided cursor on the slide rule. With a modern slide rule one could not only perform the arithmetical operations but also compute squares, square roots, logarithms, sines, cosines, and tangents. The slide rule remained in use until the mid-1970s.
Blaise Pascal invented a machine built around a system of gears. A one-tooth gear engages its single tooth with a ten-tooth gear once every time it revolves; it must make ten revolutions to rotate the ten-tooth gear once. Numbers could be entered and cumulative sums obtained by cranking a handle. Pascal's calculator could handle the carry digit in addition but could not subtract. It was not a commercial success because such devices could not be built with sufficient precision for practical use.
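The carry mechanism is easy to picture in modern terms. Here is a minimal sketch in Python (the register width and example numbers are illustrative, not Pascal's actual specifications): each dial counts from 0 to 9, and when it passes 9 it wraps to 0 and the one-tooth/ten-tooth linkage advances the next dial by one.

```python
def crank(register, position, turns=1):
    """Advance one dial of a Pascaline-style register by `turns` steps.

    `register` is a list of digits, least significant first. When a dial
    passes 9 it wraps to 0 and nudges the next dial by one -- the
    mechanical carry.
    """
    for _ in range(turns):
        i = position
        while i < len(register):
            register[i] += 1
            if register[i] < 10:
                break            # no carry needed
            register[i] = 0      # dial wraps past 9 ...
            i += 1               # ... and carries into the next dial

# Example: enter 95, then crank 7 more units to get 102.
reg = [5, 9, 0]                  # digits of 95, least significant first
crank(reg, position=0, turns=7)
print(reg)                       # [2, 0, 1], i.e., 102
```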
The German mathematician Gottfried Wilhelm von Leibniz studied Pascal's designs and produced a machine that could multiply. The machine consisted of a setup mechanism to enter the digits of the multiplicand, a handle to crank for each digit of the multiplier, a result register, and a system of gears to carry out the computation. His calculator used the same shift-and-add procedure that present-day digital computers use for multiplication. The problems Leibniz faced in constructing his calculator were common to inventors of his time: poor materials and poor workmanship. In later years other mechanical calculators followed that refined the designs of Pascal and Leibniz.
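In modern code, the shift-and-add procedure looks like the following sketch (a decimal version, since Leibniz's stepped-drum machine worked in base 10; the function name and example numbers are ours):

```python
def shift_and_add(multiplicand, multiplier):
    """Multiply by repeated addition and shifting.

    For each digit of the multiplier, add the multiplicand that many
    times into the result register, then shift the carriage one decimal
    place -- the same scheme binary computers use with base 2.
    """
    result = 0
    shift = 0
    while multiplier > 0:
        digit = multiplier % 10             # next multiplier digit
        for _ in range(digit):              # one crank of the handle per unit
            result += multiplicand * 10 ** shift
        multiplier //= 10
        shift += 1                          # shift the carriage one place
    return result

print(shift_and_add(347, 26))               # 9022
```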
Charles Babbage realized that long computations consisted of operations that were regularly repeated. He decided to build a difference engine that would be fully automatic, steam driven, and able to print tables of numbers. The machine was designed to evaluate polynomials for the preparation of mathematical tables. Unfortunately, the metal-working technology of his time was not sufficiently advanced to manufacture the required precision gears and linkages.
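The principle behind the difference engine can be shown in a few lines. For a polynomial of degree n, the n-th differences between successive values are constant, so once the table is seeded, every new entry requires only additions, which gears can perform. A sketch in Python (the polynomial and range are arbitrary examples of ours):

```python
def tabulate(poly, start, count):
    """Tabulate poly(x) at x = start, start+1, ... by finite differences.

    `poly` lists coefficients, lowest power first. After seeding the
    difference table, each new value costs only n additions.
    """
    n = len(poly) - 1
    f = lambda x: sum(c * x ** k for k, c in enumerate(poly))

    # Seed: the value and its successive differences at the start point.
    row = [f(start + i) for i in range(n + 1)]
    diffs = []
    while len(row) > 1:
        diffs.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]
    diffs.append(row[0])          # the constant n-th difference

    out = []
    for _ in range(count):
        out.append(diffs[0])
        for i in range(n):        # each entry absorbs the difference above it
            diffs[i] += diffs[i + 1]
    return out

print(tabulate([0, 0, 1], start=1, count=6))  # x^2: [1, 4, 9, 16, 25, 36]
```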
When that failed, Babbage decided to build an Analytical Engine. The Analytical Engine was a parallel decimal computer that would operate on words of 50 decimal digits and store 1,000 such numbers. The machine had several components: components to store data and intermediate results, components for the input and output of information, and components for the transfer of information between the others. The machine was to be controlled through punched cards. Babbage's Analytical Engine was never built in his lifetime, but several of his concepts were used in the design of later computers.
Logic machines did not have any practical significance, but they reinforced the relationship between logic and computing. They also paved the way for an important theoretical paper on computing. In 1936, Alan Turing wrote a paper, "On Computable Numbers," in which he described a hypothetical device that came to be known as a Turing machine. The Turing machine was envisioned to perform logical operations and could read, write, or erase symbols written on the squares of an infinite paper tape. Its control was a finite-state machine: at each step of the computation, the machine's next action was determined by its current state, drawn from a finite list of possible states, and the symbol under its head. Turing's purpose was not to invent a computer, but rather to describe which problems are logically possible to solve. The Turing machine shares some characteristics with modern-day computers: the infinite tape can be seen as the computer's internal memory, which one can read, write, or erase. Another landmark paper by Alan Turing was "Computing Machinery and Intelligence," in which he explores the question "Can machines think?"
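A Turing machine is simple enough to simulate in a few lines. The sketch below is our own minimal rendering, not Turing's formulation: a dictionary stands in for the unbounded tape, and the rule table shown (which adds 1 to a binary number) is just an illustrative example.

```python
def run_turing(tape, rules, state="start", halt="halt"):
    """Run a Turing machine: a finite control reading and writing symbols
    on an unbounded tape. Missing cells read as the blank symbol "_"."""
    cells = {i: s for i, s in enumerate(tape)}
    head = 0
    while state != halt:
        symbol = cells.get(head, "_")
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, "_") for i in range(lo, hi + 1)).strip("_")

# Illustrative rule table: add 1 to a binary number.
rules = {
    ("start", "0"): ("0", "R", "start"),   # scan right to the end
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),   # step back onto the last digit
    ("carry", "1"): ("0", "L", "carry"),   # 1 plus carry -> 0, keep carrying
    ("carry", "0"): ("1", "L", "halt"),    # 0 plus carry -> 1, done
    ("carry", "_"): ("1", "L", "halt"),    # carry past the leftmost digit
}
print(run_turing("1011", rules))           # 1100
```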
A close correspondence between circuits and logic was first suggested in the Russian literature by Paul Ehrenfest in 1910. This was followed by work done in 1934 by V.I.S. Sestakov and, in 1936 in Japan, by Akira Nakasima and Masao Hanzawa. However, the work that received the most attention was Claude Shannon's 1938 paper, based on his master's thesis at MIT.
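Shannon's insight, in miniature: switch contacts in series conduct only when both are closed (AND), contacts in parallel conduct when either is closed (OR), and a normally closed contact inverts (NOT), so any Boolean expression describes a circuit and vice versa. A toy illustration (ours, not Shannon's notation):

```python
# Relay contacts as Boolean operators.
def series(a, b):    return a and b    # contacts in series: both must close
def parallel(a, b):  return a or b     # contacts in parallel: either closes
def inverted(a):     return not a      # a normally closed contact

# Any Boolean expression is therefore a circuit -- e.g., a two-way light
# switch (exclusive OR): the lamp is on when exactly one switch is flipped.
def two_way_light(s1, s2):
    return parallel(series(s1, inverted(s2)), series(inverted(s1), s2))

for s1 in (False, True):
    for s2 in (False, True):
        print(s1, s2, "->", two_way_light(s1, s2))
```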
In the early twentieth century mechanical calculators were being replaced by electrical ones. These machines used electric circuits and motors to do complex calculations. The key element in these calculators was the electromagnetic relay: essentially a switch that allowed an electric current to pass when it received a signal. (Early telegraph and telephone devices used relays to transmit information.) In the mid-1930s relays were used by at least three experimenters in the building of electro-mechanical calculators: Konrad Zuse in Berlin, George Stibitz in New York, and Howard Aiken in Cambridge, MA.
In a radical departure from other developers of calculating machines, Konrad Zuse used a binary representation of numbers in the internal computation of his machine, which was designed to solve complex engineering equations. In 1941 he completed the Z3, which used 1,800 relays to store sixty-four 22-bit binary numbers, plus 600 additional relays for the calculating and control units. Instructions were fed to the computer on perforated 35-mm movie film.
George Stibitz, working as a research mathematician at Bell Labs, had little knowledge of Konrad Zuse's work. In 1939 he built a Complex Number Computer that performed multiplication and division on complex numbers. The novelty of this computer was that it was accessed remotely using a teletype machine. After the United States entered the Second World War, Bell Labs became more interested in problems that had immediate applications in defense and built five digital relay computers for the military. The largest computer in this series was the Model V, which contained 9,000 relays and handled numbers expressed in scientific notation. The Model V was a general-purpose calculator and solved a variety of numerical problems. Programs and data were fed to the computer on paper tape.
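The machine needed no new mathematics, only relay arithmetic on real numbers, because complex multiplication and division decompose into real operations. A sketch of the reduction (the function names and example values are ours):

```python
def complex_multiply(a, b, c, d):
    """(a + bi)(c + di) using only real arithmetic: four multiplications
    and two additions/subtractions."""
    return (a * c - b * d, a * d + b * c)

def complex_divide(a, b, c, d):
    """(a + bi) / (c + di): multiply through by the conjugate of the divisor."""
    m = c * c + d * d
    return ((a * c + b * d) / m, (b * c - a * d) / m)

print(complex_multiply(3, 4, 1, 2))   # (-5, 10): (3+4i)(1+2i) = -5+10i
print(complex_divide(-5, 10, 1, 2))   # (3.0, 4.0): recovering 3+4i
```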
Howard Aiken and engineers at IBM completed an electromechanical digital computer in 1944, the Harvard Mark I. It was 51 feet long, 8 feet tall, and only 2 feet deep, with a drive shaft running along its base. This machine used relay circuits for internal computation and punched paper tape for instructions and data. It handled 23-digit decimal numbers and could perform all four arithmetic operations. The machine was used by the United States Navy for classified work. One of the programmers was a recently commissioned naval officer, Grace Hopper. It was she who found a moth trapped between two relay contacts, causing the machine to malfunction. She removed the moth and attached it to her logbook, noting that she had found the bug that was causing the problem!
There were several generations of the Harvard Mark computers. But by the end of the forties engineers realized that they had reached the limits of relay-circuit technology and that a switch needed to be made to vacuum tubes.
The first fully electronic computer was developed by John Atanasoff at Iowa State University with the help of his assistant, Clifford Berry. It used capacitors to store numbers; because of charge leakage, the capacitors had to be refreshed periodically. This was a forerunner of the dynamic memory used in modern computers. The machine was designed to solve linear systems of equations, but intermittent malfunctioning prevented it from being used regularly.
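A loose model of the idea in code (all constants here are invented for illustration): a stored 1 is a charge that leaks away over time, so the machine must periodically read each cell and rewrite it at full strength, the same "refresh" cycle dynamic RAM performs today.

```python
LEAK = 0.8         # fraction of charge surviving each time step (hypothetical)
THRESHOLD = 0.5    # charge above this level reads as a 1

def step(cells):
    """One time step: every capacitor loses some of its charge."""
    return [charge * LEAK for charge in cells]

def refresh(cells):
    """Read each cell and rewrite it at full strength."""
    return [1.0 if charge > THRESHOLD else 0.0 for charge in cells]

cells = [1.0, 0.0, 1.0]            # the stored bit pattern 1 0 1
for t in range(6):
    cells = step(cells)
    if t % 2 == 1:                 # refresh before the 1s decay below threshold
        cells = refresh(cells)
print(cells)                       # [1.0, 0.0, 1.0] -- the pattern survives
```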
Mention must be made of special-purpose calculators that used vacuum-tube circuitry. One was developed in Germany in 1941 by Helmut Schreyer (a friend of Konrad Zuse) to convert three-digit decimal numbers to and from binary. Another special-purpose electronic calculator, called the Colossus, was developed in England at Bletchley Park. The Colossus was a joint effort by many people. It made Boolean comparisons between two strings and was used specifically to decode German messages.
At the University of Pennsylvania, John W. Mauchly and J. Presper Eckert developed ENIAC (Electronic Numerical Integrator and Computer), which used words of 10 decimal digits instead of binary digits. It had nearly 18,000 vacuum tubes, used punched cards for input and output, and was "programmable" by rewiring the machine.
As a result of these techniques, computing and programming became faster, more flexible, and more efficient. Instructions in subroutines did most of the work, and frequently used subroutines were kept in libraries. The first generation of modern programmed electronic computers to take advantage of these improvements appeared in 1947.
These computers used random-access memory. Input and output were done through punched cards, and programming was done in machine language. This group of machines included EDVAC and UNIVAC, the first commercially available computers. EDVAC (Electronic Discrete Variable Automatic Computer) was a vast improvement over ENIAC: the program was stored inside the computer, and EDVAC had more internal memory, provided through the use of mercury delay lines. EDVAC also used binary numbers rather than decimal, which simplified the construction of the arithmetic unit.
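The simplification binary buys is concrete: a one-digit binary adder is a couple of gates' worth of logic, whereas a decimal digit requires ten-state machinery. A sketch (our own illustration, not EDVAC's actual circuit):

```python
def full_adder(a, b, carry_in):
    """Add two bits plus a carry: the entire logic of one binary digit."""
    s = a ^ b ^ carry_in                         # sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))   # carry bit
    return s, carry_out

def add_binary(x_bits, y_bits):
    """Ripple-carry addition of two equal-length bit lists, least
    significant bit first."""
    result, carry = [], 0
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    result.append(carry)
    return result

# 6 + 7 = 13, with bits written least significant first.
print(add_binary([0, 1, 1], [1, 1, 1]))          # [1, 0, 1, 1] -> 13
```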
But transistors had their problems too. Like other electronic components they needed to be soldered, and the more complex the circuit became, the more complicated the connections between individual transistors grew and the more likely faulty wiring became. In 1958, Jack St. Clair Kilby of Texas Instruments manufactured the first integrated circuit, or chip. A chip is a collection of tiny transistors that are connected together when the chip is manufactured, which obviated the need to solder large numbers of transistors by hand. This not only saved space but also increased the speed of the machine.
BASIC (Beginner's All-Purpose Symbolic Instruction Code) was developed in 1964 by John Kemeny and Thomas Kurtz, two mathematicians at Dartmouth. Two programmers decided to write a BASIC interpreter for the Altair and contacted Ed Roberts, the owner of MITS, who agreed to pay for it. The two programmers were William Gates and Paul Allen. They later went on to form Microsoft and produce BASIC interpreters and operating systems for various machines.
BASIC was not the only computer language around. There was FORTRAN, developed in the 1950s by IBM programmers, a high-level language that made scientific computation easy. Another language developed at this time was ALGOL (Algorithmic Language), which was intended to be a universal, machine-independent language but was not that successful. Its most influential version was ALGOL-60, whose descendants (by way of BCPL and B) include C, which became the language of choice for systems programmers.
In the 1960s COBOL (Common Business-Oriented Language) was developed to produce applications for the business world. Soon after, Niklaus Wirth, a Swiss computer scientist, released Pascal as a teaching language for beginning computer science students. It forced programmers to develop a structured approach to programming. Wirth later followed Pascal with Modula-2, which was similar to Pascal in structure and syntax.
Tandy Radio Shack put the TRS-80 on the market in 1977. It later came out with the TRS-80 Model II, which had 64,000 characters of memory and a disk drive for storing programs and data. Personal computer applications took off, as the floppy disk was one of the most convenient publishing media for distributing software.
IBM, which had been producing mainframes and minicomputers, came out with the IBM PC, a small computer for the home market. It was modular, and many of its parts were manufactured outside of IBM.
In 1984, Apple released the first-generation Macintosh, a computer that came with a graphical user interface and a mouse. It was easy to use and became a favorite with home users. IBM released the 286-based AT, which, with applications like Lotus 1-2-3, a spreadsheet program, and Microsoft Word, quickly captured the small-business market. For an account of the holy war between Microsoft and Apple, and an enlightening piece of social commentary, read Neal Stephenson's essay "In the Beginning was the Command Line."