The History of Computing
March 1, 1988

The Chinese were among the first people to use a device other than the human brain to calculate numbers. They used a device called the abacus, which was invented thousands of years ago.

Blaise Pascal was the next person to make a significant contribution to computing. His device was the adding machine; although it is not classified as a computer, it does represent the use of a machine to aid mankind. Called the Pascaline, it worked by the user dialing up a set of numbers to be added and watching the result appear in a window at the top of the device. Inside was a set of cogs and wheels, and although this may sound technical, it was not a computer because it could not act upon a set of conditions or be programmed to perform another task. The Pascaline was invented in 1642.
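
To give a rough idea in modern terms, the short sketch below (written in Python, with names invented purely for illustration) models a row of decimal wheels that pass a carry to the next wheel whenever they turn past 9, which is essentially what Pascal's cogs did mechanically.

    # A minimal sketch of the Pascaline's carrying wheels (hypothetical,
    # for illustration only).  Each wheel holds one decimal digit; when a
    # wheel turns past 9 it passes a carry to the wheel on its left.

    def add_on_wheels(wheels, amount, position=0):
        """Advance the wheel at 'position' by 'amount', propagating carries."""
        while amount and position < len(wheels):
            total = wheels[position] + amount
            wheels[position] = total % 10   # the digit shown in the window
            amount = total // 10            # the carry passed to the next wheel
            position += 1
        return wheels

    wheels = [0, 0, 0, 0]      # units, tens, hundreds, thousands
    add_on_wheels(wheels, 8)   # dial up 8
    add_on_wheels(wheels, 7)   # then add 7
    print(wheels)              # [5, 1, 0, 0] - that is, 15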

At this point computing took a real change of direction. The date was 1822, and Charles Babbage had just designed a machine that had all the modern-day characteristics of a computer: I/O, a processor, a control unit and a memory unit. He was both a mathematician and an inventor, and his idea came from the problem of calculating tables of polynomial values by hand. He decided that a machine could be built to do these calculations for him, and so he set about building a prototype, calling it the Difference Engine. On completion, the model was highly successful and he set about building a full-scale working version.

Babbage was frustrated by the fragility and instability of the machine; the slightest error in any part of the manufacture would put it out of commission. It must be remembered that Babbage was using 19th Century manufacturing techniques, and these were not all that accurate. The idea was right, but at that time it was not possible to construct the machine.

Finally, after several years and 17,000 English pounds, Babbage was forced to halt work. However, he was not discouraged. He set about designing another machine, this one far more complex. It would have been able to do more than one type of calculation, but it was never built. A 20th Century study of the plans proved that if it had been built, it would have worked.

In between the development of these two machines, a Frenchman, Joseph Jacquard, invented a system in which a block of wood had a series of holes punched in it in a certain pattern; the wood blocked certain threads in a loom from being used.

Babbage used this decision making idea in his machine. This was what made the second machine, or Analytical Engine, so special.

In the U.S.A. in 1880, the national census was taken, and in 1888 the results were published. They had taken 7.5 years to complete because all the tabulating was done by hand. There was then some doubt as to whether the 1890 census figures would be out before 1900. A competition was held to find the fastest way of speeding up the counting process, and the winner produced a machine that counted the entire American population of 62,622,250 in just six weeks.

Herman Hollerith was the inventor, and he also used Jacquard's punched card system. When a rod passed through a hole it completed an electrical circuit and so made a counter advance one digit. This is the main difference between Babbage's and Hollerith's machines: Hollerith's was electrical, while Babbage's was mechanical.
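
The counting idea can be sketched in modern terms as follows (a minimal Python illustration; the card layout and category names are invented, not Hollerith's actual coding): each card carries a set of punched positions, and every hole that lets a rod through advances the matching counter by one.

    # A minimal sketch of Hollerith's tabulating idea (hypothetical card
    # layout, for illustration only).  Each card is the set of positions
    # punched into it; every hole closes a circuit and advances a counter.

    def tabulate(cards, categories):
        counters = {category: 0 for category in categories}
        for card in cards:                # one punched card per person
            for hole in card:             # each hole a rod can pass through...
                if hole in counters:
                    counters[hole] += 1   # ...advances that counter by one
        return counters

    cards = [{"male", "farmer"}, {"female", "teacher"}, {"male", "teacher"}]
    print(tabulate(cards, ["male", "female", "farmer", "teacher"]))
    # {'male': 2, 'female': 1, 'farmer': 1, 'teacher': 2}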

Hollerith realised the market potential of the punched card system, and so he founded the Tabulating Machine Company. In 1911 this company merged with two others, and in 1924 the combined firm was renamed International Business Machines, or IBM. This same company now has a turnover of many billions of US dollars per year, and is the world's biggest computer company.

IBM had, among several others, a brilliant scientist working with it. This man was Howard Aiken, and he was the leader of the team that developed the Harvard Mark I. This was an electro-mechanical computing device, built in 1944. It was 1.7 metres high and 18 metres long, and when working "made the sound of a roomful of old ladies knitting with steel needles". The Mark I was supposed to help the U.S. Navy with its calculations, but was never really all that efficient. However, it had great morale and publicity value; it helped IBM on into the general business market, and strengthened its commitment to the development of computers.

During World War II, the U.S. Army asked Dr John Mauchly to build them a machine that would calculate the trajectories of artillery shells and missiles. The machine the Army was asking for had to cut the calculation time from 15 minutes to 30 seconds, and the only way Mauchly and his partner, Eckert, could see to do this was with 18,000 vacuum tubes, all working simultaneously. To do this they needed $400,000. Funding was approved.

The ENIAC, as it was called (the "Electronic Numerical Integrator and Computer"), was not built in time to help the Army during the war; it was completed in February 1946, even though the team had worked on it 24 hours a day for two and a half years. When it was finally finished and unveiled, it took up 1,500 square feet, was two stories high, weighed 30 tons and used 140,000 watts of power. And it could multiply a pair of numbers in approximately three-thousandths of a second.

However, the ENIAC generated a huge amount of heat (from all those vacuum tubes), its memory was very small, and each time a new program needed to be run the whole setup had to be re-wired.

This last problem was fixed by Dr John von Neumann. The Army asked the ENIAC development team to build a more powerful computer, and von Neumann responded with the EDVAC. The Electronic Discrete Variable Automatic Computer was a major advance on the now-primitive ENIAC; it could store all its programs, and when a new one was required it simply read that program from the storage medium.

GENERATIONS

The First Generation - vacuum tubes

The first generation of computers began in 1951, with the release of the UNIVAC, or Universal Automatic Computer, which was sold to the U.S. Bureau of the Census (also the user of the Hollerith machine) on June 14, 1951. The UNIVAC used vacuum tubes to operate, as the ENIAC had, and it faced many of the same problems as the ENIAC, except that it could be given a new program without being re-wired.

Machines of this generation also used magnetic core memory. This is a method of memory which used a 'necklace' of pinhead-sized, doughnut-shaped rings hung on overlapping and intersecting wires. This type of memory remained state-of-the-art for 20 years, and in 1957 erasable secondary storage arrived with the introduction of magtape.

The Second Generation - transistors

The second generation arrived in 1959, when the transistor (the name derived from transfer and resistor) took over from the vacuum tube. Three scientists, J. Bardeen, W. Brattain and W. Shockley, developed it, and received the Nobel Prize for their trouble. The transistor was revolutionary: it was minute compared to the vacuum tube, consumed less energy, needed no start-up time, and was faster and more reliable.

Also during this period came the creation of assembly languages. These are ways of programming the computer, and were a step up the ladder from the older and more cumbersome machine language. Machine language used numbers to tell the computer what to do; assembly languages used symbols (this is why they are also called symbolic languages).

After this came the use of higher-level languages, such as FORTRAN and COBOL. These made things easier for the programmer, because he did not have to worry about coding his program in machine or assembly language; the language translator did it for him. BASIC is also a higher-level language.
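
The difference between the three levels can be shown with a toy example (in Python; the opcodes and mnemonics below are invented for the purpose and belong to no real machine): the machine sees only numbers, an assembler translates symbols into those numbers, and a higher-level language hides both.

    # A toy illustration of the three levels of language.  The opcodes and
    # mnemonics are invented for this example and belong to no real machine.

    machine_program = [10, 7, 11, 5, 20]            # machine language: bare numbers

    OPCODES = {"LOAD": 10, "ADD": 11, "PRINT": 20}

    def assemble(lines):
        """Translate symbolic (assembly) instructions into numbers."""
        program = []
        for line in lines:
            parts = line.split()
            program.append(OPCODES[parts[0]])       # mnemonic -> opcode
            program.extend(int(p) for p in parts[1:])
        return program

    assembly_program = ["LOAD 7", "ADD 5", "PRINT"]
    assert assemble(assembly_program) == machine_program

    # A higher-level language hides both levels: the programmer just writes
    print(7 + 5)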

Another development during the second generation was the disk pack. This is essentially a stack of hard disks (usually about five) mounted one above the other. Because hard disks have a very fast access time, this invention was very important, and many users started upgrading from their magtape units to disk units. This period also saw the real start of the business end of computing; this is when the computer moved out of the lab and into the marketplace.

The Third Generation - integrated circuits

This generation of computers has become somewhat confused with the so-called fourth generation. It started in 1965, when the integrated circuit was released and began to take over the market. It was new, outstanding technology: faster, more reliable, and able to replace an entire circuit board.

ICs, or silicon chips, are made from what their name suggests: silicon. They work by letting electricity flow where the silicon has been 'doped' with a chemical. This is why chips are also called semiconductors.

ICs were such a major development because they put all the components of an entire circuit board onto a sliver of silicon often less than a centimetre square. This meant that the computer could 'shrink' from major proportions down to minor ones.

Another advantage of the chips was that they were very economical to produce. Once the first one had been developed (that is, the pattern created), the chips were very cheap to make. The factor that forces chip prices up is the cost of the development of the chip, along with the hunger for profit.

The third generation proper began in 1965, with the release of the IBM System/360. This was a complete office integration system; it came in several models and sizes, there were about 40 different I/O devices, and they were all compatible. This enabled users to tailor a system to suit their budget, and for this reason the System/360 was a smash hit.

At this time software advanced too. True multi-tasking became available; that is, more than one program could run at a time, and this boosted productivity in the office no end. Large computers began to be supplemented by minicomputers, a sort of bridge between the user and the mainframe. Minicomputers are equivalent to mainframes in function, but they are considerably smaller, slower, friendlier, and they cost less.

The Fourth Generation - Large Scale Integration

This is where the Fourth Generation is supposed to start. This generation was actually an extension of third-generation technology: LSI (Large Scale Integration) had been developed on chips, and the chips that companies such as Intel put out were specialised. That is, each chip produced was for a specific use - memory control, logic control, and so on. This is why there is contention as to where each generation starts and finishes. Intel then developed the microprocessor, which is literally a computer on a chip; the one chip controls memory, logic, I/O, everything. Intel released it commercially in 1971, and it was a huge success. Because of their compactness (VLSI was achieved in 1975, with hundreds of thousands of electronic components squeezed onto a chip a quarter of an inch square), these processors were faster, they were reliable (no moving parts, in common with 3rd generation chips), and they needed no cooling systems, except possibly for a small fan inside the 'box' (the housing containing the CPU, memory, etc.).

This meant that computers could be reduced from their bulky size down to what we now consider normal microcomputer size. Of course, mainframes still remained, but think of their power: if a mainframe stayed the same size, and the new chips packed 1,000 times the circuitry into half the space, it could be something like 2,000 times more powerful.

The Fifth Generation - artificial intelligence

And then in 1980 Japan threw everyone out of whack with the announcement of a fifth-generation development project. This was originally intended to define intelligent computers, but has since been expanded to mean the integration of artificial intelligence and expert systems, tied to a natural language interpreter. AI is simply that: the computer can think abstractly and encompass concepts rather than just plain facts and data. Expert systems are the type of computer software that can give a recommendation after being given the necessary data.

A natural language interpreter is a program that extracts meaning from normal English, typed in at a keyboard. In other words, you can type in data in normal English, and the interpreter will convert this into language the machine can understand.
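
As a very rough sketch of the idea (in Python; the vocabulary and commands are invented purely for illustration), an interpreter of this kind scans the English typed at the keyboard for words it knows and turns them into instructions the machine can act on.

    # A very rough sketch of a natural language interpreter (the vocabulary
    # and commands are invented purely for illustration).  It scans ordinary
    # English for words it knows and turns them into machine commands.

    COMMANDS = {"add": "ADD", "sum": "ADD", "print": "PRINT", "show": "PRINT"}

    def interpret(sentence):
        """Return the commands and numbers implied by a plain-English sentence."""
        words = [w.strip(".,?!") for w in sentence.lower().split()]
        actions = [COMMANDS[w] for w in words if w in COMMANDS]
        numbers = [int(w) for w in words if w.isdigit()]
        return actions, numbers

    print(interpret("Please add 7 and 5, then print the answer."))
    # (['ADD', 'PRINT'], [7, 5])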

PEOPLE IN COMPUTING

NAPIER (1614) - Napier was the first person in the Western world to attempt to use a device (other than the brain) to calculate numbers. He invented logarithms, and with them a device to calculate them. The device was called 'Napier's Rods' or 'Napier's Bones'. His theory has been carried through into mathematics, but there has been no further development of it in the computing field.

PASCAL (1642) - Pascal invented a machine to help add numbers. Called the Pascaline, it worked by the user dialing up a set of numbers on a dial at the front and watching the result of the addition appear in the window at the top. Inside was a set of cogs and wheels, and this principle of mechanical addition has been carried through into the odometers of cars and the like. I think that there is a future in this type of addition, because people will always need devices like the odometer. (Who wants a digital readout on their bicycle speedo?)

JACQUARD (1801) - Jacquard invented the idea of a coding system to make a certain event occur. His original idea was a block of wood with holes punched in it, which made the decision of whether or not to put a stitch in the pattern in a loom. This idea was tremendous; it signaled the start of the concept of using a code to represent instructions. Babbage and Hollerith (both discussed later) used this concept in their own machines. Punched cards (cardboard was substituted for wood as the reading methods became more delicate) were very common as little as seven years ago, but since then there has been an ever-increasing trend toward more modern techniques. I think that this trend will continue - there is no future in punched cards.

BABBAGE (1825) - Babbage is hailed as the father of the computer. In summary, he built several devices that calculated, or were designed to calculate, numbers. For a more detailed description see Page #1.

REMINGTON (1850) - This man made typewriters and other low-key office equipment. The company merged with Rand to become Remington-Rand, and still survives today. Typewriters have become electrified, but I don't think that they will last all that much longer. You can't edit what you type - it's final. For this reason I would say that computers, or at least dedicated word-processors, will eventually take over from the typewriter.

BURROUGHS (1860) - Burroughs invented adding and calculating machines. He formed a company (named after him) which is still around; it used to sponsor the cricket and did all the computerised scoring for the ACB. Burroughs then purchased Sperry in 1986, and changed the name (no - not to Sparrows!) to Unisys. The company makes all sorts of computer hardware, mostly for the high-end business market (no PCs).

HOLLERITH (1885) - Hollerith was the first man to sell a real computing idea; he created a tallying machine to add up and collate all the data for the U.S. Census Bureau. The machine used punched cards (Jacquard's idea - see the end of page #5) to input data. The main difference between this and Babbage's idea was that Hollerith's machine used electricity as well as mechanics. Hollerith also formed a company - the Tabulating Machine Company, which later merged with the Time Recording Company and the Dayton Scale Company to form International Business Machines Corp. - IBM. This company is now the world's largest computer firm, with a turnover of many billions of U.S. dollars. The future of this company is virtually assured - excluding the possibility of a major executive bungle or a major discovery by another large firm, IBM will remain on top. They have just released a whole series of world-standard machines and a world-standard operating system to match, they have the backing of the world's largest software company, Microsoft (who co-wrote that operating system, OS/2, together with IBM), and they have already sold two million of their older, outdated PCs. Everyone copies them - they are the world leaders.

AIKEN (1944) - This man headed the team that built IBM's first computer, the Harvard Mark I. This machine was not very efficient, but it got IBM on the road to success.

MAUCHLY and ECKERT (1946) - Mauchly and Eckert made the world's first real electronic computer. It was called the ENIAC, had gigantic proportions, and possessed roughly the power of today's hand-held calculator. For a detailed description of this see page #2. Following the growth path and direction of computing, its modern-day equivalent is the modern computer; that is, the computer on which this was written is descended from the ENIAC.

NEUMANN (1948) - Neumann was the first man to make practical the concept of storing programs. Previously, when a new program was to be run, the whole computer had to be re-wired. Neumann solved this problem by keeping all programs on a storage medium until they were needed, and then loading them. As far as I can see, this is and will remain the only way to make a computer do different things; this is the future.

BRATTAIN, BARDEEN, SHOCKLEY (1959) - These three men headed the team that perfected the transistor. This was a monumental leap forward; for a detailed description of events, see page #3. Transistors are still used in computers now; they have been reduced in size astronomically, but they are still there. I foresee a technological change that will eventually make them obsolete; maybe the development of 'molecular' valves or biological computers.