GENERATIONS OF COMPUTERS

FIRST GENERATION - Vacuum Tubes (1940 - 1956)

The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate and in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions.

First generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts.

First generation computers are characterised by the use of vacuum tubes, which served for calculation as well as for storage and control. Later, magnetic tapes and magnetic drums were introduced as storage media. The first general-purpose vacuum tube computer, ENIAC, was developed for US Army Ordnance to calculate ballistic firing tables in WWII. It had about 17,000 vacuum tubes, weighed 30 tons, covered about 1,000 square feet of floor, and consumed roughly 130 to 140 kilowatts of electricity. Its clock speed was about 100 kHz. In addition to ballistics, ENIAC's fields of application included weather prediction, atomic-energy calculations, cosmic-ray studies, thermal ignition, random-number studies, wind-tunnel design, and other scientific uses. Electronic computers were not applied to commercial problems until about 1951.

 

This is an example of a vacuum tube based circuit used in a first generation computer (a Burroughs), pictured here next to a transistor based circuit with similar functionality from a second generation computer (the IBM 1620). The vacuum tubes (at the top of the circuit) have been damaged by overheating. We suspect that this particular circuit is a 4-bit register. Circuits created in this way were extremely bulky: a 32-bit ADD circuit would require 800 logic gates using a total of 1,504 transistors, and in a vacuum tube based computer that many tubes would take up a space about the size of a refrigerator.
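How such an ADD circuit works can be sketched in software. Below is a minimal Python model of a ripple-carry adder built by chaining one-bit full adders; it is purely illustrative, and its gate budget (five gates per bit, so 160 for 32 bits) comes from the simplest textbook design rather than from the Burroughs circuit or the figures quoted above.

# Toy model of a ripple-carry adder built from Boolean logic gates.
# Illustrative only: real adder designs (and the gate/transistor
# counts quoted above) vary with the circuit style used.

def full_adder(a, b, carry_in):
    """One-bit full adder: 2 XOR, 2 AND, and 1 OR gate (5 gates)."""
    s = (a ^ b) ^ carry_in                      # sum bit
    carry_out = (a & b) | ((a ^ b) & carry_in)  # carry bit
    return s, carry_out

def ripple_carry_add(x, y, width=32):
    """Add two integers by chaining `width` full adders, LSB first."""
    carry, result = 0, 0
    for i in range(width):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result

assert ripple_carry_add(1_000_000, 2_345_678) == 3_345_678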
 

[Picture: vacuum tube circuit]

[Picture: vacuum tube]

This is a small vacuum tube of the kind used in first generation computers. Here you can clearly see the effect of overheating, which has left a black stain on the inside of the glass. Constant overheating and burnout in the vacuum tubes of ENIAC, the first electronic computing device, led AT&T Bell Telephone Laboratories researchers John Bardeen, Walter Brattain, and William Shockley to seek a suitable alternative to the commercially unreliable vacuum tube. In 1947 the three successfully demonstrated the principle of amplifying an electrical current using a solid semiconducting material, germanium, forming the basic concept behind the transistor.

 

SECOND GENERATION - Transistors (1956 - 1963)


Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat, which could damage the computer, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.

Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology. The first computers of this generation were developed for the atomic energy industry.
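To see what the move from binary machine language to symbolic assembly bought programmers, here is a toy sketch in Python. The three-instruction machine, its mnemonics, and its opcodes are invented purely for illustration and do not correspond to any real second-generation computer.

# Toy illustration of assembly vs. machine language: an "assembler"
# maps human-readable mnemonics to the raw binary words a machine
# executes. The instruction set and encoding here are hypothetical.

OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011}

def assemble(line):
    """Translate one 'MNEMONIC operand' line into an 8-bit machine word:
    a 4-bit opcode in the high nibble, a 4-bit address in the low nibble."""
    mnemonic, operand = line.split()
    return (OPCODES[mnemonic] << 4) | int(operand)

program = ["LOAD 5", "ADD 6", "STORE 7"]   # symbolic: readable by people
machine_code = [assemble(line) for line in program]
print([f"{word:08b}" for word in machine_code])
# ['00010101', '00100110', '00110111'] -- the raw binary form that
# machine-language programmers had to write out directly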

THIRD GENERATION - Integrated Circuits (1964 - 1971)

The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturised and placed on silicon chips (semiconductor devices), which drastically increased the speed and efficiency of computers.

Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.
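The heart of such an operating system is a central program that shares the machine among several jobs. Below is a minimal sketch of the time-sharing idea in Python, assuming a simple round-robin policy; the job names and workloads are invented.

# Toy round-robin scheduler: one central loop shares the machine among
# several "applications", the core idea behind third generation
# time-sharing operating systems.
from collections import deque

def run_round_robin(jobs, quantum=2):
    """jobs maps each job name to its remaining work units; every turn a
    job runs for at most `quantum` units, then rejoins the queue."""
    queue = deque(jobs.items())
    while queue:
        name, remaining = queue.popleft()
        time_slice = min(quantum, remaining)
        print(f"running {name} for {time_slice} unit(s)")
        if remaining > time_slice:
            queue.append((name, remaining - time_slice))

run_round_robin({"payroll": 3, "editor": 5, "compiler": 2})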

Integrated circuits, otherwise known as chips or microchips, were first successfully demonstrated by Jack Kilby. A chip is a small piece of silicon with active devices, called transistors, fabricated on its surface. Chips replaced bulky vacuum tubes in computers, allowing them to contain far more circuitry in a far smaller space. At first a chip held only a handful of transistors, but the technology improved until millions (and eventually billions) of transistors could fit on a single chip, allowing much faster computation in a very small space. It was the microchip that allowed computers to evolve from room-sized machines that could do little more than add and subtract numbers into the computers of today.

FOURTH GENERATION - Microprocessors (1971 - Present)

The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer (central processing unit, memory, and input/output controls) on a single chip.

In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.

As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth generation computers also saw the development of GUIs, the mouse and handheld devices.

Fourth generation computers are the modern-day computers. Their size went down with improvements in integrated circuits: Very Large Scale Integration (VLSI) and Ultra Large Scale Integration (ULSI) ensured that millions of components could fit onto a small chip. This reduced the size and price of computers while at the same time increasing their power, efficiency and reliability.
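As a rough illustration of the scaling involved, the Intel 4004 of 1971 contained about 2,300 transistors. Assuming transistor counts double roughly every two years (Moore's observation), a few lines of Python show how chips reached the millions:

# Rough Moore's-law arithmetic: starting from the Intel 4004's ~2,300
# transistors in 1971 and doubling every two years, transistor counts
# per chip pass one million before 1990.
transistors, year = 2_300, 1971
while transistors < 1_000_000:
    year += 2
    transistors *= 2
print(f"~{transistors:,} transistors by about {year}")
# ~1,177,600 transistors by about 1989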

FIFTH GENERATION - Artificial Intelligence (AI) (Present and Beyond)

Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.

The fifth generation of modern computers is still in development, but these machines are based on artificial intelligence and parallel processing. Very Large Scale Integration (VLSI) technology has allowed more circuits to be put onto a single chip, and with advancements in software and hardware, the potential for even more powerful and faster machines is extremely high.
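Parallel processing itself can be shown in a few lines: several workers compute pieces of one task simultaneously. This is a minimal Python sketch; the workload (summing squares over ranges) is an arbitrary stand-in.

# Minimal sketch of parallel processing: a pool of worker processes
# computes independent chunks of one task at the same time.
from concurrent.futures import ProcessPoolExecutor

def sum_of_squares(bounds):
    lo, hi = bounds
    return sum(n * n for n in range(lo, hi))

if __name__ == "__main__":
    chunks = [(0, 250_000), (250_000, 500_000),
              (500_000, 750_000), (750_000, 1_000_000)]
    with ProcessPoolExecutor() as pool:
        partials = pool.map(sum_of_squares, chunks)  # chunks run in parallel
    print(sum(partials))  # same answer as one sequential loop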

As computers begin to develop the ability to 'think', that is, to come to logical conclusions, the way they interact with users is set to change. They have begun to recognise spoken words and personal identifiers such as fingerprints or a person's facial profile, and to store such data in the vast databases they now maintain.

The goal for the fifth generation of computers is for them to recognise and respond to natural language, and to be capable of learning and self-organisation; in other words, to see things the way humans do.


Generation       | First         | Second        | Third                     | Fourth
-----------------|---------------|---------------|---------------------------|------------------------------------------
Time Frame       | 1940 - 1956   | 1956 - 1963   | 1964 - 1971               | 1971 - Present
Technology       | Vacuum Tubes  | Transistors   | Integrated Circuits       | Microprocessors (LSI, VLSI)
Type of Computer | Mainframes    | Mainframes    | Mainframes, Minicomputers | Mainframes, Minicomputers, Microcomputers
External Storage | Punched Cards | Magnetic Tape | Magnetic Storage          | Mass Storage
Operating System | Single-user jobs, scheduled manually | Single-user jobs, scheduled automatically | Multiple users, time sharing | Multiple users, distributed systems