
The Generations of Computers from 1940 to the Present

  

Date Posted: 3/19/2018 3:22:30 PM

Posted By: Bom  Membership Level: Bronze  Total Points: 57


The computer generations are the stages in the evolution of electronic circuitry, hardware, software, programming languages and other technological developments. The first electronic computers were built in the 1940s. Since then, a series of radical breakthroughs in electronics has occurred.

These include:
First Generation (1940 - 1956) - Vacuum Tubes
These computers used vacuum tubes as their electrical switching devices, and their CPU speeds were very low. The input devices were paper tapes or punched cards. Electronic typewriters, programmed to type by a paper tape or punched card reader, were used for printing reports. They had between 1K and 4K of RAM. These computers received their instructions in machine language, i.e. electrical on/off signals; there were no programming languages.
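The idea of machine language as raw on/off signals can be sketched in modern Python. The instruction format below is entirely hypothetical (a toy, not real first-generation machine code): an 8-bit word holding a 4-bit opcode and two 2-bit register fields.

```python
# Toy illustration of machine language as on/off (binary) signals.
# The format and opcode are assumptions for illustration only.
OPCODE_ADD = 0b0011  # hypothetical opcode for "ADD"

def encode_add(reg_a, reg_b):
    """Pack a toy ADD instruction into one byte: 4-bit opcode + two 2-bit registers."""
    return (OPCODE_ADD << 4) | (reg_a << 2) | reg_b

word = encode_add(1, 2)
print(format(word, "08b"))  # each bit is one electrical on/off signal
```

First-generation programmers effectively worked at this level, setting each bit of every instruction by hand.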

The application software available was for tabulating, the forerunner of today's spreadsheets. Since computers could only perform one task at a time, work was submitted in batches; the operating systems of the 1950s were therefore called batch processing systems.
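The batch idea described above can be sketched with a simple job queue. This is a modern illustration of the concept (an assumption, not how a 1950s system was actually programmed): jobs wait in line and each runs to completion before the next begins.

```python
# Sketch of batch processing: queued jobs run strictly one at a time.
from collections import deque

def run_batch(jobs):
    """Run queued (name, task) jobs one after another, as a batch system would."""
    queue = deque(jobs)
    results = []
    while queue:
        name, task = queue.popleft()    # take the next job "card" from the queue
        results.append((name, task()))  # the machine is dedicated to this one job
    return results

jobs = [("payroll", lambda: sum(range(5))), ("tabulate", lambda: 2 * 21)]
print(run_batch(jobs))  # [('payroll', 10), ('tabulate', 42)]
```

No job can interrupt another; a long job simply makes every job behind it wait, which is exactly the limitation later time-sharing systems addressed.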

Second Generation (1956 - 1964) - Transistors

These computers used transistors, which were much smaller, cooler and more reliable. Processing speed improved by a factor of five.

They utilized keyboards and video display monitors. The first light pen was used as an input device for drawing on the face of the monitor, and high-speed printers came into use.

RAM grew from 4K to 32K, making it possible for the computer to hold more data and instructions. Use of magnetic tapes and disks was introduced to replace permanent storage on computer cards. The IBM 1401 didn’t have an operating system; instead it used a special language called Symbolic Programming System (SPS) to create programs. This generation marked the common use of high-level languages. FORTRAN (1957) was used for scientific purposes and COBOL (1961) for business purposes. There were also improvements in system software.

Almost every computer had its unique operating system, programming language and application software.

The Third Generation (1964 - 1971) - Integrated Circuits

This generation started with the introduction of the IBM System/360 in 1964, which used integrated circuits (a number of electrical components on a single slice of silicon). Early versions used hybrid integrated technology, in which separate transistors and diodes were inserted into circuits.

There were several improvements, such as:
Increased processing speeds
Increased accuracy
Integration of hardware and software
The ability to perform several operations simultaneously
Data communication advances
Many high-level programming languages were developed, among them BASIC and Pascal. IBM created the OS/360 operating system. Software growth was enhanced by unbundling, or selling software separately from the hardware.

The Fourth Generation (1971 - 1988) - Microprocessors

Large-scale integration, a technique for packing more and more circuitry onto a single chip, was developed. The fourth generation brought major advances over second-generation mainframes and third-generation minicomputers, and added a brand new category of machine: the microcomputer, or personal computer. There was a dramatic increase in processor speed. The keyboard and the video monitor became standard I/O devices, and the mouse began playing a major role.

Fourth-generation programming languages (4GLs) were also introduced.


The Fifth Generation (1983-present) – Artificial Intelligence

Super chip development is truly at the center of the fifth generation (a chip is a thin piece of silicon on which electronic components are etched). Much advancement is still going on, e.g. the use of object-oriented languages and artificial intelligence.
The use of parallel processing and superconductors is helping to make artificial intelligence a reality. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.
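The parallel-processing idea mentioned above can be sketched with Python's standard library. This is a modern illustration only (fifth-generation projects used specialized hardware, not Python): several workers process inputs at the same time instead of one after another.

```python
# Sketch of parallel processing: distribute independent tasks across workers.
from concurrent.futures import ThreadPoolExecutor

def work(n):
    """A stand-in task; real workloads would be CPU- or I/O-heavy."""
    return n * n

with ThreadPoolExecutor(max_workers=4) as pool:
    # map() spreads the inputs across the worker pool and
    # gathers the results back in the original input order.
    results = list(pool.map(work, range(8)))

print(results)  # squares of 0..7, computed across several workers
```

The key contrast with the first generation's batch model is that here several tasks are in flight simultaneously, rather than each job waiting for the previous one to finish.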



