The history of computer development is often described in terms of the generations of computing devices. Each generation is characterized by a major technological development that fundamentally changed the way computers operate, resulting in increasingly smaller, cheaper, more powerful, more efficient and more reliable devices. Today, computers and IT pervade nearly every part of the world, since almost every kind of technology has come to depend on computer processing, yet much remains to be done in science, such as in research missions. The computer generations, each with a brief description and its approximate period, are given below:
First Generation - 1940-1956: Vacuum Tubes
The first generation computers used vacuum tubes for circuitry and magnetic drums for memory. They were often enormous, occupying entire rooms, were very expensive to operate, consumed a great deal of electricity, and generated a lot of heat, which was often the cause of malfunctions. First generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts.
Examples of first generation computers include the UNIVAC (Universal Automatic Computer) and the ENIAC (Electronic Numerical Integrator and Computer). The UNIVAC was the first commercially produced computer in the United States; the first unit was delivered to the U.S. Census Bureau in 1951.
Second Generation - 1956-1963: Transistors
Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.
Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in memory, which moved from magnetic drum to magnetic core technology. The first computers of this generation were developed for the atomic energy industry.
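To make the contrast concrete, here is a minimal sketch in Python of what an assembler does: it translates symbolic mnemonics into binary machine words. The three mnemonics (LOAD, ADD, STORE) and their opcodes are invented purely for illustration and do not correspond to any real instruction set.

    # Toy assembler: translates symbolic mnemonics into binary machine code.
    # The instruction set below is invented purely for illustration.
    OPCODES = {"LOAD": "0001", "ADD": "0010", "STORE": "0011"}

    def assemble(program):
        """Turn source lines such as 'ADD 5' into 8-bit machine words."""
        machine_code = []
        for line in program:
            mnemonic, operand = line.split()
            # Opcode in the high four bits, operand address in the low four.
            machine_code.append(OPCODES[mnemonic] + format(int(operand), "04b"))
        return machine_code

    for word in assemble(["LOAD 2", "ADD 5", "STORE 9"]):
        print(word)  # e.g. "00010010" for LOAD 2

A programmer of this era wrote only the symbolic form; the assembler, not the programmer, produced the binary words that first-generation programmers had to write by hand.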
Third Generation - 1964-1971: Integrated Circuits
The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers. Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors; packed with the technology of their day, they reached the world as mass-market machines.
Fourth Generation - 1971-Present: Microprocessors
The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, placed all the components of the computer, from the central processing unit and memory to input/output controls, on a single chip.
In 1981, IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved into many areas of life as more and more everyday products began to use them. As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. The fourth generation also saw the development of GUIs (graphical user interfaces), the mouse, and handheld devices.
Fifth Generation - Present and Beyond: Artificial Intelligence
Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.
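As a rough, modern illustration of parallel processing, the sketch below uses Python's standard multiprocessing module to split a computation across several processor cores at once; the workload (summing squares) is invented purely for illustration.

    # Minimal parallel-processing sketch using Python's standard library.
    # The workload is a stand-in; real AI systems parallelize far larger tasks.
    from multiprocessing import Pool

    def sum_of_squares(n):
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        workloads = [10_000, 20_000, 30_000, 40_000]
        with Pool(processes=4) as pool:
            # Each workload is handed to a separate process and runs in parallel.
            results = pool.map(sum_of_squares, workloads)
        print(results)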
Robotics is one example of artificial intelligence: a robot operates on its own, following programmed instructions stored in its memory. Among the newest inventions in the modern computing world are the Intel Core i-series processors, which are in widespread use at present.