Thursday, August 16, 2012

Generations of Computers

First Generation (1940-1956): Vacuum Tubes
The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. A magnetic drum, also referred to simply as a drum, is a metal cylinder coated with magnetic iron-oxide material on which data and programs can be stored. Magnetic drums were once used as primary storage devices but were later relegated to auxiliary storage.

They were very expensive to operate, and in addition to using a great deal of electricity they generated a lot of heat, which was often the cause of malfunctions. First-generation computers relied on machine language to perform operations, and they could only solve one problem at a time. Machine languages are the only languages understood by computers. While easily understood by computers, machine languages are almost impossible for humans to use because they consist entirely of numbers. Computer programmers, therefore, use either a high-level programming language or an assembly language. An assembly language contains the same instructions as a machine language, but the instructions and variables have names instead of being just numbers. Every CPU has its own unique machine language, so programs must be rewritten or recompiled to run on different types of computers. Input was based on punched cards and paper tape, and output was displayed on printouts.
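The relationship between assembly language and machine language can be sketched in miniature. The instruction set below is invented purely for illustration and does not correspond to any real CPU; the point is that each named mnemonic maps one-to-one onto a numeric opcode.

```python
# A toy illustration: assembly mnemonics map one-to-one onto numeric
# machine-language opcodes. This instruction set is invented for the
# example and does not correspond to any real processor.

OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}

def assemble(program):
    """Translate (mnemonic, operand) pairs into numeric machine code."""
    machine_code = []
    for mnemonic, operand in program:
        machine_code.append((OPCODES[mnemonic], operand))
    return machine_code

source = [("LOAD", 10), ("ADD", 32), ("STORE", 11), ("HALT", 0)]
print(assemble(source))
# The assembled output is pure numbers -- the only form the CPU
# understands, and the form first-generation programmers wrote by hand.
```

The assembler is doing nothing more than a table lookup, which is why assembly names are so much easier for humans while costing the machine nothing.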

fig: first generation of computer

The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercial computer; the first unit was delivered to the U.S. Census Bureau in 1951. ENIAC, an acronym for Electronic Numerical Integrator and Computer, was the world's first operational electronic digital computer, developed by Army Ordnance to compute World War II ballistic firing tables. The ENIAC, weighing 30 tons, using 200 kilowatts of electric power, and consisting of 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors, was completed in 1945. In addition to ballistics, the ENIAC's fields of application included weather prediction, atomic-energy calculations, cosmic-ray studies, thermal ignition, random-number studies, wind-tunnel design, and other scientific uses. The ENIAC soon became obsolete as the need arose for faster computing speeds.

Second Generation (1956-1963): Transistors
Transistors replaced vacuum tubes and ushered in the second generation of computers. A transistor is a device composed of semiconductor material that amplifies a signal or opens or closes a circuit. Invented in 1947 at Bell Labs, transistors have become the key ingredient of all digital circuits, including computers. Today's latest microprocessors contain tens of millions of microscopic transistors. Prior to the invention of the transistor, digital circuits were composed of vacuum tubes, which had many disadvantages: they were much larger, required more energy, dissipated more heat, and were more prone to failure. It's safe to say that without the invention of transistors, computing as we know it today would not be possible.

fig: second generation of computer

The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient, and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output. They moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in memory, which moved from magnetic-drum to magnetic-core technology.

Third Generation (1964-1971): Integrated Circuits
The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers. Instead of punched cards and printouts, users interacted with third-generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.

fig: third generation of computer


Fourth Generation (1971-present)

The microprocessor brought about the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip containing a CPU. In the world of personal computers, the terms microprocessor and CPU are used interchangeably. At the heart of all personal computers and most workstations sits a microprocessor. Microprocessors also control the logic of almost all digital devices, from clock radios to fuel-injection systems for automobiles.

Three basic characteristics differentiate microprocessors:

  1. Instruction Set: The set of instructions that the microprocessor can execute.
  2. Bandwidth: The number of bits processed in a single instruction.
  3. Clock Speed: Given in megahertz (MHz), the clock speed determines how many instructions per second the processor can execute.


For both bandwidth and clock speed, the higher the value, the more powerful the CPU. For example, a 32-bit microprocessor that runs at 50 MHz is more powerful than a 16-bit microprocessor that runs at 25 MHz. The CPU is the brain of the computer. Sometimes referred to simply as the processor or central processor, the CPU is where most calculations take place. In terms of computing power, the CPU is the most important element of a computer system. On large machines, CPUs require one or more printed circuit boards. On personal computers and small workstations, the CPU is housed in a single chip called a microprocessor.
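The comparison above can be made concrete with a rough calculation. This assumes one instruction completes per clock cycle, which real processors rarely achieve, so treat the result as a crude figure of merit rather than a real benchmark.

```python
# A crude figure of merit for the two CPUs compared above, assuming
# one instruction per clock cycle (an idealization).

def throughput_bits_per_second(bandwidth_bits, clock_hz):
    """Bits processed per second if one instruction completes per cycle."""
    return bandwidth_bits * clock_hz

cpu_a = throughput_bits_per_second(32, 50_000_000)   # 32-bit @ 50 MHz
cpu_b = throughput_bits_per_second(16, 25_000_000)   # 16-bit @ 25 MHz
print(cpu_a // cpu_b)  # the 32-bit/50 MHz chip moves 4x the bits
```

Doubling both the bandwidth and the clock speed multiplies raw throughput fourfold, which is why the two characteristics are listed separately.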

Two typical components of a CPU are:

  • The arithmetic logic unit (ALU), which performs arithmetic and logical operations.
  • The control unit, which extracts instructions from memory and decodes and executes them, calling on the ALU when necessary.
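The division of labor between the two components can be sketched as a minimal fetch-decode-execute loop. The two-field instruction format and the specific operations are invented for this example; in a real CPU both units are hardware, not software.

```python
# A minimal sketch of the fetch-decode-execute cycle: the "control
# unit" loop fetches and decodes instructions, delegating arithmetic
# and logical operations to an ALU function. The instruction format
# is invented for this illustration.

def alu(op, a, b):
    """Arithmetic logic unit: one arithmetic or logical operation."""
    return {"ADD": a + b, "SUB": a - b, "AND": a & b, "OR": a | b}[op]

def run(program):
    accumulator = 0
    pc = 0                           # program counter (control unit state)
    while pc < len(program):
        op, operand = program[pc]    # fetch the next instruction
        pc += 1
        if op == "HALT":             # decode: stop on HALT
            break
        accumulator = alu(op, accumulator, operand)  # execute via the ALU
    return accumulator

print(run([("ADD", 5), ("ADD", 7), ("SUB", 2), ("HALT", 0)]))  # 10
```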


In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use them. As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth-generation computers also saw the development of GUIs, the mouse, and handheld devices.

Wednesday, August 15, 2012

Computer

What is a computer?
The word computer is derived from the Latin word "computare", which means to calculate. Basically, a computer is defined as a programmable machine: an advanced electronic device that accepts raw data as input from the user, processes these data under the control of a set of predefined instructions called a program, and then gives the result as output. It can perform both numerical and non-numerical (arithmetic and logical) operations.
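The input-process-output definition above can be sketched in miniature. The "program" here is just an ordered list of operations applied to the input, which is the essence of a stored set of instructions.

```python
# The definition in miniature: accept input, process it under a
# stored "program" (here, a list of operations), return the result.

def computer(raw_data, program):
    """Apply each instruction of the program to the data in order."""
    result = raw_data
    for instruction in program:
        result = instruction(result)
    return result                    # the output

# Input 4; program: square it, then add 1.
print(computer(4, [lambda x: x * x, lambda x: x + 1]))  # 17
```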

Computers are categorized by both size and the number of people who can use them concurrently. Supercomputers are sophisticated machines designed to perform complex calculations at maximum speed; they are used to model very large dynamic systems, such as weather patterns. Mainframes, the largest and most powerful general-purpose systems, are designed to meet the computing needs of a large organization by serving hundreds of computer terminals at the same time. Minicomputers, though somewhat smaller, are also multiuser computers, intended to meet the needs of a small company by serving up to a hundred terminals. Microcomputers, computers powered by a microprocessor, are subdivided into personal computers and workstations, the latter typically incorporating RISC processors. Although microcomputers were originally single-user computers, the distinction between them and minicomputers has blurred as microprocessors have become more powerful. Linking multiple microcomputers together through a local area network, or joining multiple microprocessors together in a parallel-processing system, has enabled smaller systems to perform tasks once reserved for mainframes, and the techniques of grid computing have enabled computer scientists to utilize the idle processing power of connected computers.
Advances in the technology of integrated circuits have spurred the development of smaller and more powerful general-purpose digital computers. Not only has this reduced the size of the large, multi-user mainframe computers (which in their early years were large enough to walk through) to that of large pieces of furniture, but it has also made possible powerful, single-user personal computers and workstations that can sit on a desktop. These, because of their relatively low cost and versatility, have largely replaced typewriters in the workplace and rendered the analog computer inefficient.
  •  Analog Computers
An analog computer represents data as physical quantities and operates on the data by manipulating the quantities. It is designed to process data in which the variable quantities vary continuously (see analog circuit); it translates the relationships between the variables of a problem into analogous relationships between electrical quantities, such as current and voltage, and solves the original problem by solving the equivalent problem, or analog, that is set up in its electrical circuits. Because of this feature, analog computers were especially useful in the simulation and evaluation of dynamic situations, such as the flight of a space capsule or the changing weather patterns over a certain area. The key component of the analog computer is the operational amplifier, and the computer's capacity is determined by the number of amplifiers it contains (often over 100). Although analog computers are commonly found in such forms as speedometers and watt-hour meters, they largely have been made obsolete for general-purpose mathematical computations and data storage by digital computers.
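Where an analog computer solves a differential equation continuously in its circuits, a digital machine can only approximate the same problem in discrete steps. As a point of comparison, the sketch below integrates dy/dt = -y (exponential decay) with simple Euler steps, a task an op-amp integrator performs natively and continuously.

```python
# Digital approximation of a task an analog integrator does natively:
# solving dy/dt = -y step by step with the Euler method.

def euler_decay(y0, dt, steps):
    """Integrate dy/dt = -y from y(0) = y0 over steps of size dt."""
    y = y0
    for _ in range(steps):
        y += dt * (-y)       # advance the solution one discrete step
    return y

approx = euler_decay(1.0, 0.001, 1000)   # approximate y(1) = e**-1
print(round(approx, 3))
```

The digital answer converges to the true value only as the step size shrinks; the analog circuit has no step size at all, which is why analog machines excelled at simulating dynamic systems.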
  • Digital Computers
A digital computer is designed to process data in numerical form (see digital circuit); its circuits perform directly the mathematical operations of addition, subtraction, multiplication, and division. The numbers operated on by a digital computer are expressed in the binary system; binary digits, or bits, are 0 and 1, so that 0, 1, 10, 11, 100, 101, etc., correspond to 0, 1, 2, 3, 4, 5, etc. Binary digits are easily expressed in the computer circuitry by the presence (1) or absence (0) of a current or voltage. A series of eight consecutive bits is called a "byte"; the eight-bit byte permits 256 different "on-off" combinations. Each byte can thus represent one of up to 256 alphanumeric characters, and such an arrangement is called a "single-byte character set" (SBCS); the de facto standard for this representation is the extended ASCII character set. Some languages, such as Japanese, Chinese, and Korean, require more than 256 unique symbols. The use of two bytes, or 16 bits, for each symbol, however, permits the representation of up to 65,536 characters or ideographs. Such an arrangement is called a "double-byte character set" (DBCS); Unicode is the international standard for such a character set. One or more bytes, depending on the computer's architecture, is sometimes called a digital word; it may specify not only the magnitude of the number in question, but also its sign (positive or negative), and may also contain redundant bits that allow automatic detection, and in some cases correction, of certain errors (see code; information theory). A digital computer can store the results of its calculations for later use, can compare results with other data, and on the basis of such comparisons can change the series of operations it performs. Digital computers are used for reservations systems, scientific investigation, data-processing and word-processing applications, desktop publishing, electronic games, and many other purposes.
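The points above about binary counting, the 256 combinations of a byte, and single- versus double-byte character sets can be checked directly. Note that UTF-16 is used below merely as a concrete encoding that spends two bytes per ordinary character; modern Unicode text is more often stored as variable-width UTF-8.

```python
# Checking the byte arithmetic described above.

print(bin(5))                         # '0b101': binary for decimal 5
print(2 ** 8)                         # 256 on-off combinations per byte

# A single-byte character set fits 'A' in one byte...
print(len("A".encode("ascii")))       # 1
# ...while an ideograph needs two bytes in a 16-bit encoding.
print(len("語".encode("utf-16-le")))  # 2
```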

  • Hybrid Computer
Hybrid computers are computers designed to provide functions and features found in both analog and digital computers. The idea behind this combined, or hybrid, model is to create a working unit that offers the best of both types of computer. In most designs, the analog components of the equipment provide efficient processing of differential equations, while the digital aspects of the computer address the logical operations associated with the system.

By creating this type of integrated computer, the benefits of both analog and digital computing are readily available. A hybrid computer is extremely fast when it comes to managing equations, even when those calculations are extremely complicated. This advantage is made possible by the analog components inherent in the design of the equipment.