Thursday, 12 November 2009

History of Computers

Source - Wikipedia

The history of computing hardware is the record of the constant drive to make computer hardware faster, cheaper, and capable of storing more data.

Before the development of the general-purpose computer, most calculations were done by humans. Tools to help humans calculate are generally called calculators. Calculators continue to develop, but computers add the critical element of conditional response, allowing automation of numerical calculation and, more generally, of many symbol-manipulation tasks. Computer technology has undergone profound changes every decade since the 1940s.
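
As a loose illustration of that "conditional response" point (my own sketch in Python, not part of the original article), compare a fixed calculator-style operation with a procedure that branches on its own intermediate results:

```python
# A calculator performs one fixed operation per request:
def calculator_square(x):
    return x * x

# A computer adds conditional response: it can repeat a calculation and
# branch on intermediate results, automating the whole procedure. Here,
# Newton's method for square roots keeps refining a guess until the
# machine itself decides the answer is close enough.
def newton_sqrt(n, tolerance=1e-10):
    guess = n if n >= 1 else 1.0
    while abs(guess * guess - n) > tolerance:  # the conditional response
        guess = (guess + n / guess) / 2.0      # refine and try again
    return guess

print(newton_sqrt(2.0))  # ~1.4142135623...
```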

Computing hardware has become a platform for uses other than computation, such as automation, communication, control, entertainment, and education. Each field in turn has imposed its own requirements on the hardware, which has evolved in response to those requirements.

Aside from written numerals, the first aids to computation were purely mechanical devices that required the operator to set up the initial values of an elementary arithmetic operation, then manipulate the device by hand to obtain the result. An example is the slide rule, where numbers are represented by points on a logarithmic scale and computation is performed by setting a cursor and aligning sliding scales. Numbers could be represented in a continuous "analog" form, in which a length or other physical property was proportional to the number. Alternatively, numbers could be represented as digits, automatically manipulated by a mechanism. Although this approach required more complex mechanisms, it made for greater precision of results.
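
To make the slide-rule idea concrete, here is a small sketch of the principle (in Python, my own illustration rather than a description of any particular device): because log(a) + log(b) = log(a*b), multiplying two numbers reduces to adding two lengths on logarithmic scales.

```python
import math

# A slide rule multiplies by adding lengths: each number is mapped to a
# physical length proportional to its logarithm (the continuous "analog"
# form), the sliding scales are aligned so the lengths add end to end,
# and the total is read back off the logarithmic scale as a product.
def slide_rule_multiply(a, b):
    length_a = math.log10(a)           # position of 'a' on the fixed scale
    length_b = math.log10(b)           # position of 'b' on the sliding scale
    combined = length_a + length_b     # aligning the scales adds the lengths
    return 10 ** combined              # read the product off the scale

print(slide_rule_multiply(3.0, 7.0))   # ~21.0; a real rule is limited
                                       # by how precisely it can be read
```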

Both analog and digital mechanical techniques continued to be developed, producing many practical computing machines. Electrical methods rapidly improved the speed and precision of calculating machines, at first by providing motive power for mechanical calculating devices, and later directly as the medium for representing numbers. Numbers could be represented by voltages or currents and manipulated by linear electronic amplifiers. Alternatively, numbers could be represented as discrete binary or decimal digits, and electrically controlled switches and combinational circuits could perform mathematical operations.
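
The last sentence is worth unpacking. Here is a small sketch (again in Python, my own illustration, not from the article) of a combinational circuit: a one-bit full adder built only from logic gates, chained into a ripple-carry adder that adds binary digits the way a network of electrically controlled switches would.

```python
# A full adder is pure combinational logic: its outputs depend only on
# the present inputs, exactly as with a network of switches and gates.
def full_adder(a, b, carry_in):
    sum_bit = a ^ b ^ carry_in                  # XOR gates
    carry_out = (a & b) | (carry_in & (a ^ b))  # AND and OR gates
    return sum_bit, carry_out

# Chain full adders to add two multi-bit numbers (a ripple-carry adder).
def ripple_carry_add(a_bits, b_bits):
    """a_bits and b_bits are lists of 0/1, least significant bit first."""
    carry = 0
    result = []
    for a, b in zip(a_bits, b_bits):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    result.append(carry)  # the final carry becomes the high bit
    return result

# 6 (0110) + 3 (0011) = 9 (01001), written least significant bit first:
print(ripple_carry_add([0, 1, 1, 0], [1, 1, 0, 0]))  # [1, 0, 0, 1, 0]
```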

The invention of electronic amplifiers made calculating machines much faster than their mechanical or electromechanical predecessors. Vacuum tube amplifiers gave way to discrete transistors, and then rapidly to monolithic integrated circuits. By overcoming the "tyranny of numbers", integrated circuits made high-speed, low-cost digital computers a widespread commodity.

This article covers major developments in the history of computing hardware, and attempts to put them in context. For a detailed timeline of events, see the computing timeline article. The history of computing article treats methods intended for pen and paper, with or without the aid of tables. Since all computers rely on digital storage, and tend to be limited by the size and speed of memory, the history of computer data storage is tied to the development of computers.
