Sunday, 25 September 2011

HISTORY OF COMPUTERS

The history of computers stretches back to ancient times. Since those early days, data processing has been performed by humans. Humans have also devised mechanical and electronic tools to help with calculation and data processing so that results can be obtained faster. The computers we encounter today are the product of a long evolution of human inventions, from early mechanical devices to modern electronic ones.

Today, computers and their supporting devices are part of nearly every aspect of life and work. Modern computers are capable of far more than ordinary mathematical calculations. Examples include the checkout systems in supermarkets that read product codes, the telephone exchanges that handle millions of calls, and the computer networks and the Internet that connect places all over the world.


The history of computers can be divided into the following periods:

* Traditional Calculating Tools and Mechanical Calculators

* First Generation Computers

* Second Generation Computers

* Third Generation Computers

* Fourth Generation Computers

* Fifth Generation Computers



TRADITIONAL CALCULATING TOOLS AND MECHANICAL CALCULATORS

The abacus, which emerged about 5,000 years ago in Asia Minor and is still used in some places today, can be considered the beginning of computing machines. The device allows its users to perform calculations by sliding beads arranged on a rack. Merchants of that era used the abacus to calculate trade transactions. With the spread of pencil and paper, particularly in Europe, the abacus lost its popularity.


After almost twelve centuries, another invention in computing machines appeared. In 1642, Blaise Pascal (1623-1662), who was then 18 years old, invented what he called a numerical wheel calculator to help his father with tax calculations.


This rectangular brass box, called the Pascaline, used eight toothed wheels to add numbers of up to eight digits. The device was a base-ten counter. Its weakness was that it was limited to addition.


In 1694, the German mathematician and philosopher Gottfried Wilhelm von Leibniz (1646-1716) improved on the Pascaline by creating a machine that could also multiply. Like its predecessor, this mechanical device worked with toothed wheels. By studying the notes and drawings made by Pascal, Leibniz was able to refine his instrument.


It was only around 1820 that mechanical calculators became popular. Charles Xavier Thomas de Colmar invented a machine that could perform the four basic arithmetic functions. Colmar's mechanical calculator, the arithometer, offered a more practical approach to calculation because it could perform addition, subtraction, multiplication, and division. Thanks to this capability, the arithometer was widely used up until World War I. Together with Pascal and Leibniz, Colmar helped usher in the era of mechanical computing.


The true beginnings of the computer as we know it were laid by a British mathematics professor, Charles Babbage (1791-1871). In 1812, Babbage noticed a natural fit between mechanical machines and mathematics: mechanical machines are very good at performing the same task repeatedly without error, while mathematics often requires the simple repetition of a particular sequence of steps. The problem then became one of building mechanical machines to serve those needs. Babbage's first attempt to address this problem came in 1822, when he proposed a machine for calculating differential equations. The machine was called the Difference Engine. Powered by steam, it could store a program and could perform calculations and print the results automatically.


After working on the Difference Engine for ten years, Babbage was inspired to begin building the first general-purpose computer, which he called the Analytical Engine. Babbage's assistant, Augusta Ada King (1815-1852), played an important role in the development of this machine. She helped revise the plans, sought funding from the British government, and communicated the specifications of the Analytical Engine to the public. Her deep understanding of the machine also allowed her to write instructions to be fed into it, which makes her the first female programmer. In 1980, the U.S. Department of Defense named a programming language Ada in her honor.


Babbage's steam-powered engine, although never completed, looks very primitive by today's standards. Nonetheless, it laid out the basic elements of a modern computer and introduced an important concept. Consisting of around 50,000 components, the basic design of the Analytical Engine used perforated (punched) cards containing the operating instructions for the machine.


In 1889, Herman Hollerith (1860-1929) also applied the principle of punched cards to perform calculations. His first task was to find a faster way to carry out calculations for the United States Census Bureau. The previous census, conducted in 1880, had taken seven years to tabulate. With a growing population, the Bureau estimated that it would take ten years to complete the next census count.


Hollerith used punched cards to enter the census data, which was then processed mechanically. A single card could store up to 80 variables. Using these machines, the census results were completed within six weeks. Besides the advantage in speed, the cards served as a form of data storage, and the error rate of the calculations was reduced drastically. Hollerith later developed the device further and sold it commercially. He founded the Tabulating Machine Company in 1896, which after a series of mergers became International Business Machines (IBM) in 1924. Other companies such as Remington Rand and Burroughs also manufactured punched-card readers for business. Punched cards were used by business and government for data processing until the 1960s.


In the following period, several engineers made further advances. Vannevar Bush (1890-1974) built a calculator for solving differential equations in 1931. The machine could solve complex differential equations that academics had long considered intractable, but it was very large and heavy because hundreds of gears and shafts were needed to perform the calculations. Later, John V. Atanasoff and Clifford Berry attempted to build a computer that worked electrically by applying Boolean algebra to electrical circuits. This approach was based on the work of George Boole (1815-1864) on binary algebra, which states that any logical statement can be expressed as either true or false. By mapping the true and false conditions onto electrical circuits as connected and disconnected states, Atanasoff and Berry built the first electric computer in 1940. However, the project stalled due to loss of funding.
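
As a small, purely illustrative sketch (not part of the original article), the Python code below shows the idea behind Boole's algebra in circuits: true/false values, treated as on/off states, can be combined with Boolean operations to perform arithmetic, which is the same principle Atanasoff and Berry exploited in hardware. The function name is invented for the example.

# Illustrative sketch: a one-bit "half adder" built purely from Boolean
# operations, showing that arithmetic can be reduced to true/false logic.

def half_adder(a: int, b: int):
    """Add two one-bit values (0 = off, 1 = on) using only Boolean logic."""
    sum_bit = a ^ b   # XOR gives the result bit
    carry = a & b     # AND gives the carry bit
    return sum_bit, carry

if __name__ == "__main__":
    for a in (0, 1):
        for b in (0, 1):
            s, c = half_adder(a, b)
            print(f"{a} + {b} -> sum={s}, carry={c}")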


FIRST GENERATION COMPUTERS

With the onset of the Second World War, the countries involved sought to develop computers in order to exploit their strategic potential. The increased funding for computer development projects that followed hastened technical progress. In 1941, Konrad Zuse, a German engineer, built the Z3 computer to help design airplanes and missiles.


The Allies also made progress in developing computing power. In 1943, the British completed a secret code-breaking computer called Colossus to decode the encrypted messages used by Germany. Colossus did not greatly influence the development of the computer industry, for two reasons. First, Colossus was not a general-purpose computer; it was designed only to decode secret messages. Second, the machine's existence was kept secret until decades after the war ended.


The work done by the Americans at that time produced a broader achievement. Howard H. Aiken (1900-1973), a Harvard engineer working with IBM, succeeded in producing an electronic calculator for the U.S. Navy. The calculator was half the length of a football field and contained some 500 miles of wiring. The Harvard-IBM Automatic Sequence Controlled Calculator, or Mark I, was an electric relay computer. It used electromagnetic signals to move mechanical components. The machine was slow (taking 3-5 seconds per calculation) and inflexible (its sequence of calculations could not be changed), but it could perform basic arithmetic as well as more complex equations.


Another computer development of this period was the Electronic Numerical Integrator and Computer (ENIAC), built through a collaboration between the United States government and the University of Pennsylvania. Consisting of 18,000 vacuum tubes, 70,000 resistors, and 5 million soldered joints, it was an enormous machine that consumed 160 kW of power. Designed by John Presper Eckert (1919-1995) and John W. Mauchly (1907-1980), ENIAC was a general-purpose computer that worked 1,000 times faster than the Mark I. In the mid-1940s, John von Neumann (1903-1957) joined the University of Pennsylvania team and developed a computer design concept that would remain in use in computer engineering for the next 40 years.


In 1945, von Neumann designed the Electronic Discrete Variable Automatic Computer (EDVAC), with a memory that could hold both programs and data. This technique allows a computer to stop at some point and later resume its work. The key element of the von Neumann architecture is the central processing unit (CPU), which allows all computer functions to be coordinated through a single source. In 1951, the UNIVAC I (Universal Automatic Computer I), built by Remington Rand, became the first commercial computer to use the von Neumann architecture. Both the United States Census Bureau and General Electric owned UNIVACs. One of UNIVAC's most impressive achievements was its success in predicting Dwight D. Eisenhower's victory in the 1952 presidential election.


First generation computers were characterized by the fact that operating instructions were written specifically for a particular task. Each computer had its own different binary-coded program, called a machine language. This made the computers difficult to program and limited their speed. Other defining features of first generation computers were the use of vacuum tubes (which made the computers of that era enormous) and magnetic drums for data storage.


SECOND GENERATION COMPUTERS

In 1948, the invention of the transistor greatly influenced the development of computers. The transistor replaced the vacuum tube in televisions, radios, and computers, and as a result the size of electronic machines shrank drastically. Transistors began to be used in computers in 1956. Another development, magnetic-core memory, helped make second generation computers smaller, faster, more reliable, and more energy efficient than their predecessors. The first machines to take advantage of this new technology were supercomputers. IBM built the supercomputer Stretch, and Sperry-Rand built one named LARC. These computers, developed for atomic energy laboratories, could handle large amounts of data, a capability much needed by atomic scientists. The machines were very expensive and tended to be too complex for business computing needs, which limited their appeal; only two LARCs were ever installed and used, one at the Lawrence Radiation Labs in Livermore, California, and the other at the U.S. Navy Research and Development Center in Washington, D.C.

Second generation computers replaced machine language with assembly language, a language that uses short abbreviations in place of binary code.


By the early 1960s, successful second generation computers began to appear in businesses, universities, and government. These second generation machines were fully transistor-based. They also had the components we associate with computers today: printers, disk storage, memory, operating systems, and stored programs. One important example was the IBM 1401, which was widely adopted by industry. By 1965, nearly all large businesses were using second generation computers to process financial information.


The programs stored inside the computer and the programming languages they were written in gave computers their flexibility, and this flexibility delivered useful performance at a reasonable price for business use. With the stored-program concept, a computer could print customer invoices one moment and then run a product design program or calculate payroll the next. A number of programming languages appeared at this time. Common Business-Oriented Language (COBOL) and FORTRAN (Formula Translator) came into common use. These languages replaced cryptic binary machine code with words, sentences, and mathematical formulas that are far easier for humans to understand, making it possible for ordinary people to program and configure computers. New kinds of careers emerged (programmer, analyst, and computer systems expert), and the software industry also began to appear and grow during this second generation of computers.
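
As a rough illustration of the point above, here is a sketch in Python (rather than COBOL or FORTRAN, but the idea is the same): a payroll computation expressed as readable words and arithmetic instead of machine-specific binary instructions. The function name and pay rules are invented for the example.

# Illustrative only: a payroll formula written in a high-level language.
# COBOL and FORTRAN offered this same benefit in the 1960s.

def gross_pay(hours_worked: float, hourly_rate: float) -> float:
    """Pay regular time up to 40 hours, then time-and-a-half for overtime."""
    regular_hours = min(hours_worked, 40.0)
    overtime_hours = max(hours_worked - 40.0, 0.0)
    return regular_hours * hourly_rate + overtime_hours * hourly_rate * 1.5

print(gross_pay(45, 20.0))  # 40*20 + 5*20*1.5 = 950.0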


THIRD GENERATION COMPUTERS

Although transistors were superior to vacuum tubes in many respects, they still generated substantial heat, which could damage a computer's internal parts. Quartz rock eliminated this problem. Jack Kilby, an engineer at Texas Instruments, developed the integrated circuit (IC) in 1958. The IC combined three electronic components on a small silicon disc made from quartz sand. Scientists later managed to fit ever more components onto a single chip, called a semiconductor. As a result, computers became smaller and smaller as more components were squeezed onto each chip. Another third generation development was the use of the operating system, which allowed machines to run many different programs at once under a central program that monitored and coordinated the computer's memory.


FOURTH GENERATION COMPUTERS

After the IC, the goal of development became clear: shrinking the size of circuits and electrical components. Large Scale Integration (LSI) could fit hundreds of components onto a chip. By the 1980s, Very Large Scale Integration (VLSI) placed thousands of components on a single chip, and Ultra Large Scale Integration (ULSI) raised that number into the millions. The ability to fit so many components onto a chip half the size of a coin drove down the price and size of computers, while also increasing their power, efficiency, and reliability. The Intel 4004 chip, made in 1971, took the IC one step further by putting all the components of a computer (central processing unit, memory, and input/output control) on a single tiny chip. Previously, ICs had been made to perform one specific task. Now a microprocessor could be manufactured and then programmed to meet any number of requirements. Soon, everyday household items such as microwave ovens, televisions, and cars with electronic fuel injection were incorporating microprocessors.


These developments made it possible for ordinary people to use computers. Computers were no longer the preserve of large corporations or government agencies. By the mid-1970s, computer assemblers were offering their machines to the general public. These computers, called minicomputers, were sold with software packages that were easy for the layperson to use. The most popular software of the time consisted of word processing and spreadsheet programs. In the early 1980s, video games such as the Atari 2600 drew consumer interest toward more sophisticated, programmable home computers. In 1981, IBM introduced the Personal Computer (PC) for use in homes, offices, and schools. The number of PCs in use jumped from 2 million units in 1981 to 5.5 million units in 1982; ten years later, 65 million PCs were in use. Computers continued their evolution toward smaller sizes, from computers that sit on a desk (desktop computers) to computers that fit in a bag (laptops) and even computers that can be held in one hand (palmtops).


The IBM PC competed with the Apple Macintosh for the computer market. The Apple Macintosh became famous for popularizing the graphical interface at a time when its rival was still text-based. The Macintosh also popularized the use of the mouse.


Today we know the continuation of this line in IBM-compatible CPUs: the IBM PC/486, Pentium, Pentium II, Pentium III, and Pentium IV (a series of CPUs made by Intel), as well as the AMD K6, Athlon, and others. All of these belong to the class of fourth generation computers. As computer use in the workplace spread, new ways of tapping their potential were developed. As smaller computers grew more powerful, they could be linked together in networks to share memory, software, and information, and to communicate with one another. Computer networks allow individual computers to form an electronic collaboration to complete a processing task. Whether through direct cabling (a local area network, or LAN) or over telephone lines, such networks can grow to enormous size.


FIFTH GENERATION COMPUTERS

Defining the fifth generation of computers is rather difficult because this stage is still very young. A famous imaginative example of a fifth generation computer is the fictional HAL 9000 from Arthur C. Clarke's novel 2001: A Space Odyssey. HAL displays all the capabilities one would want from a fifth generation computer: with artificial intelligence, HAL has enough reasoning ability to hold conversations with humans, uses visual input, and learns from his own experience.


Although a real HAL 9000 is still far from reality, many of its functions have already been achieved. Some computers can accept spoken instructions and are able to imitate human reasoning. The ability to translate foreign languages has also become possible. This capability sounds simple; however, it turns out to be far more complicated than expected, as programmers realized once they understood that human comprehension relies heavily on context and meaning rather than on translating words directly.


Many advances in computer design and technology are increasingly making the construction of fifth generation computers possible. Two of the main engineering advances are parallel processing, which would replace the von Neumann model with a system capable of coordinating many CPUs working in unison, and superconductor technology, which allows electricity to flow without resistance and can therefore dramatically accelerate the flow of information.
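
As a small, purely illustrative sketch of what "many CPUs working in unison" looks like in practice today (the function and workload are invented for the example), the Python code below splits one computation across several processor cores using the standard library:

# Illustrative only: coordinating several CPU cores on one task.
# Each worker process sums the squares in one slice of the range.
from concurrent.futures import ProcessPoolExecutor

def partial_sum(bounds):
    start, stop = bounds
    return sum(x * x for x in range(start, stop))

if __name__ == "__main__":
    n, workers = 1_000_000, 4
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], n)  # make sure the last chunk reaches n
    with ProcessPoolExecutor(max_workers=workers) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total)  # same result as sum(x*x for x in range(n)), computed in parallel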


Japan is the country best known for promoting and working on a fifth generation computer project, and it established the ICOT institute (Institute for New Generation Computer Technology) to realize it. Many reports state that the project failed, while other accounts suggest that success in this fifth generation project would bring a new paradigm shift to the computing world. We will have to wait and see which account proves more accurate.
