The development of the microprocessor was a milestone in the history of computing. It made possible the miniaturization of computers and their expansion onto the mass market. Thanks to microprocessors, the computer became a tool available to everyone.
The microprocessor is an integrated circuit, a miniaturized electronic device. Its components are manufactured, in a single technological cycle, on the surface of or inside a silicon wafer no larger than 1 cm across and 0.1 mm thick. That little black plate is filled with very small structures, and thanks to them the computer can calculate, can “think”.
The history of computers began much earlier than the production of microprocessors. The first calculating machine to be called a “computer” was built in 1946 by American scientists. Its name was ENIAC, and it contained about 18,000 vacuum tubes and 1,500 relays. It filled a room 9 by 15 meters and weighed 30 tons. Because of its price and size, it could not become a popular device on the market.
Soon afterwards, American scientists developed a new semiconductor electronic component: the transistor. John Bardeen and Walter Houser Brattain developed the point-contact transistor in 1947, and William Shockley the junction transistor in 1948. This started an avalanche of development that led straight to the construction of the microprocessor.
Transistors have two important features:
► They can switch current on and off in an electronic device (the basic operation of every digital computer),
► They can amplify an analog signal.
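The switching role listed above is what makes digital logic possible: treated as voltage-controlled switches, transistors can be wired into logic gates. The following sketch models an idealized n-type transistor as a Boolean switch and composes a NOT and a NAND gate from it (an illustrative simplification; real gates use complementary transistor pairs).

```python
def nmos_switch(gate: bool) -> bool:
    """An idealized n-type transistor: conducts only when the gate is high."""
    return gate

def not_gate(a: bool) -> bool:
    # If the transistor conducts, the output is pulled low; otherwise it stays high.
    return not nmos_switch(a)

def nand_gate(a: bool, b: bool) -> bool:
    # Two switches in series pull the output low only when both conduct.
    return not (nmos_switch(a) and nmos_switch(b))

# NAND is functionally complete: any digital circuit can be built from it alone.
for a in (False, True):
    for b in (False, True):
        print(a, b, nand_gate(a, b))
```

Since every digital circuit can be expressed in NAND gates, switching alone is enough to build an entire processor.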
The first transistors were built in bipolar technology. This means that two kinds of carriers take part in transporting the electrical charge: electrons and holes (empty spaces left behind by electrons). Holes carry a positive charge, electrons a negative one.
Later research led to the development of the unipolar transistor, which has carriers of only one type: electrons or holes. It was a major step toward the miniaturization of the transistor.
At the turn of the 1960s and 1970s, Intel made an important technological jump. While working to make semiconductor memory practical, it produced an integrated circuit containing over two thousand transistors: the first microprocessor. Its history began in 1969, when the Japanese firm Busicom ordered a set of calculator chips from Intel. At that time, logic chips were designed separately for every device. Intel's engineers, under the direction of Ted Hoff, changed the plan and developed a single-chip, general-purpose logic device that could be used for more than just calculators. The first processor, the 4004, was the size of a fingernail. It was made in PMOS technology and contained 2,300 transistors and a few other micro devices that together had computational power equal to ENIAC's. The new product started a race among companies to build faster processors of greater computational power. The first microprocessor was a 4-bit chip: information could be processed only in 4-bit packets. Later processors worked with 8, 16, 32 and more bits.
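A 4-bit "packet" can hold only the values 0 through 15, so wider numbers had to be handled in several steps. The sketch below (an illustration, not code for any real chip) shows how a 4-bit addition wraps around, and how two 4-bit additions joined by a carry bit can emulate one 8-bit addition:

```python
MASK4 = 0xF  # four bits: values 0..15

def add4(a: int, b: int, carry_in: int = 0):
    """Add two 4-bit values; return the 4-bit result and the carry-out bit."""
    total = (a & MASK4) + (b & MASK4) + carry_in
    return total & MASK4, total >> 4

# 9 + 8 = 17 does not fit in four bits: the result wraps to 1 with carry 1.
print(add4(9, 8))  # (1, 1)

def add8_via_4bit(x: int, y: int) -> int:
    """Emulate an 8-bit addition with two 4-bit additions (low half first)."""
    lo, carry = add4(x & MASK4, y & MASK4)
    hi, _ = add4((x >> 4) & MASK4, (y >> 4) & MASK4, carry)
    return (hi << 4) | lo

print(add8_via_4bit(100, 55))  # 155
```

This is why word width mattered so much in the race that followed: an 8- or 16-bit processor does in one operation what a 4-bit chip needs several operations to accomplish.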
After the 4004, Intel introduced the 8080, the first microprocessor widely used in electronic devices. In 1978 Intel introduced the 16-bit 8086, followed in 1979 by the 8088, a version with an 8-bit external bus. The latter was chosen by IBM, the giant of the computer market, for its first personal computer in 1981. In 1982 Intel introduced the 286 chip, which contained 134,000 transistors. It was about three times faster than other 16-bit processors of the time, had an internal memory-management unit, and was the first Intel processor able to run software written for its predecessors. A few years later came the Pentium Pro and the Pentium II. Earlier, other firms had also produced Intel-designed processors: Advanced Micro Devices (AMD), Cyrix and Siemens. Later, when Intel decided to produce chips only in its own plants, its competitors developed processor models of their own.
The increase in microprocessor speed came together with a decrease in price. That is why more than 200 million personal computers are in use and their number is still growing.
The main goals of manufacturers haven't changed since the time the first computer was built:
► maximizing operating speed,
► minimizing the power needed for operation,
► increasing the level of integration.
Devices that in the 1960s and 1970s required hundreds or thousands of parts can now be implemented on a single microprocessor chip. The chip's great advantage is its adaptability to various software and computers.
All this thanks to a silicon semiconductor plate no bigger than a fingernail. Every microprocessor contains two main blocks with well-defined tasks.
► The first is the control unit. It directs the processor's internal work by interpreting instructions loaded from memory. It responds to external signals and generates signals that organize cooperation with the other parts of the system: memory and the input and output peripheral devices.
► The second is the arithmetic-logic unit (ALU). It performs arithmetical and logical operations on operands loaded from memory or registers and stores the results back in registers or memory. The operations are chosen by the control unit.
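The cooperation of these two blocks can be sketched as a fetch-decode-execute loop: the control unit fetches each instruction, decodes it, and dispatches the chosen operation to the ALU. The instruction format and register names below are invented for illustration and do not correspond to any real microprocessor.

```python
def alu(op: str, a: int, b: int) -> int:
    """Arithmetic-logic unit: performs the operation chosen by the control unit."""
    ops = {"ADD": a + b, "SUB": a - b, "AND": a & b, "OR": a | b}
    return ops[op]

def run(program, registers):
    """Control unit: fetch each instruction, decode it, dispatch to the ALU."""
    pc = 0  # program counter
    while pc < len(program):
        op, dst, src1, src2 = program[pc]                           # fetch + decode
        registers[dst] = alu(op, registers[src1], registers[src2])  # execute
        pc += 1                                                     # next instruction
    return registers

regs = {"r0": 7, "r1": 5, "r2": 0}
program = [("ADD", "r2", "r0", "r1"),   # r2 = r0 + r1 = 12
           ("SUB", "r2", "r2", "r1")]   # r2 = r2 - r1 = 7
print(run(program, regs)["r2"])  # 7
```

Everything a real processor does, from spreadsheets to games, is ultimately this loop repeated billions of times per second.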
Great precision and cleanliness are required in the production of microprocessors. Manufacturing is carried out in rooms where every 30 cm³ of air contains fewer than 1,000 particles of dust. Humidity is strictly controlled, the temperature is held constant at 20 °C, and workers wear special protective shoes, aprons, gloves, caps and even masks. These precautions are necessary because the smallest particle of dust or hair can ruin a whole batch of integrated circuits. Chip-manufacturing technology has changed greatly: previously much of the work was done by hand, and many chips were destroyed.