Integrated circuits developed from transistor technology as scientists sought ways to build more transistors into a circuit. The first integrated circuits were patented in 1959 by two Americans working independently: Jack Kilby, an engineer, and Robert Noyce, a physicist. Integrated circuits caused as great a revolution in electronics in the 1960's as transistors had caused in the 1950's. The circuits were first used in military equipment and spacecraft and helped make possible the first human space flights of the 1960's. They were soon being used in household electronic products, such as sewing machines, microwave ovens, and television sets.
Most integrated circuits are small pieces, or “chips,” of silicon, perhaps
0.08 to 0.15 sq in in area, in which transistors are fabricated. Photolithography
enables the designer to create tens of thousands of transistors on a single chip
by proper placement of the many n-type and p-type regions. These are
interconnected with very small conducting paths during fabrication to produce
complex special-purpose circuits. Such integrated circuits are called monolithic
because they are fabricated on a single crystal of silicon. Chips require much
less space and power and are cheaper to manufacture than an equivalent circuit
built by employing individual transistors. Integrated circuits (ICs) make the
microcomputer possible; without them, individual circuits and their components
would take up far too much space for a compact computer design. The typical IC
consists of elements such as resistors, capacitors, and transistors packed on a
single piece of silicon. In smaller, more densely packed ICs, circuit elements
may be only a few atoms in size, which makes it possible to create sophisticated
computers the size of notebooks. A typical computer circuit board features many
integrated circuits connected together.
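To make this structure easier to picture, the sketch below models a chip as a collection of circuit elements (resistors, capacitors, and transistors) joined at labeled nodes, with a board holding several chips. It is purely illustrative; the class and element names (Chip, Board, "Q1", and so on) are hypothetical and do not correspond to any real design tool or netlist format.

```python
from dataclasses import dataclass, field

# Toy model of a monolithic IC: many elements fabricated on one piece of
# silicon and interconnected by conducting paths (represented here as
# shared node labels). All names are hypothetical.

@dataclass
class Element:
    name: str             # e.g. "R1", "C1", "Q1"
    kind: str             # "resistor", "capacitor", or "transistor"
    terminals: tuple      # node labels this element connects to

@dataclass
class Chip:
    label: str
    elements: list = field(default_factory=list)

    def add(self, element: Element) -> None:
        self.elements.append(element)

    def count(self, kind: str) -> int:
        return sum(1 for e in self.elements if e.kind == kind)

# In this toy model, a circuit board is simply several chips wired together.
@dataclass
class Board:
    chips: list = field(default_factory=list)

if __name__ == "__main__":
    chip = Chip("demo-ic")
    chip.add(Element("Q1", "transistor", ("in", "out", "gnd")))
    chip.add(Element("R1", "resistor", ("vcc", "in")))
    chip.add(Element("C1", "capacitor", ("out", "gnd")))
    board = Board(chips=[chip])
    print(f"{chip.label}: {len(chip.elements)} elements, "
          f"{chip.count('transistor')} transistor(s)")
```

A real chip, of course, contains tens of thousands of such elements or more, but the same idea holds: the components and their interconnections are defined together during fabrication rather than assembled from individual parts.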