Chapter 4: The Semiconductor Revolution
The advent of semiconductors, in particular the transistor and, subsequently, the integrated circuit, represents one of the most significant milestones in the history of information technology. These inventions not only dramatically improved the performance and reliability of computers, but also enabled their spread into new fields and set the stage for the subsequent personal computer revolution.
4.1 The Invention of the Transistor: A Revolutionary Substitute for Vacuum Tubes
As we saw in the previous chapter, the first computers relied on thermionic valves (vacuum tubes), devices that, despite making electronic computation possible, had numerous disadvantages.
The search for more efficient and reliable alternatives led, in December 1947, to the invention of the transistor at Bell Telephone Laboratories (Bell Labs) by John Bardeen, Walter Brattain and William Shockley. This event marked the beginning of a new era in electronics.
A transistor is a semiconductor device capable of amplifying or switching electronic signals. Unlike vacuum tubes, which operate in a vacuum, transistors are solid-state devices, made of semiconductor materials such as silicon or germanium, whose electrical conductivity properties can be controlled by varying the voltage applied to a control electrode.
The first transistors were of the bipolar junction type (BJT); later, field-effect transistors (FETs), which offered further advantages in power consumption and miniaturization, became widespread.
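In digital circuits, what matters most is the transistor's role as a voltage-controlled switch. The short Python sketch below is a deliberately idealized illustration of that idea; the threshold voltage and the one-transistor inverter are illustrative assumptions, not the behavior of any real device:

```python
# Simplified model of a transistor used as a switch: the device conducts
# only when the control (gate/base) voltage exceeds a threshold. Real
# devices behave in an analog way; this sketch only illustrates the
# on/off switching role described above.

THRESHOLD_V = 0.7  # illustrative threshold voltage, not a datasheet value

def conducts(control_voltage: float) -> bool:
    """Return True if the idealized transistor is switched on."""
    return control_voltage > THRESHOLD_V

def inverter(input_v: float, supply_v: float = 5.0) -> float:
    """One-transistor inverter: when the transistor conducts, the output
    is pulled low; otherwise it sits at the supply voltage."""
    return 0.0 if conducts(input_v) else supply_v

if __name__ == "__main__":
    for v_in in (0.0, 0.2, 1.0, 5.0):
        print(f"in={v_in:4.1f} V -> out={inverter(v_in):4.1f} V")
```

Chaining such switches is, conceptually, how logic gates and ultimately entire processors are built.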
The introduction of the transistor brought with it a series of crucial advantages over vacuum tubes:
- Small size: Transistors were significantly smaller and lighter than tubes, allowing the construction of much more compact computers.
- Lower power consumption: Transistors required much less power to operate, reducing operating costs and heat dissipation issues.
- Less heat generation: Lower heat output increased circuit reliability and simplified cooling systems.
- Greater reliability and durability: Transistors were much more robust and had a significantly longer lifespan than fragile vacuum tubes.
- Lower cost: Transistor production was potentially less expensive than valve production, especially on a large scale.
- Faster switching speed: Transistors could switch states (from on to off and vice versa) much faster than valves, allowing for faster computers.
The invention of the transistor earned Shockley, Bardeen and Brattain the Nobel Prize for Physics in 1956, testifying to its enormous scientific and technological importance.
4.2 The First Transistor Computers: A New Generation of Machines
The adoption of transistors in computer hardware was not immediate, but over the course of the 1950s and early 1960s transistor-based computers gradually replaced those based on vacuum tubes. This transition gave rise to the so-called second generation of computers.
Transistor computers offered better performance, greater reliability, and a smaller footprint than their tube-based predecessors. Some notable examples of transistor computers include:
- IBM 7090: Introduced in 1959, it was one of IBM's first transistorized mainframes. It was much faster and more reliable than its vacuum-tube predecessor, the IBM 709.
- DEC PDP-1: Launched in 1960 by Digital Equipment Corporation (DEC), the PDP-1 was one of the first successful minicomputers. Its small size and relatively low cost compared to mainframes made it accessible to a wider audience, including research laboratories and universities.
- Atlas: Developed jointly by the University of Manchester and Ferranti and put into service in 1962, it was one of the first computers to use virtual memory, a technique that allows programs larger than the available physical memory to run (a simplified sketch of the idea follows this list).
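The idea behind virtual memory can be illustrated with a small conceptual sketch in Python. It maps virtual page numbers to a handful of physical frames and "pages in" on a miss, with a simple FIFO eviction policy; it does not model the Atlas one-level store, and all sizes and names are hypothetical.

```python
# Toy illustration of virtual memory: virtual pages are mapped to a small
# set of physical frames, and pages not currently resident are loaded from
# a backing store on demand. Conceptual sketch only, not the Atlas design.

PAGE_SIZE = 512          # words per page (illustrative)
NUM_FRAMES = 4           # physical memory holds only 4 pages
page_table = {}          # virtual page number -> physical frame number
resident_pages = []      # resident pages, oldest first (FIFO eviction)

def translate(virtual_address: int) -> tuple[int, int]:
    """Translate a virtual address to (frame, offset), paging in on a miss."""
    vpn, offset = divmod(virtual_address, PAGE_SIZE)
    if vpn not in page_table:                    # page fault
        if len(resident_pages) == NUM_FRAMES:    # evict the oldest page
            evicted = resident_pages.pop(0)
            free_frame = page_table.pop(evicted)
        else:
            free_frame = len(resident_pages)
        # (a real system would copy the page in from drum or disk here)
        page_table[vpn] = free_frame
        resident_pages.append(vpn)
    return page_table[vpn], offset

if __name__ == "__main__":
    # The program addresses far more pages than the 4 physical frames.
    for addr in (0, 1000, 3000, 5000, 9000, 100):
        frame, off = translate(addr)
        print(f"virtual {addr:5d} -> frame {frame}, offset {off}")
```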
The transition to transistors was not merely a technological change: it opened up new possibilities for computer architecture and for the applications that could be run. The greater reliability of transistors, for example, allowed more complex systems to be built and longer computations to be performed without frequent interruptions.
4.3 The Development of Integrated Circuits: Miniaturization Pushed to the Extreme
If the transistor represented a revolution in the basic electronic component, the development of the integrated circuit (IC), also known as the microchip, was a further breakthrough that brought the miniaturization and performance of computers to unprecedented levels. The fundamental idea behind the integrated circuit was to fabricate multiple interconnected electronic components (transistors, resistors, capacitors) on a single substrate of semiconductor material, typically silicon.
The invention of the integrated circuit is independently attributed to two engineers:
- Jack Kilby: At Texas Instruments, in 1958, Kilby created the first working integrated circuit, demonstrating the possibility of combining multiple components on a single germanium chip.
- Robert Noyce: At Fairchild Semiconductor, in 1959, Noyce developed an integrated circuit based on silicon, which offered greater reliability and ease of production than germanium. Noyce's design also included a method for interconnecting components on the chip based on the planar process, which proved critical for mass production.
The first integrated circuits contained a limited number of components (typically from a few to a few dozen) and fell into the category of Small-Scale Integration (SSI). However, even these early chips demonstrated the technology's notable benefits:
- Further reduction in size and weight: The integration of multiple components on a single chip allowed even more compact electronic circuits to be built.
- Lower cost: Mass production of integrated circuits promised to significantly reduce the cost per component.
- Greater speed: Electrical signals had to travel much shorter distances within a chip, reducing delays and increasing processing speed.
- Increased reliability: Reducing the number of external connections between components decreased the likelihood of failure.
- Lower power consumption: Integration on a single chip allowed further reduction of power consumption.
The introduction of integrated circuits marked the beginning of the third generation of computers, characterized by a significant reduction in size, an increase in performance and greater reliability.
4.4 The Impact on Computers: Towards More Powerful and Accessible Machines
The advent of transistors and integrated circuits had a transformative impact on the computer industry:
- Downsizing: Computers became progressively smaller, moving from the huge rooms occupied by mainframes to systems that could fit in cabinets or even on a desk.
- Increased speed: The faster switching speed of transistors and shorter distances within integrated circuits allowed computers with ever greater computing capabilities.
- Greater reliability: The elimination of fragile vacuum tubes and the reduction in the number of external connections led to systems that were much more reliable and less prone to failure.
- Cost reduction: The mass production of transistors and integrated circuits allowed the production costs of computers to be reduced, potentially making them accessible to a wider audience.
A direct result of the semiconductor revolution was the emergence of the minicomputer. These computers, such as the aforementioned DEC PDP-1 and subsequent PDP-series models, were smaller, less expensive, and easier to use than large mainframes. This made them ideal for research laboratories, university departments and small and medium-sized businesses, democratizing access to computing power.
4.5 Moore's Law: A Prophetic Observation
In 1965, Gordon Moore, a co-founder of Fairchild Semiconductor and later of Intel, published a paper in which he observed that the number of transistors that could be integrated on a chip was doubling approximately every year (a figure later revised to approximately every 18 to 24 months). This observation, which became known as Moore's Law, was not a physical law, but rather a prediction based on the technological trends of the time.
Moore's Law proved remarkably accurate for several decades, driving the evolution of the semiconductor industry and producing exponential growth in computing power alongside continued reductions in cost. The increase in transistor density on chips has made it possible to integrate ever more functionality into microprocessors and memories, fueling innovation across all fields of computing.
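To get a feel for the scale of that exponential growth, here is a back-of-the-envelope calculation in Python. The starting point (roughly 2,300 transistors, the figure commonly cited for the Intel 4004 of 1971) and the 24-month doubling period are illustrative assumptions, not measured industry data:

```python
# Back-of-the-envelope illustration of Moore's Law: start from roughly
# 2,300 transistors (commonly cited for the Intel 4004, 1971) and double
# every 24 months. Both figures are illustrative assumptions.

START_YEAR = 1971
START_TRANSISTORS = 2_300
DOUBLING_PERIOD_YEARS = 2

def projected_transistors(year: int) -> float:
    """Transistor count projected by simple doubling every 2 years."""
    doublings = (year - START_YEAR) / DOUBLING_PERIOD_YEARS
    return START_TRANSISTORS * 2 ** doublings

if __name__ == "__main__":
    for year in (1971, 1981, 1991, 2001, 2011, 2021):
        print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
```

Under these assumptions, twenty-five doublings from 1971 give counts in the tens of billions by the early 2020s, broadly the order of magnitude of the largest chips produced today.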
The semiconductor revolution, with the invention of the transistor and the development of integrated circuits, represents a turning point in the history of computing. These technologies made it possible to overcome the limitations of systems based on vacuum tubes, leading to the creation of smaller, faster, more reliable and less expensive computers. The emergence of minicomputers was a direct consequence of these advances, making computing power accessible to a wider audience. Moore's Law, with its prediction of exponential growth in transistor density, drove the evolution of hardware for many years to come. These developments laid the foundation for the subsequent explosion of personal computing and the digital technologies that permeate our daily lives today.