Chapter 5

The Evolution of Architecture (1960-1970)

The advent of transistors and integrated circuits not only allowed faster and more reliable computers to be built, but also profoundly influenced their internal architecture. The decade 1960-1970 saw the emergence of new architectural concepts that moved away from the monolithic models of the first mainframes, paving the way for systems that were more modular, flexible and suitable for a wider range of applications.

5.1 The Emergence of Minicomputers and New Architectures

As mentioned in the previous chapter, one of the most significant developments of this period was the rise of minicomputers. These systems, such as the DEC PDP-8 (introduced in 1965) and the HP 2116 (introduced in 1966), differed from large mainframes in several architectural features:

  • Word length: Minicomputers often used shorter word lengths than mainframes (e.g., 12 or 16 bits versus 36 or 60 bits). This trade-off reduced hardware complexity and cost while still providing significant computing power for many applications.
  • Memory size: The amount of main memory addressable by minicomputers was generally smaller than that of mainframes, but still sufficient for many medium-sized scientific, process-control, and data-processing applications.
  • Instruction sets: Minicomputer instruction sets were often simpler and more compact than those of mainframes, focusing on the most frequently used operations.
  • More modular architecture: Minicomputers were often designed with a more modular architecture, which allowed the system to be configured with different amounts of memory, peripherals, and interfaces based on the user's specific needs.
  • Greater emphasis on interactivity: Unlike mainframes, which often operated in batch mode, minicomputers began to support more direct modes of interaction with the user, paving the way for time-sharing systems and interactive applications.
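The word-length trade-off above can be made concrete with a short sketch. Assuming, for illustration, that a memory address must fit in a single machine word (real machines often stretched this limit with banking or index registers), the directly addressable memory follows immediately from the word size:

```python
# Directly addressable memory when an address must fit in one machine word.
# A simplified illustration of the word-length trade-off; real machines
# often extended the address space with bank switching or index registers.
def addressable_words(word_bits: int) -> int:
    """Number of distinct word addresses encodable in word_bits bits."""
    return 2 ** word_bits

for bits in (12, 16, 32):
    print(f"{bits}-bit addresses -> {addressable_words(bits)} words")
```

A 12-bit machine like the PDP-8 could thus directly address 4,096 words, which helps explain both its low cost and the later pressure toward 16-bit designs.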

The architecture of the DEC PDP-8, for example, was considerably simpler than that of the large IBM mainframes of the time. Later models (beginning with the PDP-8/E in 1970) used a single bus, the Omnibus, to interconnect the CPU, memory, and peripherals, simplifying system design and maintenance. This streamlined architecture contributed to the success of the PDP-8 and its spread through numerous laboratories and universities.

5.2 More Advanced Memory Concepts

The evolution of computer architecture in this period was closely linked to advances in memory technologies. Ferrite core memory became the dominant main memory (RAM) technology thanks to its reliability, its relatively fast access speed (compared to magnetic drums), and its non-volatility (it retained data even without electrical power).

The organization of ferrite core memory was typically three-dimensional: the cores were arranged in two-dimensional planes, one plane per bit of the word, each plane crossed by selection and read/write wires.

Addressing the memory was done by selecting the appropriate lines of wire to magnetize or read the state of a particular core. The capacity of main memory in minicomputers grew significantly during the 1960s, from a few kilobytes to tens of kilobytes.
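The selection scheme described above can be sketched in a few lines. This is a simplified model of coincident-current selection (the wire layout and current values are illustrative, not taken from any specific machine): each core sits at the crossing of one X wire and one Y wire, and only the core that receives half the switching current on both wires gets enough total current to switch.

```python
# Coincident-current selection in one core plane (simplified model).
# Each core sits at the crossing of an X wire and a Y wire; only the core
# driven with half-current on BOTH wires receives full switching current.
def select_core(rows: int, cols: int, x: int, y: int):
    """Return the set of (row, col) cores that switch when half-current
    is driven on X line x and Y line y."""
    switched = set()
    for r in range(rows):
        for c in range(cols):
            current = (0.5 if r == x else 0.0) + (0.5 if c == y else 0.0)
            if current >= 1.0:          # full switching current reached
                switched.add((r, c))
    return switched

print(select_core(4, 4, 2, 1))  # only the core at (2, 1) switches
```

The elegance of the scheme is that an n-by-n plane needs only 2n drive wires rather than n² individual selects, which is what made large core arrays economical.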

Furthermore, the first concepts of memory hierarchy began to appear in this period. The idea was to use different types of memory with different speed and cost characteristics to optimize system performance. For example, some systems might include a small amount of very fast (though expensive) memory, used as a sort of "cache" for the most frequently used data and instructions, in combination with a larger amount of slower, cheaper main memory. This early form of memory hierarchy anticipated the sophisticated caching systems that would become a fundamental feature of modern computer architectures.
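The hierarchy idea can be illustrated with a toy model. The sketch below (a deliberately minimal direct-mapped cache with invented sizes, not a model of any historical machine) shows how a small fast store in front of a larger slow memory turns repeated accesses into fast hits:

```python
# A toy direct-mapped cache in front of a slower main memory, illustrating
# the early memory-hierarchy idea. Sizes and structure are invented for
# the example.
class ToyCache:
    def __init__(self, memory, lines=4):
        self.memory = memory          # backing store: address -> value
        self.lines = lines
        self.store = {}               # line index -> (address, value)
        self.hits = self.misses = 0

    def read(self, addr):
        line = addr % self.lines      # direct mapping: address picks a line
        entry = self.store.get(line)
        if entry and entry[0] == addr:
            self.hits += 1            # hit: served from the fast store
            return entry[1]
        self.misses += 1              # miss: fetch from slow main memory
        value = self.memory[addr]
        self.store[line] = (addr, value)
        return value

mem = {a: a * 10 for a in range(16)}
cache = ToyCache(mem)
for addr in (0, 0, 1, 0, 5, 5):
    cache.read(addr)
print(cache.hits, cache.misses)  # repeated addresses hit the cache
```

Even this toy shows the principle that made later cache designs worthwhile: as long as programs reuse recently accessed data (locality of reference), most reads are served at the speed of the small fast memory.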

5.3 Introducing the Interrupt

A key architectural innovation introduced in this period was the interrupt mechanism. In early computers, the CPU had to constantly poll input/output devices to see if they needed attention (for example, if an input device had new data ready or if an output device had completed an operation). This approach was inefficient, as the CPU wasted precious clock cycles waiting for events.

The introduction of interrupts allowed external devices to signal the need for attention directly to the CPU.

When a device generated an interrupt, the CPU temporarily suspended execution of the current program, saved its state (for example, the contents of the registers and the address of the next instruction to be executed), and went on to execute a special interrupt handler routine associated with that device. Once the interrupt handling was completed, the CPU restored the state of the interrupted program and resumed its execution from the point at which it was interrupted.
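The suspend-save-handle-restore-resume cycle just described can be simulated in a few lines. The sketch below is a toy model, not real interrupt hardware: the "program" is a list of instruction names, pending interrupts are keyed by the program counter at which they arrive, and the device name is invented.

```python
# Minimal simulation of the interrupt cycle: suspend the running program,
# save its state, run the handler, restore, and resume. The program and
# device names are illustrative.
def run_with_interrupts(program, interrupts):
    """program: list of instruction names; interrupts: {pc: device} pending."""
    pending = dict(interrupts)
    log = []
    pc = 0
    while pc < len(program):
        if pc in pending:
            saved_pc = pc                            # save processor state
            log.append(f"handle:{pending.pop(pc)}")  # run the handler routine
            pc = saved_pc                            # restore state and resume
            continue
        log.append(f"exec:{program[pc]}")
        pc += 1
    return log

trace = run_with_interrupts(["LOAD", "ADD", "STORE"], {1: "disk_done"})
print(trace)
```

The essential point the trace makes visible is that the interrupted program is unaware of the interruption: it resumes at exactly the saved program counter, with no instructions skipped or repeated.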

The introduction of interrupts significantly improved the efficiency of computer systems, allowing the CPU to spend more time executing programs and to respond more quickly to external events. This mechanism was crucial for the development of more sophisticated operating systems and for supporting more dynamic interactions with users and the outside world.

5.4 The Introduction of DMA (Direct Memory Access)

Another important architectural evolution was the introduction of DMA (Direct Memory Access). Initially, all data transfers between input/output devices and main memory had to go through the CPU. This approach could overload the CPU, especially for large data transfers, such as reading or writing to disk or magnetic tape.

DMA introduced a mechanism that allowed certain peripheral devices to directly access main memory to transfer data, without the continuous intervention of the CPU. A DMA controller was in charge of managing the data transfer, freeing the CPU to perform other operations.

Once the transfer was complete, the DMA controller could generate an interrupt to notify the CPU that the data was ready in memory or that the write was complete.
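The transfer-then-interrupt sequence can be sketched as follows. This is a conceptual model only (the function and callback names are invented): the controller copies a block between a device buffer and main memory without per-word CPU involvement, then raises a completion interrupt.

```python
# Sketch of a DMA block transfer: the controller copies a device buffer
# into main memory without per-word CPU involvement, then signals
# completion with an interrupt. All names here are illustrative.
def dma_transfer(memory, device_buffer, dest, on_complete):
    """Copy device_buffer into memory starting at dest, then notify the CPU."""
    for i, word in enumerate(device_buffer):   # controller-driven copy loop
        memory[dest + i] = word
    on_complete(len(device_buffer))            # completion interrupt

memory = [0] * 8
events = []
dma_transfer(memory, [7, 8, 9], dest=2,
             on_complete=lambda n: events.append(f"irq:{n} words"))
print(memory, events)
```

In a real system the copy loop runs in the DMA controller's hardware while the CPU executes unrelated instructions; the CPU's only involvement is setting up the transfer and servicing the final interrupt.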

The introduction of DMA greatly improved system performance, especially for high-speed input/output operations, allowing large amounts of data to be transferred more efficiently and freeing the CPU for other processing tasks.

5.5 16-bit Architecture and Beyond

As previously mentioned, minicomputers often adopted architectures with shorter word lengths than mainframes. During the 1960s, however, there was a trend toward longer words. The introduction of 16-bit architectures in minicomputers, such as the DEC PDP-11 (introduced in 1970), represented a significant step forward. A longer word allowed more memory to be addressed and larger data items to be manipulated in a single operation, improving overall system performance and paving the way for more complex software.

At the same time, the world of mainframes also continued to evolve towards architectures with longer word lengths, such as the IBM System/360 systems which used 32-bit words (and also 64-bit formats for floating point numbers), offering greater computing power and addressing capacity.

5.6 The Influence of Programming Languages on Architecture

It is important to note that the evolution of computer architecture did not occur in isolation, but was often influenced by the needs of the high-level programming languages that were emerging. Languages such as FORTRAN and COBOL, with their specific demands on manipulating numbers, arrays, strings, and data structures, pushed architectural designers to include instruction sets and memory management mechanisms that could efficiently support such operations. This interaction between the needs of the software and the capabilities of the hardware was a fundamental driver of the evolution of computing.

The decade 1960-1970 was a crucial period in the evolution of computer architecture. The emergence of minicomputers with their more modular and interactive architectures, the introduction of more advanced memory concepts such as memory hierarchy, and the implementation of fundamental mechanisms such as interrupts and DMA represented significant steps towards more powerful, flexible, and efficient computing systems. The trend toward longer word lengths and the growing influence of programming languages on architecture set the stage for the innovations that would characterize the following decades, culminating in the microprocessor and personal computer revolutions.