Chapter 3
Early Software (1950–1960)
If the hardware constituted the physical structure of the first computers, the software represented the intelligence that animated them, allowing electronic circuits to be transformed into calculation and information processing tools. In the early years of computing, software development was a pioneering activity, often closely tied to detailed knowledge of hardware architecture. In this chapter, we will explore early forms of software, from machine and assembly languages to early high-level programming languages and rudimentary operating system concepts.
3.1 Machine Language: Talking Directly to the Hardware
The most basic form of software is machine language: the native language understood directly by a computer's central processing unit (CPU). Each type of CPU has its own specific machine language, consisting of binary instructions (sequences of 0s and 1s). Each machine language instruction corresponds to an elementary operation that the CPU can perform, such as:
- Arithmetic operations: Add, subtract, multiply, divide numbers.
- Logical operations: Perform Boolean logical operations (AND, OR, NOT) on bits.
- Data Transfer: Move data between memory and CPU registers.
- Flow control: Change the order in which instructions are executed (conditional and unconditional jumps).
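To make these instruction categories concrete, here is a minimal sketch of a toy CPU in Python (a modern stand-in used only for accessibility). The 8-bit instruction format, the opcode values, and all names are invented for illustration; real machine languages differ for each architecture.

```python
# Toy CPU: each instruction is 8 bits, high nibble = opcode, low nibble = operand.
# All encodings below are invented for illustration.
OP_HALT, OP_LOAD, OP_ADD, OP_JNZ = 0b0000, 0b0001, 0b0010, 0b0011

def run(program, memory):
    """Execute binary instructions until HALT; return the accumulator."""
    acc, pc = 0, 0                       # accumulator register, program counter
    while True:
        instr = program[pc]
        opcode, operand = instr >> 4, instr & 0x0F
        pc += 1
        if opcode == OP_LOAD:            # data transfer: memory -> register
            acc = memory[operand]
        elif opcode == OP_ADD:           # arithmetic: add a memory cell to the register
            acc += memory[operand]
        elif opcode == OP_JNZ:           # flow control: jump if accumulator is non-zero
            if acc != 0:
                pc = operand
        elif opcode == OP_HALT:          # stop execution
            return acc

# The program is raw binary words: LOAD address 0; ADD address 1; HALT.
program = [0b0001_0000, 0b0010_0001, 0b0000_0000]
print(run(program, memory=[2, 3]))       # 2 + 3 -> 5
```

Even in this tiny sketch, the difficulty the text describes is visible: the program itself is just a list of bit patterns, with nothing human-readable about it.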
Programming directly in machine language was an extremely arduous and complex task for several reasons:
- Reading and writing difficulties: Binary sequences are difficult for humans to interpret and remember. A simple program might require long and intricate sequences of 0s and 1s.
- Dependence on hardware architecture: Machine language is specific to a particular CPU architecture. A program written for one type of computer could not run on another with a different architecture.
- High risk of errors: Direct manipulation of binary sequences made it extremely easy to make typos or logic errors.
- Debugging Difficulty: Finding and correcting errors in a program written in machine language was a long and frustrating process.
Despite these difficulties, early programmers necessarily had to work at this level to make the first computers work. Programming in machine language required a deep understanding of computer hardware architecture and great attention to detail.
3.2 Assembly Language: A Step Towards Abstraction
To simplify the task of programming, assembly language was developed.
This language is a symbolic representation of machine language. Instead of using binary sequences, assembly language uses mnemonics (short abbreviations) to represent machine language instructions and symbolic names to represent memory addresses.
For example, a machine language instruction to add two numbers might be represented in assembly language with a mnemonic such as ADD followed by the names of the registers or memory addresses containing the numbers to be added.
To run a program written in assembly language on a computer, a special program called an assembler was needed. The assembler translates each assembly language instruction into the corresponding machine language instruction.
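The translation an assembler performs can be sketched in a few lines of Python. The mnemonics, opcode values, and label syntax below are invented for illustration; a real assembler targets one specific CPU's instruction set and handles far more (directives, expressions, relocation).

```python
# Minimal assembler sketch: mnemonics and opcodes are invented for illustration.
OPCODES = {"HALT": 0x0, "LOAD": 0x1, "ADD": 0x2, "JNZ": 0x3}

def assemble(source):
    """Translate 'MNEMONIC operand' lines into 8-bit machine words."""
    # Pass 1: record symbolic names (labels such as 'loop:') and their addresses.
    symbols, lines = {}, []
    for line in source.strip().splitlines():
        line = line.strip()
        if line.endswith(":"):
            symbols[line[:-1]] = len(lines)
        elif line:
            lines.append(line)
    # Pass 2: translate each instruction, resolving symbols to numeric addresses.
    words = []
    for line in lines:
        parts = line.split()
        mnemonic = parts[0]
        operand = parts[1] if len(parts) > 1 else "0"
        value = symbols[operand] if operand in symbols else int(operand)
        words.append((OPCODES[mnemonic] << 4) | value)
    return words

print(assemble("LOAD 0\nADD 1\nHALT"))   # [0x10, 0x21, 0x00] -> [16, 33, 0]
```

The two-pass structure is the classic design: symbolic names must all be collected before any instruction that refers to them can be translated.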
The use of assembly language offered several advantages over direct machine language programming:
- Greater readability and writability: Mnemonics were easier to remember and interpret than binary sequences.
- Use of symbolic names: Allowed programmers to refer to memory addresses and other entities using symbolic names instead of numeric addresses, simplifying memory management and reducing the risk of errors.
- Increased productivity: Assembly programming was faster and less error-prone than machine language programming.
However, assembly language still remained a low-level language, closely tied to the computer's hardware architecture.
Programmers still had to have a good understanding of the inner workings of the CPU and memory. Furthermore, a program written in assembly for one particular architecture could not easily be ported to another.
3.3 The First High-Level Programming Languages: Towards Hardware Independence
The growing complexity of the problems people wanted to solve with computers, together with the difficulty of programming in low-level languages, led to the development of high-level programming languages. These languages were designed to be closer to human language and more independent of the computer's specific hardware architecture. A single statement in a high-level language could correspond to several machine language instructions.
In order to run a program written in a high-level language, a special program called a compiler or interpreter was needed. A compiler translates the entire high-level language program into machine language (or an intermediate language) before execution. An interpreter, on the other hand, translates and executes the program line by line.
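The difference between the two translation strategies can be sketched on a tiny invented language where each line is `ADD <n>`. All names here are illustrative, and Python itself stands in for "machine language" as the compiler's target.

```python
# Contrast of the two strategies on a tiny invented one-instruction language.
def interpret(source):
    """Interpreter: translate and execute the program line by line."""
    total = 0
    for line in source.splitlines():
        op, arg = line.split()
        if op == "ADD":
            total += int(arg)             # each line is executed as soon as it is read
    return total

def compile_program(source):
    """Compiler: translate the whole program first; execution happens afterwards."""
    body = ["total = 0"]
    for line in source.splitlines():
        op, arg = line.split()
        if op == "ADD":
            body.append(f"total += {int(arg)}")
    return "\n".join(body)                # translated program, not yet executed

program = "ADD 2\nADD 3"
assert interpret(program) == 5            # translated and run line by line

code = compile_program(program)           # translated once, up front...
namespace = {}
exec(code, namespace)                     # ...then executed as a whole
assert namespace["total"] == 5
```

Both routes produce the same result; the difference is *when* translation happens, which is exactly the compiler/interpreter distinction described above.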
In the 1950s, some of the first and most influential high-level programming languages emerged:
- FORTRAN (FORmula TRANslation): Developed by a team led by John Backus at IBM starting in 1954, FORTRAN was designed specifically for scientific and engineering applications that required intensive numerical computation. It offered a syntax closer to mathematical notation and provided specific constructs for manipulating arrays and matrices. Its success in the scientific and engineering community was immediate, and FORTRAN continues to be used today in some areas of research and development. Its introduction was a fundamental step towards the simplification of scientific programming.
- COBOL (COmmon Business-Oriented Language): Developed starting in 1959 by the CODASYL committee, building on Grace Hopper's earlier FLOW-MATIC language, COBOL was designed for business and data processing applications. The goal was to create a language that was easy to understand and use by non-specialists, such as managers and business analysts. COBOL emphasized readability and provided specific constructs for managing large amounts of data, creating reports, and interacting with files. It quickly became the dominant language for business applications and maintained considerable prominence for decades; many critical enterprise legacy systems are still written in COBOL today.
- LISP (LISt Processing): Developed by John McCarthy at MIT in 1958, LISP was designed for artificial intelligence research and for symbol and list manipulation. Its syntax, based on the extensive use of parentheses, was radically different from that of FORTRAN and COBOL. LISP introduced innovative concepts such as recursion and treating functions as data. Although it has not achieved the popularity of FORTRAN and COBOL in other fields, LISP has had a significant influence on artificial intelligence research and the development of other programming languages.
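The two LISP ideas named above, recursion and functions treated as data, can be illustrated briefly. The sketch below uses Python rather than LISP syntax purely for accessibility; the function names are invented for the example.

```python
# Two ideas LISP made central, shown in Python for accessibility.

def length(lst):
    """Recursion: the length of a list is defined in terms of a smaller list."""
    return 0 if not lst else 1 + length(lst[1:])

def apply_twice(f, x):
    """Functions as data: f arrives as an ordinary value and is called like one."""
    return f(f(x))

assert length([10, 20, 30]) == 3
assert apply_twice(lambda n: n + 1, 5) == 7   # (5 + 1) + 1
```

That a function can be passed around like a number or a list was a radical idea in 1958; it is now commonplace in most mainstream languages, one measure of LISP's influence.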
The development of these early high-level languages represented a huge simplification for programmers, allowing them to focus on the logic of the problem to be solved rather than on the details of the hardware architecture.
This led to greater productivity and the ability to tackle more complex problems with computers.
3.4 Early Operating System Concepts (Batch Processing)
In the early years of computing, there were no complex operating systems like the ones we are used to today. Interaction with the computer was often direct, and the programmer had to manually manage all the machine's resources. However, as computers increased in power and complexity, the need arose for software that could manage hardware resources and simplify program execution.
One of the earliest forms of job management on computers was batch processing. In this approach, a set of jobs (programs to be executed together with their input data) was collected into a "batch". The computer operator loaded the batch into the computer, which executed the jobs one after another in sequence, without direct intervention from the programmer during execution. The results of each job were then produced as output (for example, on magnetic tape or a printer).
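The essence of the scheme is easy to sketch: a queue of jobs run strictly in sequence, with outputs collected for later. The job representation below (a program paired with its input data) is an invented simplification for illustration.

```python
# Minimal batch-processing sketch: jobs = (program, input data), run in sequence.
def run_batch(batch):
    """Execute each job one after another; collect results for later output."""
    outputs = []
    for program, data in batch:          # no programmer intervention per job
        outputs.append(program(data))    # result goes to output (tape, printer, ...)
    return outputs

batch = [
    (sum, [1, 2, 3]),                    # job 1: add a list of numbers
    (max, [7, 4, 9]),                    # job 2: find a maximum
]
print(run_batch(batch))                  # [6, 9]
```

Note that nothing in the loop lets a programmer inspect or steer a running job, which is precisely the lack of interactivity discussed next.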
Batch processing had some advantages:
- Greater efficiency: It allowed the computer to be used more continuously, reducing downtime between the execution of one job and another.
- Simplification of management: The operator only had to load the initial batch and then collect the results, without having to continuously interact with the computer.
However, batch processing also had limitations:
- Lack of interactivity: The programmer could not interact with the program during execution. If an error occurred, they had to wait until the job finished to examine the output.
- Long wait times: If a job was long, other jobs in the batch had to wait for it to complete.
Despite these limitations, batch processing represented a first step towards the development of modern operating systems, introducing the concept of automatic management of program execution.
3.5 Challenges and Innovations in Early Software Development
Software development in the early years of computing was an extremely challenging activity. Programmers faced numerous challenges:
- Expensive and limited hardware: The hardware resources of early computers were scarce and expensive. Programmers had to carefully optimize their code to make the most of available memory and computing power.
- Rudimentary development tools: There were no integrated development environments, sophisticated debuggers, and software libraries that we are accustomed to today. Programming was often a manual and meticulous process.
- Lack of standardization: There were no widely accepted standards for programming languages and hardware interfaces, which made software portability difficult.
- Limited developer community: The programming community was small, and programming knowledge and techniques were still being developed and were not yet widely shared.
Despite these difficulties, software pioneers demonstrated great creativity and ingenuity in developing the first programs and laying the foundation for the discipline of software engineering. The introduction of high-level languages, although initially greeted with skepticism by some who preferred the control offered by low-level languages, proved to be a critical step in making programming more accessible and productive.
The early software of the 1950s was characterized by the transition from programming directly in machine language to the use of assembly languages and, above all, the first high-level programming languages such as FORTRAN, COBOL and LISP. These languages represented an important abstraction from hardware, simplifying the task of programming and paving the way for new applications for computers. In parallel, the first operating system concepts emerged, such as batch processing, which aimed to improve the efficiency of using precious hardware resources. The challenges in software development were considerable, but the ingenuity and determination of early programmers laid the foundation for the subsequent explosion of the software world.