Computer Programming

20th Century Technology Series

The Twentieth Century Technology Series was planned as an opportunity to look back and reflect on the key developments in engineering and technology and their impact on society and the economy. The five speakers were asked to explain the basic principles behind their respective technologies, describe how each technology evolved up to the present day, and then speculate on its prospects after the year 2000.

Victor Suchar, Series Organiser

Computer Programming

Andrew Pepperdine, Member, on 17 June 1999

Computers have become ubiquitous in the last 50 years, but it is still not easy to make them do what we want. This talk attempts to show what computer programming entails and why it is so difficult to get things right.

The original idea of a computer was introduced by Charles Babbage in his Analytical Engine in 1835. All of the necessary features are there: the memory containing intermediate results, the program containing the calculation to be performed (itself held in the memory), and the central processing unit (CPU), which performs the calculations under program control. These same essential elements are still present in all of today's machines.
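
To make the arrangement concrete, here is a minimal sketch in C of the stored-program idea: program and data share one memory, and the CPU does nothing but fetch the next instruction and obey it. The instruction codes and the memory layout are invented for illustration.

    /* A toy stored-program machine: instructions and data live in the
       same memory; the CPU fetches, decodes, and executes in a loop. */
    #include <stdio.h>

    enum { LOAD, ADD, STORE, HALT };   /* hypothetical instruction codes */

    int main(void)
    {
        int memory[16] = { LOAD, 10, ADD, 11, STORE, 12, HALT,
                           0, 0, 0, 5, 7, 0 };  /* program, then data */
        int pc = 0, acc = 0, running = 1;

        while (running) {
            int op = memory[pc++];              /* fetch */
            switch (op) {                       /* decode and execute */
            case LOAD:  acc = memory[memory[pc++]];  break;
            case ADD:   acc += memory[memory[pc++]]; break;
            case STORE: memory[memory[pc++]] = acc;  break;
            case HALT:  running = 0;                 break;
            }
        }
        printf("result: %d\n", memory[12]);     /* prints 12 (5 + 7) */
        return 0;
    }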

Since the hardware is composed of detailed circuitry to manipulate very small elements of logic, the instructions in the stored program are not much more complex. They typically consist of commands to fetch a value from memory, perform a simple operation (e.g. add a constant value), and store the result somewhere. To make this sort of machine do anything useful requires it to execute a very large number (in the millions) of such instructions. Samples taken in the past of large mainframes in real use have shown that they may spend 30% of their time branching from one place to another. If one includes the instructions required to decide where to branch to, then it is clear that very little time is spent actually doing 'real work'; most of it is spent deciding what work to do. All of these decisions have to be put into the program, and this is where the difficulty appears when writing programs.
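
A small example, again only a sketch: even a loop as simple as summing a list spends much of its instruction budget on tests and branches rather than on the additions themselves. The comments note, roughly, the machine-level steps each line implies.

    /* Summing an array in C. On each trip round the loop, the compare
       and the branches outnumber the single useful addition. */
    #include <stdio.h>

    int main(void)
    {
        int values[] = { 3, 1, 4, 1, 5 };
        int total = 0;                  /* store a constant */

        for (int i = 0; i < 5; i++)     /* compare; branch if finished */
            total += values[i];         /* fetch; add; store; branch back */

        printf("total = %d\n", total);  /* prints 14 */
        return 0;
    }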

From about 1945 to 1960, the foundations of the current technology for program construction were laid down. The fundamental idea is that we need an 'easy' way to represent the logic we wish to carry out, and then get the machine itself to translate the human-understandable text into the set of instructions that forms the machine's program. This process is known as 'compiling', and the program that performs the translation is a 'compiler', which may itself run to half a million lines of program text.
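
As a sketch of what the translation involves, one line of human-readable text expands into a sequence of the simple fetch-operate-store commands described above. The mnemonics in the comment are invented, standing in for whatever a particular machine provides.

    /* One line of source text, and a plausible instruction sequence a
       compiler might emit for it (hypothetical mnemonics). */
    int fahrenheit(int celsius)
    {
        return celsius * 9 / 5 + 32;  /* LOAD celsius; MUL 9; DIV 5; ADD 32; RETURN */
    }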

The next stage was to enable several programs to exist in a machine at the same time. This led to the invention of operating systems, that is, programs that manage the resources of the machine and distribute them among several application programs. A special case of such a resource is the CPU itself, which can execute only one program at a time. When the operating system has decided which application should have control of the CPU, it relinquishes the CPU to that application. Control reverts to the operating system when something important happens, such as an external event interrupting the system or the application's allotted time expiring.
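
In miniature, and with invented task names and 'work remaining' standing in for real computation, the time-slicing part of that job looks something like this sketch: pick a task, let it run for one slice, take the CPU back, move on.

    /* A toy round-robin scheduler over three simulated tasks. */
    #include <stdio.h>

    struct task { const char *name; int work_left; };

    int main(void)
    {
        struct task tasks[] = { {"editor", 3}, {"printer", 5}, {"clock", 2} };
        int n = 3, running = 3;

        while (running > 0) {
            for (int i = 0; i < n; i++) {
                if (tasks[i].work_left == 0)
                    continue;               /* task has already finished */
                tasks[i].work_left--;       /* one time slice of 'work' */
                printf("slice given to %s (%d left)\n",
                       tasks[i].name, tasks[i].work_left);
                if (tasks[i].work_left == 0)
                    running--;              /* task completes; drop it */
            }
        }
        return 0;
    }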

The operating system also manages the memory, and can prevent one application from destroying the contents of another's memory, whether by accident or by design (as in the case of a virus), although some modern operating systems do not exploit this protection to the full extent that modern CPUs allow.
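
A sketch of the protection at work: under an operating system that uses it, the following deliberate mistake is trapped by the hardware and the offending program is stopped, rather than being allowed to corrupt memory belonging to someone else.

    /* This write is to an address the process does not own. With memory
       protection in force, the hardware traps it and the operating system
       terminates the program (typically reported as a segmentation fault). */
    int main(void)
    {
        int *forbidden = (int *)1;  /* not a valid address for this process */
        *forbidden = 42;            /* trapped before it can do damage */
        return 0;
    }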

As applications got larger, they had to be broken down into multiple units called modules, each small enough to be within the bounds of a person's understanding. This in turn demands a method of defining the environment in which a module executes and the parameters it manipulates. There has been little progress in this area since the 1960s, and the lowest levels of system programs are still written using these older techniques.
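
A minimal sketch of such a boundary, using invented names: the published interface states what the module offers, while the workings behind it stay private to the module.

    /* stack.h -- the interface other modules are allowed to see */
    void stack_push(int value);
    int  stack_pop(void);

    /* stack.c -- the implementation, hidden behind that interface */
    static int items[100];  /* 'static' keeps these names private */
    static int top;

    void stack_push(int value) { items[top++] = value; }
    int  stack_pop(void)       { return items[--top]; }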

Newer styles of programs have appeared, such as spreadsheets and database engines, but each is suitable only for a restricted subset of applications. Most of the tricky algorithms and systems are not helped by these later developments.

Very large suites of applications need large amounts of time to analyse and design the appropriate programs and modules. When such projects are late, it is most often because management has not understood the technical difficulties inherent in what has been asked for. In many cases, the job being addressed is more complex than is necessary at the time. It is usually easier to evolve from a smaller set of applications, implementing human procedures electronically step by step until an adequate system is developed; but too often, in my opinion, the aims become too ambitious before they are understood and can be assessed.

Andy Pepperdine