Computers, processors, memories and transmission equipment are devices or machines. All these machines, as configured systems, have a documented history of addressing concrete technical problems that were difficult to overcome. Ultimately, computing machines are characterized by what they do, or by their architecture. This article illustrates some of the historical issues in developing programmed and programmable machines.
From the start of data-processing it has been clear that stored data is not the same as the data that it represents. Stored data, including instructions, is often compared to the written data and thought concepts that it represents. That comparison is incorrect. Written data, such as texts or bookkeeping records, and thought concepts cannot be processed by machines, only by humans; nor can they control machine operations. The problem of machine processing of stored data was initially solved by early Hollerith machines, which could tabulate (accumulate) signals spatially dispersed as punched holes on a sturdy paper card. These machines could not perform arithmetic; the processing was simply counting specific locations of holes on cards. The machine is indifferent to what the holes mean.
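The counting described above can be sketched in a few lines of modern code (my own illustration, not a Hollerith design): each card is reduced to the set of positions that are punched, and tabulating is nothing more than counting those positions.

```python
from collections import Counter

# Toy model: each card is just the set of punched hole positions.
# What a position "means" (acreage, religion, birthplace) never
# enters the machine; the machine only counts holes.
cards = [
    {3, 7},    # card 1: holes at positions 3 and 7
    {3},       # card 2: hole at position 3
    {7, 12},   # card 3: holes at positions 7 and 12
]

tally = Counter(pos for card in cards for pos in card)
print(tally[3], tally[7], tally[12])  # → 2 2 1
```

The tally is the same whatever the positions represent, which is the point: the processing is purely technical.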
Hollerith’s original 1889 design used relays to identify individual holes and combinations of holes on cards. The next development, in the early 1900s, was twofold: 1) automatic sorting of punched cards based on certain locations (such as columns) of holes on the cards; and 2) the capability to add data represented by certain hole locations. Further innovations were automatic feeding of cards, printing of card data and alphabetic representation of holes on cards. The punched-card machines were initially fairly static in functionality.
Tabulating machines evolved into Punched Card Accounting Machines and were made configurable with plugboards that activated components (such as reading brushes) to sequence different operations on the same deck of cards. Configuring a plugboard was time consuming, and companies would maintain sets of configured plugboards that could be exchanged in an Accounting Machine for different operations. A machine with a differently configured plugboard would thus generate different results from the same deck of cards, and was in essence a different machine if one defines a machine by the output it generates from an input.
In 1938 AIEE Transactions published an article by L.F. Woodruff of MIT entitled “A System of Electric Remote-Control Accounting.” The described system, claimed in US Patent 1,801,981, covers the collection, transmission and reception of sales data by machines. It includes tape-generating and tape-to-punched-card machines. The data is processed by tabulating machines for credit management or inventory control. The described system, built entirely from known components, is stunningly advanced for 1938 and has a very modern architecture.
The tabulating machines processed massive numbers of punched cards by technical means. The machine is indifferent to what the punched holes on a card represent: acres of land, religion or place of birth. No one watching these machines operate would say that they are directed to an abstract idea, even though tabulating data by hand was well known. A tabulating machine could easily process over 100 punched cards, and later up to 600 cards, per minute, a feat that no human could match.
In the 1930s IBM introduced a series of punched-card multipliers (the 600 series) that could perform multiplication. By that time technology offered configurable data-processing machines that could store and retrieve data (from punched cards), process data (simple arithmetic, sorting, accumulating and tabulating), obtain remote data, display data and generate data for decision support, all on a scale that humans could not achieve.
Parallel with data-processing machines, and relatively independent of them, calculating machines were developed. A breakthrough was the development of direct-multiplication machines, of which the Frenchman Léon Bollée is credited as one of the inventors. Rather than using repeated addition, direct multiplication applies a facility (such as bars or special wheels) that represents a multiplication table for partial-product generation. All mechanical calculating machines used some form of proportionality for mechanical calculation. Due to mechanical limitations it was difficult to store intermediate results in the calculator for later processing, and thus it was physically hard or impossible for a mechanical machine to evaluate complete expressions.

A breakthrough in automated calculators was Claude Shannon’s master’s thesis, “A Symbolic Analysis of Relay and Switching Circuits,” which disclosed a binary, relay-based adder whose design was derived from Boolean expressions. The Shannon adder worked strictly on distinct technical switching states whose proportional size was meaningless, as long as they were different. George Stibitz of Bell Labs, who it seems knew of Shannon’s work, built a first binary 2-bit adder from 2 relays in 1937. A model operated by Stibitz can be seen at 2 min. 30 sec. in this video. What makes the Stibitz device a calculator, instead of merely a switching device, are the two lamps. Konrad Zuse built binary calculating equipment, including a floating-point calculator made from mechanical sliding plates. Shannon, Stibitz and Zuse overcame the proportionality limitation with switching technology. Technically, binary “logic” switching-based machine arithmetic is a world apart from proportional machine arithmetic.
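Shannon’s insight, that arithmetic can be built from Boolean expressions over two-state switches, can be sketched in a few lines (a modern illustration under my own naming, not Shannon’s or Stibitz’s actual circuit):

```python
def full_adder(a, b, carry_in):
    """Add two bits plus a carry using only Boolean operations.
    The switching states are merely distinct (0/1); their physical
    "size" is irrelevant, unlike in proportional mechanical calculators."""
    sum_bit = a ^ b ^ carry_in                   # 1 if an odd number of inputs is 1
    carry_out = (a & b) | (carry_in & (a ^ b))   # 1 if a majority of inputs is 1
    return sum_bit, carry_out

def add_binary(x, y, width=4):
    """Ripple-carry addition of two small integers, bit by bit."""
    carry, result = 0, 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

# 5 + 3 = 8, computed purely by switching logic
print(add_binary(5, 3))  # → 8
```

Each Boolean operator here stands in for a relay configuration; chaining the carry from bit to bit is what turns mere switching into arithmetic.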
Mathematical expressions in science and engineering consist of individual steps that require re-use of intermediate results and calculation of different terms. Many expressions are iterative, and for certain expressions it is not even certain that a result exists. Furthermore, the size of representation and rounding errors may make a machine calculation incorrect or impossible. It became clear that sequenced machine arithmetic is more than programming formulas: it requires knowledge of the required steps, of stability of operation, and of the physical limitations of machines.
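The representation and rounding limitations mentioned here are easy to demonstrate, even with modern floating-point arithmetic (my own illustration; the specific numbers are arbitrary):

```python
# Finite binary representation: 0.1 and 0.2 have no exact binary
# form, so a mathematically exact identity fails on the machine.
print(0.1 + 0.2 == 0.3)   # → False

# Limited precision: at magnitude 1e16 the spacing between adjacent
# doubles exceeds 1, so adding 1.0 is lost entirely.
x = 1e16
print((x + 1.0) - x)      # → 0.0, not 1.0
```

A programmer must know these machine limitations and order the steps accordingly, which is exactly the point: machine arithmetic is not the same as the formula on paper.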
The Harvard Mark I and the ENIAC were two machines designed to perform multi-step scientific calculations. These steps were called machine sequences, which activated different functional units (such as addition and multiplication) on data stored on paper tape, punched cards or electrical/electronic memory. The ENIAC contained function tables in which switches had to be set manually, and a plugboard with patch cords, not unlike the known plugboard of the Accounting Machine tabulators. ENIAC was later modified to keep an instruction counter in one of the accumulators, and the function tables were modified to store instructions as numerical data. A consequence of sequentially reading instructions from storage was the loss of hardwired parallel execution by separate units, which significantly slowed the performance of the modified ENIAC.
The control of an electronic computer like ENIAC was not a trivial matter. The fastest control was by direct connection of functional units with switches and patch cords. However, (re-)configuring ENIAC could take weeks and limited the efficiency of the machine. Punched paper tape as a means of running a program was a solution for the Harvard Mark I, which was an electro-mechanical machine. The ENIAC’s processing speed, due to its electronic components, was too fast for electro-mechanical provision of system instructions. The need to store instructions in a fast memory was identified at an early stage by the designers of ENIAC. The issue also came to the fore in the design of the ILLIAC, where it was noted that providing instructions from paper tape at electronic execution speed would require an impossible tape-reading speed of about 200 miles per hour (page I-4).
It is argued by some that the stored-program approach emphasizes the abstract nature of software. However, it is clear from the literature that the stored-program approach is essentially a technical solution to a technical problem: flexible execution control of processing steps in electronics-based switching, overcoming the cost and inconvenience of plugboards. It has nothing to do with being abstract.
With the availability of sequenced calculators, machine arithmetic became a technical problem that urgently needed to be solved. For instance, higher-order linear equations, which had great practical significance in the 1940s as they do now, could only reasonably be solved on a machine. Substantial work was done in 1946 by von Neumann et al. at Princeton, showing that doing mathematics on a machine is different from a human using paper and pencil.
The IBM 650, one of the first programmable computers accessible to a wider group of users/programmers, further demonstrated that “programming” a computer is a technical effort, which was exacerbated by the use of a rotating magnetic drum as main memory. One example in the 1958 book Programming the IBM 650 (page 1) explains fast evaluation of polynomial expressions. Even the most basic mathematical functions (such as trigonometric functions) had to be programmed using very basic machine instructions. For anyone entering the computer world in the 1940s and 1950s it was clear that a computer has inherently very few “skills” and that all capabilities have to be programmed. Literally nothing that a computer did in that period was conventional or routine. To alleviate the burden of programming, program libraries were established, such as the IBM 650 program library, to build upon earlier work. Infuriatingly little is done “inherently,” “easily” or “by itself” by computers in their early history.
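Fast polynomial evaluation of the kind the IBM 650 literature discusses is commonly done by nested multiplication (Horner’s scheme); a modern sketch (my illustration, not the book’s actual code) looks like this:

```python
def horner(coeffs, x):
    """Evaluate a polynomial by nested multiplication.
    Coefficients are ordered from the highest power down:
    coeffs = [a_n, ..., a_1, a_0].
    Uses n multiplications and n additions, instead of
    computing each power of x separately."""
    result = 0
    for a in coeffs:
        result = result * x + a
    return result

# p(x) = 2x^3 - 6x^2 + 2x - 1 evaluated at x = 3
print(horner([2, -6, 2, -1], 3))  # → 5
```

On a drum-memory machine like the 650, minimizing the number of operations (and their placement on the drum) mattered directly for running time, which is why such schemes had to be deliberately programmed rather than being “inherent” to the machine.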
The IBM Stretch and the System/360 (which are among the first modern general-purpose computers) applied a specifically structured design approach. According to two of the architects of the System/360, Gerrit Blaauw and Frederick Brooks, who were involved in the design of both machines, a computer can be analyzed at three levels: 1) the architecture, or the functionality a user sees; 2) the implementation, which describes the logical structure of the functional elements of the machine; and 3) the realization, or the physical level of the design. The System/360 has a specific architecture. Blaauw and Brooks formalized the design approach and applied it to analyses of major computer systems in their book Computer Architecture.
In that context, a programmed and networked computer can be characterized by an architecture, different from other architectures, wherein the underlying implementation and realization can be of known components.
Recorded history shows that a computer performing an application is a technical, structured and designed device with a specific purpose and is (in historical perspective) a highly unconventional device. A programmed computer is a technical machine that places components into novel and distinguishing architectures. A computer without software is like an analog clock without hands.
Programmed computers are devices. To associate programmed computers with abstract ideas is a scientific, technological and historical blunder. Clearly, there is something in claiming a programmed computer that has upset the courts sufficiently to create “judicial exceptions” that are not based on any scientific facts and are contradicted by historical facts. Technological reality and judicial opinion have now, via Alice v. CLS, diverged into a bizarre gap that the Supreme Court is unlikely to bridge without reversing itself completely. That is unlikely to happen, and Congress should step in.