
The father of the "Imitation Game": The life and work of Alan Turing

Background


In 1950, the October issue of the British philosophy journal "Mind" published a 28-page article entitled "Computing Machinery and Intelligence". The article examined the question of how the mental abilities of a machine could be determined and proposed a test for this purpose, the "imitation game". If a machine successfully imitates a human being in a dialogue with an interrogator, Turing argued, it would have to be credited with some degree of the ability to think. What is special about the "Mind" article is that it was written neither by a philosopher nor by a psychologist, but by a mathematician - and a proven expert on the devices we now call computers: Alan Turing.

European philosophy tradition

Turing's name has been associated with other concepts over the past 60 years as well, above all the Turing machine. Turing was an interdisciplinary thinker; he transferred insights easily from one subject to another and in this way repeatedly arrived at new, groundbreaking results. "Computing Machinery and Intelligence", for example, is now considered a foundational text of artificial intelligence, the large branch of computer science that was only established in 1956 at a conference in the USA. With his considerations on the "imitation game", Turing had anticipated fundamental ideas of the field as early as the 1940s. The premise of the imitation game differed significantly from the European philosophical tradition, which located the mental inside the head. Turing viewed thinking as a phenomenon that can be read "from the outside", from a person's behavior. This unconventional point of view led Turing to unusual solutions that other mathematicians and philosophers had previously overlooked.

Alan Turing (© Heinz Nixdorf MuseumsForum)
Turing had studied mathematics at Cambridge University in the 1930s, but he was particularly interested in mathematical logic. While arithmetic and geometry deal with numbers and quantities or with figures and space, logic, put simply, deals with statements and the connections between them. Logic is more abstract and "purer" than mathematics, the discipline without whose formulas scientists and engineers could not work. In some fields of knowledge - such as mathematics or cryptology, the science of secret messages and their decipherment - logic is indispensable.

To this day, Turing's most important essay, "On Computable Numbers, with an Application to the Entscheidungsproblem", belongs to the foundations of mathematical research. In it he examined the question of how the process of computing, or calculation, can be precisely defined. Intuitively, one would say: there is a formula or a sequence of instructions, also known as an algorithm, into which starting values are inserted and from which a final value is determined according to the rules of arithmetic. So: 1 + 1 = 2. But logicians demand precision. True to his unconventional approach, Alan Turing did not formulate a new rule for what computing is or should be; instead, he devised a machine as a model for the individual actions that take place during a given computation.
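To make this intuitive notion of an algorithm a little more concrete, here is a small present-day illustration (not an example from Turing's paper): Euclid's procedure for the greatest common divisor, a fixed sequence of instructions into which two starting values are inserted and from which a final value follows by the rules of arithmetic.

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a fixed sequence of instructions that turns
    two starting values into a final value by repeated arithmetic."""
    while b != 0:
        a, b = b, a % b  # replace (a, b) by (b, a mod b) until b becomes zero
    return a

print(gcd(48, 36))  # -> 12
```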

Model of an early computer

This device, which he called the "a-machine", was a theoretical construction, but Turing most likely had a mechanical design in mind. The "a-machine" was the basis of the universal Turing machine, with which Turing then settled the decision problem of logic mentioned in the title - by showing that no general procedure for it can exist. Put simply, the decision problem asks whether there is a method for determining, for any given logical or mathematical statement, whether it is universally valid (that is, true in every case). On this basis, Turing developed the model of a machine that processes commands step by step and, years before the first operational electronic computers, represented the abstract model of a computer: the tape of the "a-machine" corresponds to the memory that holds both the program and the numerical inputs; the rest of the machine serves as the arithmetic and control unit.
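How little is needed to capture this idea becomes clear in a small modern sketch (in Python, and of course not in the notation Turing used): a tape, a read/write head, an internal state and a table of transition rules. The rule table below is a hypothetical example that adds one to a binary number written on the tape.

```python
BLANK = " "

# (state, symbol read) -> (symbol to write, head movement, next state)
RULES = {
    ("right", "0"): ("0", +1, "right"),   # scan right to the end of the number
    ("right", "1"): ("1", +1, "right"),
    ("right", BLANK): (BLANK, -1, "carry"),
    ("carry", "1"): ("0", -1, "carry"),   # 1 plus carry gives 0, carry moves left
    ("carry", "0"): ("1", 0, "halt"),     # 0 plus carry gives 1, done
    ("carry", BLANK): ("1", 0, "halt"),   # the number was all 1s: prepend a 1
}

def run(tape_str: str) -> str:
    tape = dict(enumerate(tape_str))      # the tape, indexed by cell position
    head, state = 0, "right"
    while state != "halt":
        symbol = tape.get(head, BLANK)    # read the cell under the head
        write, move, state = RULES[(state, symbol)]
        tape[head] = write                # write, move the head, change state
        head += move
    return "".join(tape[i] for i in sorted(tape)).strip()

print(run("1011"))  # 11 -> "1100" (12)
print(run("111"))   # 7  -> "1000" (8)
```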

Turing's role in the cryptanalysis of the German Enigma cipher machine during the Second World War did not become known until the 1970s. Here Turing worked on the development of a special electromechanical machine, the "Turing Bombe", which tested possible settings of the Enigma. He took over the basic idea of the device from a team of Polish cryptologists. What was new, and typical of Turing, was his approach of setting the "Bombe" to work on German words that were very likely to appear in the radio messages encrypted by the Enigma - so-called cribs. After 1945 he worked in London on the development of an electronic computer, the "Automatic Computing Engine" (ACE). A large part of his fame as a computer pioneer rests on this work, yet he was able to implement his concepts only to a small extent and left the project prematurely. In 1948 he took up a teaching position at the University of Manchester.
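The crib idea itself can be illustrated with a small modern sketch (in Python; this is merely an illustration, not Bletchley Park code): since the Enigma never enciphered a letter as itself, a guessed German word can only line up with the ciphertext at positions where none of its letters coincides with the ciphertext letter above it. The strings in the example are made up.

```python
def possible_crib_positions(ciphertext: str, crib: str) -> list[int]:
    """Positions where the crib could sit: the Enigma never maps a letter
    to itself, so any alignment with a coinciding letter is ruled out."""
    positions = []
    for i in range(len(ciphertext) - len(crib) + 1):
        window = ciphertext[i:i + len(crib)]
        if all(c != p for c, p in zip(window, crib)):
            positions.append(i)
    return positions

ciphertext = "QFZWRWIVTYRESXBFOGKUHQBAISE"   # made-up ciphertext
crib = "WETTERBERICHT"                        # "weather report", a classic crib word
print(possible_crib_positions(ciphertext, crib))
```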

At that time, Alan Turing was mainly concerned with the problem of artificial intelligence. He was the first to give real substance to terms such as "thinking machines" and "electronic brains", which were already in use, for example in the essay "Intelligent Machinery" from 1948. Here he anticipated the idea of neural networks: in computing systems he saw an equivalent of brain cells and of the nerve fibers connecting them. In the paper on the imitation game that appeared two years later, however, little of this neurological approach remained. Turing's hypothesis that machines can think nevertheless quickly found practical application: in Manchester he wrote one of the first chess programs.

From 1952 on, Turing was primarily concerned with biology and pattern formation in cells. He started from the assumption that even highly complex and at first glance diffuse processes can be described by mathematical formulas, since they obey laws. His most important text on this topic is "The Chemical Basis of Morphogenesis" from 1952. There he showed how the interaction of two substances, an "activator" and an "inhibitor", leads to regular patterns and forms according to mathematical rules. Today such a process is called the Turing mechanism.
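In today's notation - which is not Turing's own from 1952 - such an activator-inhibitor system is usually written as a pair of coupled reaction-diffusion equations; regular patterns can arise when the inhibitor diffuses much faster than the activator.

```latex
% Two-species reaction-diffusion ("Turing") system in its usual modern form;
% the notation is the standard textbook one, not Turing's own from 1952.
\begin{align}
  \frac{\partial u}{\partial t} &= f(u, v) + D_u \nabla^2 u && \text{(activator } u\text{)} \\
  \frac{\partial v}{\partial t} &= g(u, v) + D_v \nabla^2 v && \text{(inhibitor } v\text{)}
\end{align}
% A uniform, stable equilibrium can become unstable to spatially periodic
% patterns when the inhibitor spreads much faster than the activator,
% i.e. when D_v is much larger than D_u (diffusion-driven instability).
```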

Looking at his ideas and their effect on posterity, the Turing machine probably exerted the greatest influence. With it, he almost single-handedly created a branch of computer science and logic of its own: the theory of computability. The universal Turing machine makes him one of the fathers of the computer, alongside Konrad Zuse, John von Neumann and the Americans John Presper Eckert and John William Mauchly, even if his direct influence on hardware development was minimal. Turing left posterity a wealth of ideas on logic, mathematics, computer science and biology. His legacy is the questions he asked. And the problems that still need to be solved.

Author: Ralf Bülow, journalist, science historian and exhibition designer, January 19, 2015


The text is licensed under the Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Germany License.