Turing in Context

Science  29 Jun 2012:
Vol. 336, Issue 6089, pp. 1638-1639
DOI: 10.1126/science.336.6089.1638-c

In his Perspective “Beyond Turing's machines” (13 April, p. 163), A. Hodges claims that in 1945 Turing “used his wartime technological knowledge to design a first digital computer.” He also suggests that Turing's work of 1936 laid the foundation for encoding “all known processes,” going “far beyond the vision of others at the time.” These statements implicitly diminish the earlier work of Kurt Gödel and Konrad Zuse. In 1931, Gödel used the integers to design a universal language capable of encoding arbitrary computations and general algorithms that could prove theorems (1). This allowed him to identify the fundamental limits of math and provability. Turing and his adviser Alonzo Church later merely reformulated Gödel's work in an elegant way. Furthermore, Zuse's 1936 patent application Z23139/GMD Nr. 005/021 already described a concrete general computer, as opposed to a purely mathematical construct. By 1941, Zuse had physically built the first working universal digital machine, years ahead of anybody else [e.g., (2, 3)]. Thus, unlike Turing, he not only had a theoretical model but actual working hardware. Future hardware leader IBM was well aware of Zuse's breakthroughs and acquired an option on his patents at the earliest possible point after the war (4). The great computer science hero Turing surely deserves center stage on his centenary. But let's not exaggerate his achievements at the expense of others!