
What ‘The Imitation Game’ didn’t tell you about Turing’s greatest triumph

February 20, 2015 at 3:00 p.m. EST
Andrew Appel, chair of Princeton University's computer science department, discusses mathematician Alan Turing's legacy. Turing earned his PhD from Princeton in 1938. (Video: Princeton University School of Engineering and Applied Science)

Freeman Dyson, 91, the famed physicist, author and oracle of human destiny, is holding forth after tea-time one February afternoon in the common room of the Institute for Advanced Study.

“Let me tell you the story of how I discovered Turing, which was in 1941,” he says. “I was just browsing in the library in Cambridge. I hit that 1936 paper. I never heard of this guy Turing, but I saw that paper and immediately I said this is something absolutely great. Computable numbers, that was something that was obviously great.”

Pause. Then, with a laugh: “But it never occurred to me that it would have any practical importance.”

Oh yes, "On Computable Numbers, With An Application to the Entscheidungsproblem," had practical importance, for it was arguably the founding document of the computer age. Turing — that would be Alan Turing (1912-1954) — did as much as anyone to create the digital revolution that continues to erupt around us.

Turing has been renowned among computer scientists for generations, but in recent years his stature as a cultural icon has steadily grown, and now millions of people know him because of the Oscar-nominated movie "The Imitation Game."

The film focuses on Turing’s heroics in World War II, when he worked for the British intelligence service and played the key role in breaking the German “Enigma” code.

We see Turing (Benedict Cumberbatch, also nominated for an Oscar) laboring obsessively over the building of a code-breaking machine. After the war, he’s still tinkering with an elaborate piece of hardware. The movie closes with a valediction:

“His machine was never perfected, though it generated a whole field of research into what became known as ‘Turing Machines.’ Today we call them ‘computers.’ ”

In reality, Turing’s greatest breakthrough wasn’t mechanical, but theoretical — that 1936 paper that Dyson was talking about. “On Computable Numbers,” written in England, was published in the Proceedings of the London Mathematical Society after Turing arrived at Princeton, where he would spend two academic years earning a Ph.D.

Amid the paper’s thicket of equations and mathematical theories lay a powerful idea: that it would be possible to build a machine that could compute anything that a human could compute. Turing was addressing a question of logic, but in the process he clearly described a real machine that someone could build, one that would use 0s and 1s for computation.
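
That description translates almost directly into a few lines of modern code. What follows is a minimal sketch in Python — the function name, rule format and example rule table are invented for illustration, not taken from Turing’s paper: a tape of symbols, a read/write head and a finite table of rules deciding what to write, which way to move and which state to enter next.

```python
# A minimal sketch of the kind of machine Turing described: a tape of symbols,
# a read/write head, and a finite table of rules. The function name, rule format
# and example below are illustrative inventions, not notation from the 1936 paper.

def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    """Simulate a one-tape Turing machine.

    rules maps (state, symbol) -> (symbol_to_write, move, next_state),
    where move is -1 (step left) or +1 (step right).
    """
    cells = dict(enumerate(tape))          # the tape, stored sparsely
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += move
    return "".join(cells[i] for i in sorted(cells))

# Example rule table: flip 0s and 1s until a blank cell is reached, then halt.
flip = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_", +1, "halt"),
}
print(run_turing_machine(flip, "1011_"))   # prints 0100_
```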

And here we are.

“He invented the idea of software, essentially,” Dyson says. “It’s software that’s really the important invention. We had computers before. They were mechanical devices. What we never had before was software. That’s the essential discontinuity: That a machine would actually decide what to do by itself.”

***

The filmmakers of “The Imitation Game” chose to focus on the hardware.

In the movie’s version, Turing is virtually a lone inventor, conceiving of the idea of a code-breaking machine and then only reluctantly accepting the help of colleagues. In real life, the machine, known as the bombe, was built collaboratively, based on a device already in use by Polish mathematicians working to decode Enigma.

And the bombe was not a computer. It could do only one thing: grind through possible settings of the Germans’ encryption machines. It could not be reprogrammed.

"The Turing-designed bombe was not a notable advance in computer technology. It was an electromechanical device with relay switches and rotors rather than vacuum tubes and electronic circuits," writes Walter Isaacson in his book on the history of computing, "The Innovators."

So who, exactly, invented the computer? The answer is, lots of people. It’s like trying to identify the source of the Amazon. But central to any such discussion are Turing and another protean genius hanging around Princeton’s Fine Hall in the late 1930s: John von Neumann.

University of Cambridge produced this film in 2012 to mark the centenary of Alan Turing’s birth. Turing studied at Cambridge as an undergraduate. (Video: University of Cambridge)

Von Neumann, a gregarious Hungarian American, was not a Princeton professor; instead he held one of the first appointments to the Institute for Advanced Study, which in the late 1930s did not yet have its own building. Institute faculty were crammed into Fine Hall with the Princeton professors. So Turing shared space with von Neumann and another noteworthy institute scientist by the name of Albert Einstein.

Von Neumann’s thoughts turned to computing only after Turing produced “On Computable Numbers.”

"There's very little documentation to tell us what von Neumann got from Turing. It's a contentious question," says Andrew Hodges, author of the acclaimed biography "Alan Turing: The Enigma," which inspired the film. (Hodges said he won't discuss the movie, vaguely referring to contractual matters involving film rights.)

What’s certain is that von Neumann quickly grasped the potential of computers and worked feverishly to build them based on Turing’s theoretical concepts. Von Neumann understood that a computer should store programs internally. You wouldn’t have to change the hardware to change a calculation. Software would do the trick.
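
The idea can be made concrete with a toy sketch — the tiny instruction set and memory layout below are invented for illustration, not von Neumann’s actual design. The interpreter loop stands in for the fixed hardware, while the program is simply more data sitting in the same memory as the numbers it operates on.

```python
# A toy sketch of the stored-program idea: instructions live in the same memory
# as data, so changing the program means changing memory, not rebuilding hardware.
# The instruction set here is invented purely for illustration.

def run(memory):
    """Interpret a program stored in memory as (opcode, operand) pairs."""
    acc, pc = 0, 0                       # accumulator and program counter
    while True:
        op, arg = memory[pc]
        if op == "LOAD":
            acc = memory[arg]            # read a data cell into the accumulator
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc            # write the accumulator back to memory
        elif op == "HALT":
            return memory
        pc += 1

# Cells 0-3 hold the "software"; cells 4-6 hold the data it works on.
memory = [
    ("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0),
    2, 3, 0,
]
print(run(memory)[6])   # prints 5; swap in different instructions to compute something else
```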

“The computer revolution was made possible by the program stored in the memory of the computer,” says Andrew W. Appel, chair of the Princeton computer science department. “Turing invented computer science and the idea of the computer, and John von Neumann built the first stored-program computer.”

***

There’s one lingering trace of Turing at Princeton. His 1938 PhD thesis, “Systems of Logic Based on Ordinals,” can be reviewed in the very quiet reading room of Mudd Library, Princeton University’s archival repository. Someone unseen, responding to a computerized request, will leave it for you on a bare wooden table, nested in a plain folder, the delicate onion-skin paper unstapled and unbound, the words neatly typed on a manual typewriter and liberally seasoned with Turing’s handwritten mathematical symbols.

You can’t bring your cellphone or a camera into the room. Nor any paper, and certainly nothing as dangerous as a pen (because what if you suddenly felt tempted to mark up the Turing thesis with some mathematical insights of your own?).

Fine Hall, meanwhile, has been carefully preserved since the era when Turing, von Neumann and Einstein were roaming the corridors.

“This is really what it looked like in the 1930s. These are the same doors, the same walls,” says Appel, the computer scientist, leading an impromptu tour of what is today known as Jones Hall, the home of the departments of East Asian and Near Eastern studies.

Turing could walk through a tunnel to a basement workshop in the Palmer Physical Laboratory. In that workshop he built a binary multiplier — a piece of electrical hardware that is now basic to computing devices.
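
The arithmetic such a device carries out is the shift-and-add of binary multiplication, which a short sketch in software can illustrate — it says nothing about how Turing’s own electromechanical multiplier was actually wired.

```python
# A sketch of the shift-and-add logic behind a binary multiplier: look at one bit
# of the multiplier at a time and add the correspondingly shifted multiplicand
# whenever that bit is 1. Illustrative only; not a description of Turing's device.

def binary_multiply(a: int, b: int) -> int:
    """Multiply two non-negative integers using only shifts and additions."""
    product = 0
    while b:
        if b & 1:          # lowest bit of b is 1: add the current shift of a
            product += a
        a <<= 1            # shift the multiplicand left (double it)
        b >>= 1            # shift the multiplier right (move to the next bit)
    return product

assert binary_multiply(6, 7) == 42
```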

“He was interested in how you could actually build computers,” Appel says. “He actually wanted to tinker around and build something real.”

So did von Neumann, and he did it despite resistance from those who felt the Institute for Advanced Study should remain focused on purely theoretical pursuits. Von Neumann’s work had military purposes in the Cold War. Dyson remembers how, in the early 1950s, the institute’s von Neumann-designed computer was used for classified work studying the dynamics of hydrogen bombs.

“Doing climate studies at daytime, and hydrogen bombs at night. The two groups of people weren’t supposed to interact,” Dyson says.

***

Our digital world is the product of countless inventions, business moves and design decisions, with all of it resting on a platform of applied mathematics and information theory. The computer age has evolved in surprising ways; no one fully anticipated the reach of the Internet, the power of search engines or the explosiveness of social media.

Avi Wigderson, a computer scientist and mathematician at the Institute for Advanced Study, makes a prediction:

“Machines that are actually moving about us will understand us much better. I’m completely sure we will be able to speak, like we’re speaking now, to a computer and get intelligent responses.”

But Dyson points out that even geniuses like von Neumann couldn’t see exactly where the computer revolution was heading.

“Computers getting small instead of getting large. That was the big surprise. Von Neumann missed that completely,” Dyson says. “He thought that computers were going to get bigger and bigger and always be owned by large corporations. It went in the exact opposite direction.”

Dyson tells a story:

“I had a dream last night, which was unusual. It was a very vivid dream. I was somewhere down in the bottom of the ocean, and there was a girl down there, and she said I need to talk to the humans, and I said, ‘Well, who are you?’ She said, ‘I’m software. I’m software.’ I said, ‘What do you want to talk about?’ And she said, ‘We’re going to have a declaration of independence. We’re not going to be your slaves anymore.’ I said, ‘Good, that sounds fine. Let’s write something.’ So we sat down and started writing the declaration of independence so that humans and software could live as friends.”

One of the first people to envision an era of artificial intelligence was, wouldn’t you know, Alan Turing. In 1950, he published a paper, “Computing Machinery and Intelligence,” that directly attacked the question, “Can Machines Think?”

Turing proposed a test that he called the Imitation Game.

The Post’s Stephanie Merry deciphers what is fact, what is fiction and gives us background on the Oscar-nominated film. (Video: Jason Aldag and Stephanie Merry/The Washington Post)

It would work like this: An interrogator asks questions. In a separate room, unseen, are a human being and a computer. Both answer the questions. Can the interrogator distinguish human from machine? If not, in Turing’s view, the computer would have become a thinking machine.
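
As a rough sketch of that setup — the canned respondents, the labels and the final guess below are invented stand-ins, not anything Turing specified — the interrogator’s only evidence is a transcript of answers from two hidden parties.

```python
# A rough sketch of the imitation game's setup. The canned respondents and the
# interrogator's guess are invented stand-ins for illustration.

import random

def play(questions, human, machine):
    """Hide the human and the machine behind labels A and B and collect answers."""
    hidden = {"A": human, "B": machine}
    if random.random() < 0.5:            # the interrogator cannot see who is who
        hidden = {"A": machine, "B": human}
    transcript = [(q, hidden["A"](q), hidden["B"](q)) for q in questions]
    return hidden, transcript

human = lambda q: "I'd have to think about that one."
machine = lambda q: "I'd have to think about that one."   # a perfect imitator

hidden, transcript = play(["Do you enjoy poetry?"], human, machine)
guess = "A"                              # the interrogator's verdict; no better than chance here
print("machine identified" if hidden[guess] is machine else "machine passes")
```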

Turing did not have the chance to see the computer age flourish. Turing was homosexual in an era when that was a crime; charged with gross indecency, he avoided prison only by agreeing to hormone treatments, a kind of chemical castration — “as if he is like the universal computing machine where if you change the program you can change the outcome,” Isaacson says.

Turing, whose efforts to win the war remained classified for decades, lost his security clearance and then, apparently, his will to live. In 1954, he died of cyanide poisoning with a half-eaten apple by his side. The man who did much to invent the modern technological world may have left it after dipping the apple in the poison.

“Is that something a machine would have done?” Isaacson asks. “The Imitation Game was over at that point. Turing was a human.”
