The Wall Street Journal
March 3, 2012
BOOKSHELF
"Turing's Cathedral" by George Dyson, Pantheon, 29.95, 401 pages
"The Nucleus of the Digital Age"
In pursuit of hydrogen bombs, a math genius and a brilliant tinkerer in Princeton developed the modern computer
by Konstantin Kakaes
The mathematician John von Neumann, born Neumann Janos in Budapest in 1903, was incomparably intelligent, so bright that, the Nobel Prize-winning physicist Eugene Wigner would say, "only he was fully awake." One night in early 1945, von Neumann woke up and told his wife, Klari, that "what we are creating now is a monster whose influence is going to change history, provided there is any history left. Yet it would be impossible not to see it through." Von Neumann was creating one of the first computers, in order to build nuclear weapons. But, Klari said, it was the computers that scared him the most.
Von Neumann had come to New Jersey from Hungary in 1931, toward the beginning of the great scientific exodus from Europe. A decade and a half later, with the U.S. military frantically studying whether a hydrogen bomb thousands of times more powerful than the one dropped on Hiroshima was possible, von Neumann got funding to build a computer—the first that could store its program in memory.
This ability is what distinguishes a computer from a calculator. A computer that can only execute a program fixed in solder and wire is, in essence, a very fast calculator. (The first incarnation of the University of Pennsylvania's ENIAC, which slightly predated von Neumann's project, was one such device.) A computer that can store (and thus modify) its own program, by contrast, can readily be adapted to different tasks. In fact, given enough time and memory, it can compute anything that any other computer can. This universal power was what terrified—and thrilled—von Neumann; its theoretical underpinnings were the work of the British mathematician Alan Turing.
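A toy sketch can make the distinction concrete (the miniature instruction set below is invented for illustration and is not anything from Mr. Dyson's book or the MANIAC itself): because the program sits in the same memory as the data, changing a single word of that memory, whether by hand or by the running program itself, changes what the machine does.

    # A minimal stored-program toy machine (illustrative only; the opcodes
    # are invented for this sketch, not taken from the MANIAC).
    def run(memory):
        acc = 0   # accumulator register
        pc = 0    # program counter; instructions are (opcode, operand) word pairs
        while True:
            op, arg = memory[pc], memory[pc + 1]
            pc += 2
            if op == 0:            # HALT: stop and report the accumulator
                return acc
            elif op == 1:          # LOAD: acc = memory[arg]
                acc = memory[arg]
            elif op == 2:          # ADD: acc = acc + memory[arg]
                acc += memory[arg]
            elif op == 3:          # SUB: acc = acc - memory[arg]
                acc -= memory[arg]
            elif op == 4:          # STORE: memory[arg] = acc (may overwrite code)
                memory[arg] = acc
            else:
                raise ValueError(f"unknown opcode {op}")

    # Words 0-5 hold the program; words 6 and 7 hold the data.
    program = [1, 6,   # LOAD  memory[6]
               2, 7,   # ADD   memory[7]
               0, 0,   # HALT
               20, 22] # data
    print(run(list(program)))   # 42: the machine adds

    # Change one word of memory (ADD -> SUB) and the same hardware subtracts.
    patched = list(program)
    patched[2] = 3
    print(run(patched))         # -2

A hard-wired calculator would need new wiring for that second job; a stored-program machine needs only a new pattern of words in memory.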
In the 1930s, Turing and Kurt Gödel had shaken the world of mathematics just as thoroughly as Albert Einstein and a cohort of quantum physicists had upended Newton's physics in the two decades before. Gödel proved that no consistent logical system that includes our basic idea of arithmetic can be complete: there will always be statements that can be neither proved nor disproved within the system. Turing then showed that there is no general procedure for telling, in advance, which statements can be settled and which cannot. In working out his proof, Turing devised a kind of notional machine that did not yet exist: the computer, in essence.
Though Turing lends his name to the title of George Dyson's detailed and engaging "Turing's Cathedral," it is von Neumann who is the book's central figure. Mr. Dyson focuses on the story of the men and women that von Neumann brought to the Institute for Advanced Study, in Princeton, to create the "Turing machine" that Mr. Dyson argues is the nucleus of today's digital universe.
Many groups were trying to build such a machine at the same time, driven in large part by two surprisingly similar problems: designing nuclear weapons and breaking codes. Most of them would end up copying the "Mathematical Analyzer, Numerical Integrator, and Computer," or MANIAC—the machine that von Neumann's team created in five feverish years, from 1946 to 1951. (It is more commonly known as the "IAS machine," because the director of the Institute thought the name MANIAC too colloquial—indeed, it had begun as the nickname frustrated engineers gave the ENIAC.) Within a few years it would spawn another MANIAC in Los Alamos; ILLIAC in Illinois; BESK in Scandinavia; BESM in Moscow; and JOHNNIAC in Los Angeles, among others. Most important, IBM in 1952 would copy the MANIAC's design for the 701, the first commercially successful computer.
Mr. Dyson tells his story as a sort of intellectual caper film. He gathers his cast of characters from places including Central Europe (von Neumann and Klari from Budapest; Stanislaw Ulam, von Neumann's best friend, from Poland via Harvard and Wisconsin) and MIT (the electrical engineer Julian Bigelow) and tracks their journey to Princeton. When they converge, it's great fun, despite postwar food rationing and housing shortages.
To house the influx of scientists and engineers, Bigelow himself dismantled some houses in upstate New York and moved them to Princeton by rail. At one point, the director of the Institute complained to von Neumann that the project members had been "especially unfair in the matter of sugar" and suggested that they "have their tea here under proper supervision." When Dylan Thomas passed through town, a young mathematician in the group woke him in the middle of the night and brought him to the team's quarters to carouse.
When they weren't partying, they were working, spurred by von Neumann's "unshakable confidence that said 'go ahead, nothing else matters, get it running at this speed and capability, and the rest of it is just nonsense,' " as Bigelow put it. They got a big jump on the other groups because, as Bigelow noted, they never tried to produce "any new elementary components": The MANIAC was built from vacuum tubes that were already being mass-produced for use as amplifiers in communications equipment.
The MANIAC became operational in stages. The first draft of its design was completed on June 28, 1946. It described in detail the 39 steps needed to multiply two numbers, a feat that the machine finally achieved in April 1948, nine months after it had begun to add. The computer could perform 70,000 multiplications per second—stupendously fast compared with its predecessors and glacial by today's standards, when the average laptop can do tens of billions of multiplications per second. But the machine still lacked a fast working memory, until Bigelow and a colleague "saw how to convert cheap, off-the-shelf oscilloscope tubes into random-access memory."
Bigelow's talent was for such tinkering and troubleshooting. Von Neumann would write a famous paper on how to construct a reliable machine from unreliable components, but it was Bigelow who actually did the constructing. At the age of 3 he had removed all the doorknobs in his aunt's house and collected them in a pile; for the MANIAC, he built an input/output device using two bicycle wheels that coiled and uncoiled steel wire at 100 feet per second. Much like a cassette tape, the wire stored bits as a string of magnetic variations.
By mid-1951, the machine was fully up and running. The entire assemblage weighed about 1,000 pounds and "resembled a turbo-charged V-40 engine, about 6 feet high, 2 feet wide, and 8 feet long," Mr. Dyson writes. Each side of the V had 20 cylinders, with a memory tube in each. Except for its last couple of years, when reliability problems and a lack of demand meant it frequently lay fallow, the machine would stay in use around the clock for most of a decade.
The MANIAC wasn't only more powerful than its predecessors (it was 20 times faster than ENIAC); it could also be readily reprogrammed for different tasks. Its chief job was simulating nuclear explosions. Slack time was taken up by early simulations of genetic evolution and of the life cycle of stars. The MANIAC was also put to use to make both short-term weather predictions and longer-range climate models—the sort of thing that fed von Neumann's grave predictions about the power of computing: "After global climate control becomes possible," he argued, "this will merge each nation's affairs with those of every other, more thoroughly than the threat of a nuclear or any other war may have already done."
Mr. Dyson's central concern is to show how a collection of vacuum tubes turned into something that could process information in a fundamentally new way. He offers enough technical detail to convey the spirit of the challenge, but his determination to keep the book accessible to the layman means that there is not enough for the more expert reader to get a step-by-step sense of what the computer did.
Mr. Dyson is rightly as concerned with the machine's inventors as with the technology itself. He spent years at the Institute going through archives that had not been reviewed since 1946, as well as reading Bigelow's private papers, and he interviewed several dozen of the surviving eyewitnesses to the project. Mr. Dyson himself can remember playing amid the ruins of the computer as a child, having grown up around the Institute, where his father, the physicist Freeman Dyson, has been a professor since 1953.
But his attention to biography betrays him structurally, as when he traces von Neumann's travels around Europe in pursuit of Klari, his wife to be. It's an engaging tale, but it takes the reader away from the chronology of the machine in Princeton. There are other casualties of his approach. By focusing on the MANIAC, he neglects the precursors that made it possible. And though his eye for detail is so keen that at one point he (somewhat tediously) gives us the office numbers of the Institute's secretaries and professors, we never really get a sharp portrait of the room where Bigelow's computer took shape.
Nor does Mr. Dyson make more than brief mention of the Manchester Small Scale Experimental Machine (SSEM), which scooped the MANIAC as the first working stored-program computer. The SSEM was the offspring of the Colossus, which the British had used during the war to break German military codes. Alan Turing himself had played a key role in this effort, having returned to England in 1938 after spending two years getting his Ph.D. at Princeton (a period vividly described by Mr. Dyson, including the detail that Turing arrived in New Jersey carrying, in retro affectation, a sextant). The SSEM's design, however, proved less influential on later computers than the MANIAC's, in part, says Mr. Dyson, because of Britain's restrictive Official Secrets Act, which hobbled postwar research.
In a couple of brief excursions, Mr. Dyson looks forward to the present day. He notes in passing that, though all the computers humans have built (except for some rudimentary quantum computers) are logically equivalent to Turing's original machine, once they are wired together into the Internet, and once they accept input from human brains (as when a click on a particular link gives Google's algorithm feedback about which search result was most relevant), they begin to form something more. Observing that "our understanding of technology has diminished as machines approach the complexity of living things," Mr. Dyson speculates that "the behavior of a search engine . . . resembles the activity of a dreaming brain."
After von Neumann died in February 1957, the Princeton project was sapped of its momentum. Though the engineers wanted to continue developing another machine, the Institute's bias toward the theoretical won out, and the engineers began to leave. On July 15, 1958, Bigelow turned the computer off. Within four years, it was in the Smithsonian. That first machine had a memory of about five kilobytes; as Mr. Dyson points out, this is scarcely enough to represent the data in a single icon today. And yet, in translating between "coded sequences in time and arrays of charge in space," it did fundamentally the same thing that millions of computers do on a vastly larger scale today. Modern digital computing is too complicated a thing to be traced to a single moment of divine conception. But Mr. Dyson's chronicle, if too limited in scope to be a definitive history of the computer, is a well-told chapter of that larger story.
—Mr. Kakaes is a Bernard L. Schwartz Fellow at the New America Foundation.