1953: The Year That Revolutionized Life, Death, and the Digital Bit

By George Dyson

Three technological eras began in 1953: thermonuclear weapons, stored-program computers, and modern genetics.


At 10:38 p.m. on March 3, 1953, in a one-story brick building at the end of Olden Lane in Princeton, New Jersey, Italian-Norwegian mathematical biologist Nils Aall Barricelli inoculated a 5-kilobyte digital universe with random numbers generated by drawing playing cards from a shuffled deck. "A series of numerical experiments are being made with the aim of verifying the possibility of an evolution similar to that of living organisms taking place in an artificially created universe," he announced.

A digital universe -- whether 5 kilobytes or the entire Internet -- consists of two species of bits: differences in space, and differences in time. Digital computers translate between these two forms of information -- structure and sequence -- according to definite rules. Bits that are embodied as structure (varying in space, invariant across time) we perceive as memory, and bits that are embodied as sequence (varying in time, invariant across space) we perceive as code. Gates are the intersections where bits span both worlds at the moments of transition from one instant to the next.

The term bit (the contraction, by 40 bits, of "binary digit") was coined by statistician John W. Tukey shortly after he joined von Neumann's project in November of 1945. The existence of a fundamental unit of communicable information, representing a single distinction between two alternatives, was defined rigorously by information theorist Claude Shannon in his then-secret Mathematical Theory of Cryptography of 1945, expanded into his Mathematical Theory of Communication of 1948. "Any difference that makes a difference" is how cybernetician Gregory Bateson translated Shannon's definition into informal terms. To a digital computer, the only difference that makes a difference is the difference between a zero and a one.

That two symbols were sufficient for encoding all communication had been established by Francis Bacon in 1623. "The transposition of two Letters by five placeings will be sufficient for 32 Differences [and] by this Art a way is opened, whereby a man may expresse and signifie the intentions of his minde, at any distance of place, by objects ... capable of a twofold difference onely," he wrote, before giving examples of how such binary coding could be conveyed at the speed of paper, the speed of sound, or the speed of light.
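
In modern terms, Bacon's "five placeings" are a 5-bit code: enough to tell apart 2⁵ = 32 letters. A minimal Python sketch makes the arithmetic concrete (the A = 0 through Z = 25 numbering is a modern convenience, not Bacon's own bi-literal alphabet):

    # Bacon's insight in modern dress: five two-valued "placeings" per
    # letter. The A=0..Z=25 numbering is illustrative, not Bacon's own.
    def encode(message):
        """Render each letter as five binary placings."""
        return [format(ord(c) - ord('A'), '05b')
                for c in message.upper() if c.isalpha()]

    def decode(placings):
        """Recover the letters from their 5-bit codes."""
        return ''.join(chr(int(p, 2) + ord('A')) for p in placings)

    bits = encode('Bacon')
    print(bits)           # ['00001', '00000', '00010', '01110', '01101']
    print(decode(bits))   # BACON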

That zero and one were sufficient for logic as well as arithmetic was established by Gottfried Wilhelm Leibniz in 1679, following the lead given by Thomas Hobbes in his Computation, or Logique of 1656. "By Ratiocination, I mean computation," Hobbes had announced. "Now to compute, is either to collect the sum of many things that are added together, or to know what remains when one thing is taken out of another. Ratiocination, therefore is the same with Addition or Substraction; and if any man adde Multiplication and Division, I will not be against it, seeing ... that all Ratiocination is comprehended in these two operations of the minde." The new computer, for all its powers, was nothing more than a very fast adding machine, with a memory of 40,960 bits.
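
Hobbes's reduction is easy to take literally. A toy sketch, for non-negative integers, in which multiplication and division are nothing but repeated addition and subtraction:

    # Hobbes, taken at his word: all ratiocination from two operations.
    def multiply(a, b):
        total = 0
        for _ in range(b):        # "collect the sum of many things"
            total = total + a
        return total

    def divide(a, b):
        quotient = 0
        while a >= b:             # "know what remains when one thing
            a = a - b             #  is taken out of another"
            quotient = quotient + 1
        return quotient, a        # quotient and remainder

    print(multiply(6, 7))         # 42
    print(divide(45, 7))          # (6, 3)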


In March of 1953 there were 53 kilobytes of high-speed random-access memory on planet Earth. Five kilobytes were at the end of Olden Lane, 32 kilobytes were divided among the eight completed clones of the Institute for Advanced Study's computer, and 16 kilobytes were unevenly distributed across a half dozen other machines. Data, and the few rudimentary programs that existed, were exchanged at the speed of punched cards and paper tape. Each island in the new archipelago constituted a universe unto itself.

In 1936, logician Alan Turing had formalized the powers (and limitations) of digital computers by giving a precise description of a class of devices (including an obedient human being) that could read, write, remember, and erase marks on an unbounded supply of tape. These "Turing machines" were able to translate, in both directions, between bits embodied as structure (in space) and bits encoded as sequences (in time). Turing then demonstrated the existence of a Universal Computing Machine that, given sufficient time, sufficient tape, and a precise description, could emulate the behavior of any other computing machine. The results are independent of whether the instructions are executed by tennis balls or electrons, and whether the memory is stored in semiconductors or on paper tape. "Being digital should be of more interest than being electronic," Turing pointed out.
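
The idea fits in a few lines of Python (an illustrative toy, not Turing's own formalism): a table of (state, symbol) to (write, move, next state) rules, applied to an unbounded tape, here programmed to add one to a binary number:

    # A minimal Turing machine: rules applied to an unbounded tape.
    def run(program, tape, state, head):
        tape = dict(enumerate(tape))          # unbounded tape of marks
        while state != 'HALT':
            symbol = tape.get(head, '_')      # '_' is the blank mark
            write, move, state = program[(state, symbol)]
            tape[head] = write                # write (or erase) a mark
            head += {'L': -1, 'R': 1}[move]
        return ''.join(tape[i] for i in sorted(tape)).strip('_')

    increment = {                             # binary +1, head at right
        ('carry', '1'): ('0', 'L', 'carry'),  # 1 plus carry: 0, carry on
        ('carry', '0'): ('1', 'L', 'HALT'),   # absorb the carry
        ('carry', '_'): ('1', 'L', 'HALT'),   # grow the number leftward
    }
    print(run(increment, '1011', 'carry', 3)) # 1100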

Von Neumann set out to build a Universal Turing Machine that would operate at electronic speeds. At its core was a 32-by-32-by-40-bit matrix of high-speed random-access memory -- the nucleus of all things digital ever since. "Random access" meant that all individual memory locations -- collectively constituting the machine's internal "state of mind" -- were equally accessible at any time. "High speed" meant that the memory was accessible at the speed of light, not the speed of sound. It was the removal of this constraint that unleashed the powers of Turing's otherwise impractical Universal Machine.

Electronic components were widely available in 1945, but digital behavior was the exception to the rule. Images were televised by scanning them into lines, not breaking them into bits. Radar delivered an analog display of echoes returned by the continuous sweep of a microwave beam. Hi-fi systems filled postwar living rooms with the warmth of analog recordings pressed into vinyl, free of any losses introduced by digital approximation. Digital technologies -- Teletype, Morse code, punched card accounting machines -- were perceived as antiquated, low-fidelity, and slow. Analog ruled the world.

The IAS group achieved a fully electronic random-access memory by adapting analog cathode-ray oscilloscope tubes -- evacuated glass envelopes about the size and shape of a champagne bottle, but with walls as thin as a champagne flute's. The wide end of each tube formed a circular screen with a fluorescent internal coating, and at the narrow end was a high-voltage gun emitting a stream of electrons whose aim could be deflected by a two-axis electromagnetic field. The cathode-ray tube (CRT) was a form of analog computer: varying the voltages to the deflection coils varied the path traced by the electron beam. The CRT, especially in its incarnation as an oscilloscope, could be used to add, subtract, multiply, and divide signals -- the results being displayed directly as a function of the amplitude of the deflection and its frequency in time. From these analog beginnings, the digital universe took form.

Applying what they had learned in the radar, cryptographic, and antiaircraft fire-control business during the war, von Neumann's engineers took pulse-coded control of the deflection circuits and partitioned the face of the tube into a 32-by-32 array of numerically addressable locations that could be individually targeted by the electron beam. Because the resulting electric charge lingered on the coated glass surface for a fraction of a second and could be periodically refreshed, each 5-inch-diameter tube could store 1,024 bits of information, with the state of any specified location accessible at any time. The transition from analog to digital had begun.

The IAS computer incorporated forty cathode-ray memory tubes, with memory addresses assigned as if a desk clerk were handing out similar room numbers to forty guests at a time in a forty-floor hotel. Codes proliferated within this universe by taking advantage of the architectural principle that a pair of 5-bit coordinates (2⁵ = 32) uniquely identified one of 1,024 memory locations containing a string (or "word") of 40 bits. In 24 microseconds, any specified 40-bit string of code could be retrieved. These 40 bits could include not only data (numbers that mean things) but also executable instructions (numbers that do things) -- including instructions to modify the existing instructions, or transfer control to another location and follow new instructions from there.
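
The addressing arithmetic can be sketched in a few lines of Python (an illustration of the scheme described above, not of the machine's circuitry):

    # Two 5-bit coordinates pick one of 32 * 32 = 1,024 spots; the
    # forty tubes each hold one bit of the 40-bit word stored there --
    # the same room number on all forty floors of the hotel.
    x, y = 0b10110, 0b00111        # two 5-bit deflection coordinates
    address = (x << 5) | y         # one 10-bit address, 0..1023

    memory = [0] * 1024            # 1,024 words of 40 bits each
    memory[address] = 1 << 39      # set the top bit of one word

    print(address)                          # 711
    print(format(memory[address], '040b'))  # the full 40-bit word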

Since a 10-bit order code, combined with 10 bits specifying a memory address, returned a string of 40 bits, the result was a chain reaction analogous to the two-for-one fission of neutrons within the core of an atomic bomb. All hell broke loose as a result. Random-access memory gave the world of machines access to the powers of numbers -- and gave the world of numbers access to the powers of machines.
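
The two-for-one fetch is simple to sketch, using the 10-bit-plus-10-bit split given above (illustrative Python, not the IAS order code itself):

    # One 40-bit word yields two 20-bit orders, each a 10-bit order
    # code plus a 10-bit address -- two instructions per fetch.
    def unpack(word):
        orders = (word >> 20, word & 0xFFFFF)          # two 20-bit halves
        return [(o >> 10, o & 0x3FF) for o in orders]  # (code, address)

    word = (5 << 30) | (44 << 20) | (6 << 10) | 45
    print(unpack(word))   # [(5, 44), (6, 45)]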

* * *

The computer building's plain concrete-block core had been paid for jointly by the U.S. Army's Ordnance Department and the U.S. Atomic Energy Commission (AEC). To reconcile the terms of the government contract, specifying a temporary structure, with the sentiments of the neighboring community, the Institute for Advanced Study had paid the additional $9,000 (equivalent to about $100,000 today) to finish the building with a brick veneer.

There were close ties between the IAS and the AEC. J. Robert Oppenheimer was director of the IAS and chairman of the General Advisory Committee of the AEC. Lewis Strauss was chairman of the AEC and president of the IAS Board of Trustees. The freewheeling mix of science and weaponeering that had thrived at Los Alamos during the war had been transplanted to Princeton under the sponsorship of the AEC. "The Army contract provides for general supervision by the Ballistic Research Laboratory of the Army," it was noted on November 1, 1949, "whereas the AEC provides for supervision by von Neumann." As long as the computer was available for weapons calculations, von Neumann could spend the remaining machine time as he pleased.

In 1953, Robert Oppenheimer and Lewis Strauss -- who had engineered Oppenheimer's appointment as director of the Institute in 1947, and would turn against him in 1954 -- were still on friendly terms. "There is a case of Chateau Lascombes waiting for you with my compliments at Sherry Wine & Spirits Co., 679 Madison Avenue (near 61st Street)," Strauss informed Oppenheimer on April 10, 1953. "I hope you and Kitty will like it."

"We picked up the wine two days ago, and opened a bottle that night," Oppenheimer replied on April 22. "It was very good; and now Kitty and I can thank you, not merely for your kindness, but for the great pleasure that you have made us." Robert and Kitty had drunk from the poisoned chalice. One year later, the man who had done so much to deliver the powers of atomic energy into the hands of the U.S. government, but had then turned against his masters to oppose the development of the hydrogen bomb, would be stripped of his security clearances after a dramatic hearing before the Atomic Energy Commission's Personnel Security Board.

While the computer was still under construction, a small team from Los Alamos, led by Nicholas Metropolis and Stanley Frankel, quietly took up residence at the Institute. There were two separate classes of membership at the IAS: permanent members, who were appointed for life by a decision of the faculty as a whole, and visiting members, who were invited by the individual schools, usually for one year or less. Metropolis and Frankel did not belong to either group and mysteriously just showed up. "All I was told was that what Metropolis came out for was to calculate the feasibility of a fusion bomb," remembers Jack Rosenberg, an engineer who had designed, built, and installed a hi-fi audio system in Albert Einstein's house for his seventieth birthday in 1949, using some of the computer project's spare vacuum tubes and other parts. "That's all I knew. And then I felt dirty. And Einstein said 'that's exactly what I thought they were going to use it for.' He was way ahead."

The new machine was christened MANIAC (Mathematical and Numerical Integrator and Computer) and put to its first test, during the summer of 1951, with a thermonuclear calculation that ran for sixty days nonstop. The results were confirmed by two huge explosions in the South Pacific: Ivy Mike, yielding the equivalent of 10.4 million tons of TNT at Enewetak on November 1, 1952, and Castle Bravo, yielding 15 megatons at Bikini on February 28, 1954.

The year 1953 was one of frenzied preparations in between. Of the eleven nuclear tests conducted at the Nevada Test Site in 1953, yielding a total of 252 kilotons, most were concerned not with producing large, spectacular explosions, but with understanding how the effects of more modest nuclear explosions could be tailored to trigger a thermonuclear reaction, resulting in a deliverable hydrogen bomb.

Ivy Mike, fueled by 82 tons of liquid deuterium, cooled to minus 250 degrees Celsius in a tank the size of a railroad car, demonstrated a proof of principle, whereas Castle Bravo, fueled by solid lithium deuteride, represented a deployable weapon that could be delivered, in hours, by a B-52. It was von Neumann, in early 1953, who pointed out to the air force that rockets were getting larger, while hydrogen bombs were getting smaller. Delivery in minutes would be next.

The Americans had smaller bombs, but the Russians had larger rockets. Plotting the increasing size of rockets against the decreasing size of warheads, von Neumann showed that the intersection resulting in an intercontinental ballistic missile -- a possibility he referred to as "nuclear weapons in their expected most vicious form" -- might occur in the Soviet Union first. The air force, pushed by Trevor Gardner and Bernard Schriever, formed a Strategic Missiles Evaluation Committee chaired by von Neumann, and the Atlas ICBM program, which had been limping along since 1946, was off the ground. The year 1953 was the first one in which more than $1 million was spent on guided missile development by the United States. "Guided" did not imply the precision we take for granted now. "Once it was launched, all that we would know is what city it was going to hit," von Neumann answered the vice president in 1955.

Numerical simulations were essential to the design of weapons that were, as Oppenheimer put it, "singularly proof against any form of experimental approach." When Nils Barricelli arrived in Princeton in 1953, one large thermonuclear calculation had just been completed, and another was in the works. The computer was usually turned over to the Los Alamos group, led by Foster and Cerda Evans, overnight. It was agreed, on March 20, that "during the running of the Evans problem there would be no objection to using some time on Saturday and Sunday instead of operating from midnight to 8:00 a.m." Barricelli had to squeeze his numerical universe into existence between bomb calculations, taking whatever late-night and early morning hours were left.

During the night of March 3, 1953, as Barricelli's numerical organisms were released into the computational wilderness for the first time, Joseph Stalin was sinking into a coma in Moscow, following a stroke. He died two days later -- five months short of witnessing the first Soviet hydrogen bomb test at Semipalatinsk. No one knew who or what would follow Stalin, but Lavrentiy Beria, director of the NKVD secret police and supervisor of the Soviet nuclear weapons program, was the heir apparent, and the U.S. Atomic Energy Commission made it their business to fear the worst. After Barricelli's "Symbiosis Problem" ran without misadventure overnight, the machine log notes "over to blast wave" on the morning of March 4. Later in the day the log simply reads "over to" followed by a pencil sketch of a mushroom cloud.

Three technological revolutions dawned in 1953: thermonuclear weapons, stored-program computers, and the elucidation of how life stores its own instructions as strings of DNA. On April 2, James Watson and Francis Crick submitted "A Structure for Deoxyribose Nucleic Acid" to Nature, noting that the double-helical structure "suggests a possible copying mechanism for the genetic material." They hinted at the two-bits-per-base-pair coding whereby living cells read, write, store, and replicate genetic information as sequences of nucleotides we identify as A, T, G, and C. "If an adenine forms one member of a pair, on either chain, then on these assumptions the other member must be thymine; similarly for guanine and cytosine," they explained. "If only specific pairs of bases can be formed, it follows that if the sequence of bases on one chain is given, then the sequence on the other chain is automatically determined."
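
The pairing rule translates directly into code. A minimal sketch (the one-letter string representation is a modern convenience):

    # Watson and Crick's rule: the sequence on one chain automatically
    # determines the sequence on the other.
    PAIR = {'A': 'T', 'T': 'A', 'G': 'C', 'C': 'G'}

    def other_chain(sequence):
        return ''.join(PAIR[base] for base in sequence)

    strand = 'ATGGCAT'
    print(other_chain(strand))                         # TACCGTA
    assert other_chain(other_chain(strand)) == strand  # pairing inverts itself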

The mechanism of translation between sequence and structure in biology and the mechanism of translation between sequence and structure in technology were set on a collision course. Biological organisms had learned to survive in a noisy, analog environment by repeating themselves, once a generation, through a digital, error-correcting phase, the same way repeater stations are used to convey intelligible messages over submarine cables where noise is being introduced. The transition from digital once a generation to digital all the time began in 1953.
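
The advantage of that digital phase shows up in a toy simulation (the Gaussian noise model and 0.5 threshold here are arbitrary assumptions, not anything from the source):

    # An analog value copied with noise wanders without bound; a value
    # snapped back to 0 or 1 each "generation" is regenerated,
    # repeater-style.
    import random

    analog = digital = 1.0
    for generation in range(100):
        noise = random.gauss(0, 0.1)
        analog = analog + noise                          # errors accumulate
        digital = 1.0 if digital + noise > 0.5 else 0.0  # error corrected

    print(round(analog, 2), digital)  # analog has wandered; digital is
                                      # almost surely still 1.0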

The race was on to begin decoding living processes from the top down. And with the seeding of an empty digital universe with self-modifying instructions, we took the first steps toward the encoding of living processes from the bottom up. "Just because the special conditions prevailing on this earth seem to favor the forms of life which are based on organo-chemical compounds, this is no proof that it is not possible to build up other forms of life on an entirely different basis," Barricelli explained. The new computer was assigned two problems: how to destroy life as we know it, and how to create life of unknown forms.

What began as an isolated 5-kilobyte matrix is now expanding by over two trillion transistors per second (a measure of the growth in processing and memory) and five trillion bits of storage capacity per second (a measure of the growth in code). Yet we still face the same questions that were asked in 1953. Turing's question was what it would take for machines to begin to think. Von Neumann's question was what it would take for machines to begin to reproduce.

When the Institute for Advanced Study agreed, against all objections, to allow von Neumann and his group to build a computer, the concern was that the refuge of the mathematicians would be disturbed by the presence of engineers. No one imagined the extent to which, on the contrary, the symbolic logic that had been the preserve of the mathematicians would unleash the powers of coded sequences upon the world. "In those days we were all so busy doing what we were doing we didn't think very much about this enormous explosion that might happen," says Willis Ware.

Was the explosion an accident or deliberately set? "The military wanted computers," explains Harris Mayer, the Los Alamos physicist who was working with both John von Neumann and Edward Teller at the time. "The military had the need and they had the money but they didn't have the genius. And Johnny von Neumann was the genius. As soon as he recognized that we needed a computer to do the calculations for the H-bomb, I think Johnny had all of this in his mind."



Excerpted with permission from Turing's Cathedral: Origins of the Digital Universe (Pantheon, 2012).

Images: Alan Richards, photographer. Courtesy of the Shelby White and Leon Levy Archives Center, Institute for Advanced Study, Princeton, NJ, USA
