Louis Ridenour’s “Pilot Lights of the Apocalypse” (1945) November 20, 2012 · Posted by Will Thomas in Uncategorized.
Tags: Edward Bowles, Louis Ridenour
One of the persistent anxieties of the nuclear age has been the possibility of a catastrophic war starting accidentally. The scale of destruction that can be delivered in a short space of time means that any defense and counter-attack would have to be mounted so quickly that they could be triggered by false evidence of an attack. In fact, a Soviet colonel has just won an award from Germany for not responding when, in 1983, his early-warning systems indicated the launch of five missiles from United States territory. (The incident first became public in 1998.) That warning had been triggered by a satellite mistaking sunlight reflected off high-altitude clouds for missile launches.
There have, of course, been other such incidents in fact and fiction — of the latter, the films Dr. Strangelove and Fail-Safe (both 1964) are canonical. When I was researching my dissertation, though, I was surprised to learn that the notion of accidental nuclear war extends all the way back to 1945. While working in the Library of Congress archives, I stumbled on a November 1945 draft of a darkly humorous playlet written by physicist Louis Ridenour called “Pilot Lights of the Apocalypse”. The scenario is almost exactly the same as later incidents, real and imagined. In this case, San Francisco is struck by an earthquake, which triggers a false alarm in an operations room buried beneath the city, prompting a “hysterical” colonel to launch a counter-strike against Denmark, which then launches a strike against Sweden, and so on into oblivion.
The playlet was published in Fortune, of all places, in January 1946. It’s not unknown today, but it’s not widely known either. Thanks to Google, you can read a copy of it online from the December 26, 1945 Milwaukee Journal (Merry Christmas, Milwaukee). Discussion below the fold.
The most interesting thing to me about Ridenour’s playlet is that he had not worked on the atomic bomb. He had instead been a senior figure in radar development at the MIT Rad Lab, and was seconded to Edward Bowles’s office in the Pentagon for much of the war.* Bowles was an MIT electrical engineer who found himself placed directly under Secretary of War Henry Stimson as chief “expert consultant”. The job of Stimson’s expert consultants was to go to the combat theaters and liaise between the Pentagon and the field commands: partly to help implement new technologies (especially radar) in the most useful ways, and partly to determine field requirements for new equipment, so that available equipment could be deployed to the most appropriate places. (Field studies also provided equipment designers with useful information about what specifications would prove most useful, but Stimson’s office had no direct authority over R&D priorities.)
The upshot of all of this is that Ridenour had a keen appreciation of the difficulties of using radar to do its intended job, including scrambling fighters to defend against an enemy attack. These included: 1) getting the proper equipment; 2) getting it to work right; 3) knowing what the radar is showing you; 4) knowing the ways it can lead you astray; 5) making a decision about what to do, given knowledge of (3) and (4); and 6) designing procedures (e.g. interception tactics) that will allow that decision to be carried out effectively. For all these steps to take place required intricate coordination of the human and technical elements of the system.
During and after the war, there was a profound sensitivity among those in the know concerning the likelihood of failure in the operation of these complicated systems. Just the other day, the 1970 film Tora! Tora! Tora!, essentially a reconstruction of the Japanese attack on Pearl Harbor, was on TV here. Three decades after the event, the film retained that sensitivity of the World War II era, fussing over procedural failures that might have alerted the American military to the attack before it began (as well as similar detail-oriented questions, such as the political implications of the Japanese declaring war slightly after the attack, rather than slightly before, as planned).
Ridenour manages to richly evoke these themes in a very small amount of space. Hanging on the wall of the operations room, buried beneath San Francisco, is a “framed motto: ‘Remember Pearl Harbor’”. A general explains the room’s procedures to a visiting President of the United States: “Well, the defense has to move quickly, or it’s no good at all. They don’t have time to think. But counter-attack — well, counter-attack has to move quickly, too. But we want them to have time to decide what they need to do.”
This decision-making process takes place in a fairly interesting future, in which nuclear weapons have proliferated to nations across the globe, meaning that, if a nuclear attack is detected, a big problem is figuring out who initiated it. The general explains:
An attack might be staged entirely by mines planted inside our borders, so there wouldn’t be any direction connected with it. And then again, we have pretty good information that some other countries besides us have got bombs up above the stratosphere, 800 miles above the earth, going round us in orbits like little moons. We put up 2,000 and we can see about 5,400 on our radar. Any time, somebody can call down that odd 3,400 by radio and send them wherever they want. There’s no telling from trajectory which nation controls those bombs.
The operations room is very much like a wartime operations room: it is an information-sorting facility, but, in this case, the information needed to determine who launched an attack is “mainly political.” The general goes on: “Radar doesn’t do [the men in the counter-attack center] any good. What they need is intelligence, and that’s what comes in all the time, as complete and up to date as we can get it, on the teletypes. In the defense center you saw scientists and technicians. The officers here are political scientists.”
The room also has a board with pilot lights (here meaning indicator lights) representing different cities across the globe, which receive data from multiple transmission stations in each city. Each transmission station has four “unattended radio transmitters”. If one transmitter fails, another backs it up; if all transmitters in a station fail, the pilot light turns yellow, indicating partial destruction of a city. If every station in a city loses all its transmitters, the light turns red, indicating total destruction. Additionally, every defense command has a backup operations room “to guard against accidents”.
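The redundancy scheme the playlet describes is simple enough to sketch in a few lines of code. This is a minimal illustration of the logic as I read it, not anything from Ridenour’s text: the function name, the boolean representation of transmitters, and the `Light` states are all my own invented labels.

```python
from enum import Enum

class Light(Enum):
    GREEN = "city intact"
    YELLOW = "partial destruction"
    RED = "total destruction"

def pilot_light(stations: list[list[bool]]) -> Light:
    """Each city is watched by several transmission stations, each with
    four unattended transmitters (True = still transmitting). A station
    goes silent only when all four of its transmitters have failed."""
    silent = [not any(transmitters) for transmitters in stations]
    if all(silent):
        return Light.RED     # every station silent: total destruction
    if any(silent):
        return Light.YELLOW  # some station silent: partial destruction
    return Light.GREEN       # at least one transmitter alive everywhere
```

The point of the design, as the general presents it, is that a single transmitter failure changes nothing on the board; only the simultaneous failure of all four transmitters in a station registers at all. The earthquake defeats the scheme precisely because it is the rare common cause that silences every station at once.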
On top of all this, there are the people who staff the room. After the President leaves, one of the colonels admonishes another for his over-the-top patriotism, telling him that he talks “like a damn high school kid” and that “good sense and judgment” are needed for their job. And, indeed, immediately thereafter they feel the room shake and San Francisco turns red on their console. The patriotic colonel panics and demands an enemy identification. Denmark, it turns out, has “the highest negative rating in the latest state department digest” due to Danish public distaste for a statue by a San Franciscan sculptor that the US gave to the city of Copenhagen. Global nuclear apocalypse ensues.
What’s really amazing is how many of the features of later nuclear near-misses Ridenour put his finger on: the centrality of information processing systems, the use of remote indicators of nuclear attack, the possibility of false indications, the likelihood that multiple backups could simultaneously fail for an unexpected reason, the likelihood that decisions delinked from central authority for the sake of reliable response could provoke a massive war, and, of course, the crucial role of human judgment as a backstop against catastrophe.
*There are no thorough published accounts of the activities of Bowles’s office, but good descriptions can be found in Daniel Kevles’s The Physicists and Martin Collins’s Cold War Laboratory.