General · 31st October 2013
On January 23, 1961, an American Strategic Air Command B-52 bomber, on a routine flight along the northeast coast of the US, went into a fatal tailspin after losing a wing over Goldsboro, North Carolina. Military aircraft sometimes crash but this event was special because of the two Mark 39 Mod 2 hydrogen bombs on board. Each had the explosive power of 4 million tons of TNT, about 260 times more powerful than the atomic blast that instantly killed an estimated 135,000 people at Hiroshima in 1945.
But the 1961 event was special for another reason. As the aircraft broke apart during its descent, both bombs tore loose from their moorings in the bay of the bomber. One was found in a meadow off Big Daddy's Road. The other, its opened parachute draped over a tree, was easily located in a field close to the nearby town of Faro. It was the Faro bomb that sent a chill through the nuclear experts.
Somehow, while being ejected from the disintegrating aircraft, the safety pin on this bomb was pulled free, an event that simulated the manual removal of the pin as the first stage in the activation sequence that was supposed to culminate in a nuclear explosion. The bomb thought it was falling toward a real target. The batteries began supplying power to the operating systems. The arming rods were pulled. All the timing switches were energized. The parachute deployed. The last step needed to detonate the bomb was the activation of a single device, a “pre-arming ready-safe switch”, that moved the final explosive process from “off” to “on”. This 28-volt switch needed to be activated by a signal from the B-52's cockpit — fortunately, a signal that did not come. However, this safety device was notoriously unreliable. On at least 30 previously recorded occasions it had malfunctioned to the “on” position. If the switch had failed on this occasion, if a single circuit had inadvertently closed, a 4 megaton hydrogen bomb would have exploded over Goldsboro, North Carolina.
Goldsboro would have disappeared in a flash of intense light and heat. The detonation would have incinerated a portion of North Carolina. Lethal radioactive fallout would have killed millions and threatened additional millions in Washington D.C., Baltimore, Philadelphia, and as far away as New York City. America and the world were as close to a nuclear catastrophe as an unreliable switch. But this event was just one of “at least 700 'significant' accidents and incidents involving 1,250 nuclear weapons recorded between 1950 and 1968 alone” (Macedonian International News Agency, Sept. 21/13).
This B-52 cautionary tale parallels another that occurred during the October, 1962, Cuban Missile Crisis, when a USSR submarine commander refused a captain's and crew's request to fire a nuclear-tipped torpedo at US warships, an event that could have triggered a nuclear war. And it's comparable to a situation in January, 1995, when Russian President Boris Yeltsin — who, fortunately, was not drunk at the time — refused to authorize a missile attack on the United States because he didn't believe the US would attack them — the incoming American missile turned out to be a Norwegian scientific rocket sampling the electrical charges in the aurora borealis.
These events should serve as reminders of how frequently our technology brings us to the edge of the unthinkable. The tenuous difference between ingenuity and disaster can be little more than an unreliable switch that happens to work, an imaginative leap that initiates an act of defiant bravery, a lucid hunch that refuses to believe faulty evidence. Our technology has brought us to the place where safety is often as precarious as good luck.
Margaret Atwood, in a CBC interview about her latest book, MaddAddam (Sept. 30/13), is well aware of this precariousness. “Human intention,” she said, “is scary.” Her twin fears are climate change and genetic engineering. Each is a high-risk activity. In the first case, using the atmosphere as an industrial sewer for billions of tonnes of carbon dioxide has disturbing effects on weather, species, agriculture, sea levels, ocean acidification, ecological integrity and the stable social structures we need for economic order and political civility. As for the genetic modification of organisms, inventing artificial life forms and mixing plants and animals into wholly unnatural combinations are also “scary” enterprises if we give a moment of thought to the huge consequences of possible miscalculations.
We are assured, of course, by flushes of undue optimism and surges of wanton pride, that our adventurous behaviour is manageable risk. All the necessary precautions have been taken. The ubiquitous chemical concoctions that inhabit our daily lives are deemed harmless. Engineered safeguards will avoid the environmental hazards of mining, oil drilling and fracking. Forests will be wisely managed. Nanotechnology can be trusted. Banks will be adequately regulated. Corporations will serve the public good. Salmon farms will not import foreign viruses potent enough to collapse wild marine ecologies. Things are under control — until they aren't.
Granted, the very act of living is a succession of risks. But the new risks in our technological world are growing disproportionately larger than the benefits. The consequences of mistakes are magnified by the rising influence of our human presence. We've adjusted so well to living with escalating risk that we don't expect to be blasted out of existence by an errant hydrogen bomb.