Eric Schlosser (The Bat Segundo Show #515)
Eric Schlosser is most recently the author of Command and Control.
Author: Eric Schlosser
Subjects Discussed: The 1980 nuclear missile accident in Damascus, Arkansas, drug use among the military after the Vietnam War, the Titan II’s continuous deployment even after it was unsafe, Fred Iklé, efforts to point out safety shortfalls in the military, the lack of locks on nuclear weapons, Permissive Action Links, Robert Peurifoy, Curtis LeMay, the real-life inspiration for Dr. Strangelove, Kennedy and the Cuban Missile Crisis, LeMay’s bellicose attitude, 1960s defense culture, alternative perceptions of LeMay, LeMay’s attention to detail, checklists and operating procedures, a giant nuclear arsenal intended as a deterrent, limited vs. total war, Theodore Roosevelt and the Rough Riders, the influence of male attitude on deterrence, what we needed from LeMay during the Cold War, the risks taken by Strategic Air Command officers, recent safety mishaps with U.S. nuclear missile units, responding to speculation that LeMay wanted to start World War III, the history of the Strategic Air Command, why the Command and Control system couldn’t factor in human exhaustion, how the arms race between the United States and the Soviet Union caused unreasonable labor for servicemen, the difficulties of accounting for all nuclear weapons, Robert McNamara’s belief that mad decisions were logical at the time, the 1978 Titan II accident in Rock, Kansas, why there were so many mishaps with Titan II oxidizer, the RFHCO suits (and the astonishing wear-and-tear on this protection, which wasn’t replaced in many cases), blame directed at the workers, how systemic problems contributed to an unrealistic and bureaucratic view of Air Force servicemen, putting men into dangerous systems with defective gear, black electrical tape used to “secure” suits in missile silos, loose arming wires that permitted bombs to drop, an atomic bomb that nearly went off in North Carolina in 1961, the missing atomic bomb still entombed in Nahunta Swamp, the H-bomb accidentally dropped on Spain in 1966, why
America’s military infrastructure still relies on aging B-52 bombers, the 1968 Thule Air Base B-52 crash, the importance of morale within workers who are tending to the most dangerous machines in the world, Kissinger’s efforts to get rid of the Titan II, the Single Integrated Operational Plan, Eisenhower’s deadly arbitration between the Air Force and the Navy, how the Strategic Air Command kept SIOP details away from Kissinger, bureaucratic rivalries, William Odom’s briefing on the SIOP, the interplay between Kennedy and Khrushchev, the lack of a direct line between the United States and the Soviet Union through the Cuban Missile Crisis, the effect of nuclear policy on diplomacy, miscommunication and unreliable back channels, the present nuclear risk in South Asia, the recent terrorist attack in Nairobi’s Westgate Mall, the Peshawar church attack, Edward Snowden’s findings about the United States’s lack of information about Pakistan’s nuclear weapons, conflict between India and Pakistan, why deterrence theory doesn’t apply to religious fanatics, how storage facilities are prime targets for adversaries and how this is a serious problem in Pakistan.
EXCERPT FROM SHOW:
Correspondent: Much of the missile culture that you describe in this book is, to say the least, remarkably unsafe. You have this Arkansas Titan II mishap at Launch Complex 374-7 — the so-called Damascus accident — that forms the backbone of this book. It all hinges on a socket wrench that is dropped down a silo and pierces this fuel tank. And if that isn’t enough, you have oxidizer that permeates around the station, causes 22-year-old David Livingston to die and several men to be severely wounded. You have procedures that aren’t followed. You have the Propellant Transfer System team violating this careful two-man policy, where you have two keys turning at the same time. We often see that in the movies, but this was, in fact, the policy during the mid-20th century and onward to activate the missile. So this leads me to ask. How did the Launch Complex adopt such a far-from-cautious approach to daily maintenance duties? How much of this was systemic? And how much of this was human error?
Schlosser: You know, our military was in pretty rough shape after the Vietnam War. And it’s a forgotten fact now because we have our precision-guided munitions and we go to war with one country after another. And we seem so completely dominant. But after the Vietnam War, there was major underinvestment in the military. There were terrible morale problems. Huge percentages of our troops in the Army came back addicted to heroin. And there were remarkably high levels of illegal drug use even in the Air Force. So in the book, I write about guys smoking pot on the missile site. Launch officers who have responsibility for launching intercontinental ballistic missiles smoking pot. In the book, I go into guys who are busted with LSD and cocaine who had nuclear weapons responsibilities. And that was part of the bigger culture of the ’70s. There was really poor morale. The Titan II missile that’s at the heart of this story was obsolete. It was supposed to be taken out of service in the late 1960s. In 1980, it was still on alert. It was leaking all the time. They didn’t have spare parts. So the accident I write about is precipitated by somebody dropping a socket in a missile silo and the socket pierces the skin of the missile. But in many ways, this weapons system was an accident waiting to happen. And in my criticism of some of the procedures in the 1970s and early 1980s, I really don’t imply a criticism of the ordinary servicemen who worked on these weapons systems. On the contrary. The book is about the incredible heroism of ordinary guys put in custody of nuclear weapons at a remarkably young age. I’m getting old enough now. So when I think of a nineteen, twenty, twenty-one year old person having custody of a thermonuclear weapon, it takes me aback. And those guys again and again in the Cold War had this responsibility. Maybe some of them were smoking pot. 
But most of them were quite serious about not wanting these weapons to go off and about making sure that we were safe from attack from the Soviet Union.
Correspondent: What’s fascinating is why so many young and inexperienced greenhorns would be given such responsibility for these missiles. I mean, that’s what boggles the mind. Especially since there were several other accidents. The B-52s that you have dropping bombs that thankfully didn’t have charged warheads. I mean, this is very serious. This isn’t asking regular people to go ahead and mop up the floor of a hangar. This is our system. And I’m wondering what conditions allowed this to persist for so long, notwithstanding the heroism that you depict in your book.
Schlosser: The accident that is the central narrative of the book — the Titan II accident in Damascus, Arkansas — was unquestionably linked to understaffing, to poorly trained personnel, to a shortage of spare parts, and to an obsolete weapons system. But I write about many other accidents in the book. And in those accidents, they had extremely well-trained personnel. They had the most modern weapons imaginable. The best systems in place imaginable. But one of the big themes of the book is that we’re a lot better at creating complex technologies than we are at controlling them or managing them. And it’s hard to think of a machine that doesn’t mess up. From your toaster oven to your laptop to commercial airliners to commercial nuclear power plants, no matter how sophisticated the people and no matter how well-trained, it’s just beyond fallible, imperfect human beings to create something that’s infallible and perfect. And when you’re talking about nuclear weapons, you’re talking about the most dangerous machines ever invented. So it makes sense that those machines and the complex technological systems that manage them would mess up occasionally. But the consequences of one of those things screwing up are a lot greater than if your toaster oven freezes up and catches on fire in the kitchen. And one of the reasons I wrote the book was, firstly, I thought the story of this missile accident was just unbelievable. And I thought the heroism of the guys who tried to save the missile was unbelievable. But I’m just trying to remind people that these things are out there. There are thousands of nuclear weapons right now that are ready to go. And people my age — I’m 54; I grew up in the Cold War — remember what it was like to live with this dread that there might be an all-out nuclear war any day. But half the people who live in the United States either weren’t born yet or were small children when the Soviet Union vanished. And there’s a historical amnesia.
And most people just don’t realize these weapons are there. Now I’m not trying to create an existential dread in anybody. I’m not trying to create late night anxiety. But this is really important information. And people need to know it.
Correspondent: Well, even in the 1950s, you have this guy named Fred Iklé. He disseminates this RAND report on nuclear weapon safety and he outlines all the motivations that would cause someone to disobey orders and set off a nuclear weapon. His reports were disseminated, as you point out in the book, to the highest levels of the Air Force and the Department of Defense. You’ve got this guy Bob Peurifoy. He points to numerous safety problems as well. Your book, as we have established, documents several incidents in which safety is sacrificed for ease of intercontinental ballistic missile use in retaliation. Why was the top brass so recalcitrant against safety? Why were they so interested in taking shortcuts? I mean, it seems to me that it goes beyond a cultural problem or an institutional problem and into just pure recklessness.
Schlosser: Yeah. And that top secret report by Iklé was really cool to read. I mean, it was looking into the ways that these weapons could go wrong not only mechanically, but I think in one section of the report there’s a catalog of derangement, which is looking at what sort of psychological problems airmen might have that would lead them to deliberately set off a weapon, deliberately steal a weapon. And at the time that Iklé was writing, literally there were no locks on our nuclear weapons. There were no coded switches. So a pilot who wanted to take a weapon could just fly to the target in the Soviet Union, in the Eastern Bloc, or even in the United States and just drop it, if he or she wanted to do that. And Iklé’s report was important in one respect. It really encouraged what you’d think would be a no-brainer, but that nobody had thought of doing: putting locks on the weapons. The locks that were eventually put on them — these coded switches called Permissive Action Links — were effective. But someone who really understood the innards of the weapon could disable one in a few hours. Robert Peurifoy is one of the heroes of the book. He’s a weapons designer at one of the weapons labs: Sandia National Laboratories. He realized in the late 1960s that our nuclear weapons just weren’t safe enough during what is called an “abnormal environment” — I mean, you could argue that the whole history that I write about in the book is an abnormal environment. But at the weapons labs, they refer to abnormal environments as a fire, the submersion of a weapon, a plane crash, a lightning strike.
Correspondent: Amazingly, no acronym.
Schlosser: Right. He realized that our weapons were not safe enough in these abnormal environments. But it took him something like a twenty-year bureaucratic battle to get modern safety devices put in the weapons, which we now have. But you would never build a nuclear weapon today in the United States without these sorts of safety devices. Ultimately, at the heart of the problem is, firstly, a sort of bureaucratic mentality. Someone said recently to me, “If you want to understand how bureaucracies work, it’s better to be wrong than to be alone.” So it takes somebody with some personal courage to stand up, willing to buck the bureaucracy, and be alienated from everyone else as a result. And there has also been, since the very first nuclear weapon was deployed, an internal contradiction, a tension between wanting the weapons to be as safe as possible and wanting them to be as reliable as possible and available for immediate use. If you wanted the weapons to be totally safe, you would never fully assemble them until you were about to use them. But if you want to use them within 45 seconds or a minute, you need to have them fully assembled on top of the missile or inside the bomber and ready to go. So in the book, I talk about this internal tension between always wanting to be able to use the weapon and never wanting it to go off by accident. And again and again, throughout our nuclear history, the military preferred “always” to “never.” And some of it I can understand. These bomber pilots knew that if they got the order to go bomb the Soviet Union, they would be flying into an environment that no pilot had ever flown into before. Missiles would probably have already hit the Soviet Union. They would be flying into this atomic wasteland, in many ways, to drop their bombs, and they were probably on one-way suicide missions to do so. And yet they were willing to do it.
And it would be a bummer for them if they risked everything to take out a Soviet Union airfield and that weapon turned out to be a dud because there were too many safety devices on it. Having said that, I would have voted for greater safety. Robert Peurifoy, the Sandia engineer, felt like we could make the weapons reliable enough and still make them much safer. And eventually his view prevailed. But it took…
Schlosser: Almost two decades. And I think it really hurt his career. He wound up being a vice president at this weapons lab. But he was ostracized. It hurt his career. And I think it was enormously stressful for him to live with this constant knowledge that one of our weapons might detonate and be trying to fight the system to make them safe. And he prevailed. But it was a pretty stressful job.