Are America’s nuclear systems so old they’re un-hackable?
As the Cold War drew to a close, a surprising contender emerged as the third-largest nuclear power on earth: Ukraine. The country was home to some 5,000 nuclear weapons, placed there by Moscow when Ukraine was still part of the Soviet Union. Kyiv sent the weapons back to Russia in exchange for security guarantees from the U.S. and Britain and a promise from Moscow that it would respect Ukraine’s sovereignty.
Then, in February, Russian President Vladimir Putin invaded.
The nuclear option, which many thought had been largely removed from the table, was one of the first sabers Putin chose to rattle when he announced that Russian troops were moving into Ukraine. He reminded the world that not only did Russia possess nuclear weapons, but it was prepared to use them. Anyone who “tries to stand in our way,” he said, will face consequences “such as you have never seen in your entire history.”
The threat raised an uncomfortable question: After decades of pursuing disarmament talks and assuming nuclear confrontation was a bridge too far, was the United States ready for the ultimate confrontation with Russia?
Right up until three years ago, U.S. nuclear systems were using eight-inch floppy disks in an IBM Series/1 computer first introduced in 1976. It was not connected to the internet and required spare parts often sourced from eBay. Some analysts think America’s slow walk toward modernizing its nuclear systems may turn out to have been a canny strategy: because the systems are so old, they are practically un-hackable.
“There is a truism about computers, which is that when we have a computer, we always want it to do more,” said Herb Lin, a professor at Stanford University and author of a new book called Cyber Threats and Nuclear Weapons, which looks at the risk of cyber attacks across the entire nuclear enterprise. He says the “more” problem inevitably introduces vulnerability into the system, and defense officials have to think carefully about how to modernize.
“If you'll grant the point that the more you want a computer system to do, the more complex the system is that you have to build, then you take the second step and you realize that complexity is the enemy of security,” he says. And that, Lin maintains, is where things start to go wrong.
Run the probabilities, he says, and there’s a chance that one of those many complex components could be vulnerable to a hack in a way no one had considered before.
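To make that back-of-the-envelope reasoning concrete, here is a minimal sketch in Python with invented numbers (they are not drawn from Lin’s analysis): assume each component independently has a one-in-a-thousand chance of harboring an exploitable flaw, and watch how quickly the odds of at least one flaw climb as components are added.

```python
# Illustrative arithmetic only: the per-component figure is hypothetical.
# If each of N independent components has a small probability p of containing
# an exploitable flaw, the chance that at least one flaw exists is
# 1 - (1 - p)**N, which grows quickly as N grows.

p = 0.001  # assumed per-component flaw probability (made up for illustration)

for n in (10, 100, 1_000, 10_000):
    chance = 1 - (1 - p) ** n
    print(f"{n:>6} components -> {chance:.1%} chance of at least one flaw")
```

Under these assumptions, a system with 10 components carries roughly a 1% chance of hiding a flaw, while one with 10,000 components is all but certain to contain at least one.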
Stuxnet
The cautionary tale is Stuxnet, the computer worm that found its way into the Natanz uranium enrichment plant in Iran in 2009 and 2010. Stuxnet – which appears to have been the brainchild of U.S. and Israeli intelligence services – was able to take control of the centrifuges used to enrich uranium inside the plant and, without anyone noticing, make them spin so fast they broke down.
For a long time, the cause of the centrifuge failures was a complete mystery. Scientists were fired – officials thought they were sleeping on the job or failing to maintain the systems properly. It never occurred to anyone – until much later – that a cyber weapon could find its way into a system that was air-gapped from the internet and so closely watched. Stuxnet had probably been in their systems for a year before they even discovered it.
The thinking has been that America’s geriatric nuclear weapons systems may actually provide an inoculation from this kind of attack. “Many of the systems right now are so old that there's nobody or few, very few, people who know how to get at them,” Lin says. “So right now the current assessment is that the nuclear command and control system anyway…is mostly robust against a cyber threat.”
Hayat Alvi, a professor at the U.S. Naval War College, studies these kinds of nuclear weapons issues. (She spoke to The Record in her personal capacity.) She says she has a mantra when it comes to our nuclear weapons systems: “If it’s not broken, it doesn’t need to be fixed.”
Alvi says the calculus is pretty simple. “Why try to change something that has worked for decades?” she said. “Assuming you change to upgrade them to modern technology, you are actually inviting more risks and potential threats and sabotage into the system.”
While officials have tinkered at the edges of the nuclear weapons systems, John Lauder, who used to direct the CIA’s Nonproliferation Center, says “most of the systems we’re using are from the ‘70s and ‘80s.”
He said there has been a general sense among people who worked in arms control that “we had put together a set of agreements that would keep peace and stability.” As a result, modernizing nuclear weapons systems seemed less important, since many assumed the weapons would eventually be phased out. “Ukraine,” he said, “was a wake-up call.”
‘One horrifying moment…’
In 1979, about three years after the U.S. nuclear weapons program adopted the state-of-the-art IBM Series/1 computer, William Perry, then a top Pentagon official, got a phone call. The caller identified himself as the watch officer.
“The first thing he said to me was that his computers were showing 200 nuclear missiles on the way from the Soviet Union to the United States,” Perry, who would later go on to be Defense Secretary in the Clinton administration, recounted in his podcast, At the Brink. “And for one horrifying moment, I believed we were about to witness the end of civilization.”
As Perry weighed the possibilities, he concluded this had to be some kind of mistake. There was nothing going on in the world at the time that would have prompted the Soviet Union to suddenly strike. Perry asked the watch officer to find out what had gone wrong with the systems so he could explain what had happened to the president in the morning.
It turned out someone had accidentally put a training tape into the computer instead of an operational one. As a result, what the computer saw was a simulation of an actual attack. It looked real because it was designed to look real. Perry says that night fundamentally changed the way he thought about nuclear weapons. He came to the conclusion that simple human error could indeed lead to nuclear war.
“It's changed forever my way of thinking about nuclear weapons,” he said. “Up until this, a false alarm, an attack by mistake, starting a nuclear war by mistake was a theoretical issue.”
Until it wasn’t.
Modern ‘duck and cover’
In 2018, Jess Franklin, a military wife and stay-at-home mom stationed in Hawaii, was sleeping with her toddler on one side of her and her preschooler on the other. “I remember very clearly that it was the coziest I've ever felt,” she said, “and then this text comes in and I just kind of stared at it in disbelief for a little while.”
The text said that U.S. Pacific Command had detected a missile threat to Hawaii.
“I just kind of stared at it and thought, well, I mean, if that's true, really nothing you can do,” she said. “So I could just go back to sleep. Cause if I'm going to die in a few minutes, I'd rather be all snuggled up with my kids.”
Eventually, she grabbed some tablets and books and took the children to a bathroom downstairs to get away from the windows. “I mean, like with a tornado, you want to just get where a blast isn't going to send shards of glass flying at you,” she said, adding that the alert didn’t last that long on base. They got the “all clear” over loudspeakers a short time later.
People outside the base on the island had to wait 38 minutes before they found out they weren’t in danger.
“I do wake up in the middle of the night, worried about these things, and the Ukraine crisis has added to that worry,” Lauder, the former CIA nonproliferation official, told me. “The Ukraine crisis, the competition with China, concerns about the other states… continuing to develop their nuclear capabilities, and the mechanisms that deliver nuclear weapons kind of remind us all that we're still in a world where deterrence matters.”
And while he agrees that modernizing our systems does present some risks – cyber threats among them – he says the systems are past their design life and new technologies could go a long way toward preventing mistakes like the one that happened in Hawaii. “We're in this world of artificial intelligence and machine learning, and we can use machines to help us make decisions as humans,” he said.
Thirty years ago, Ukraine became the poster child for arms control when it sent the last of its nuclear weapons back to Russia. Now, Ukrainians are wondering aloud whether keeping some of those warheads might have made Putin think twice about invading. More broadly, disarmament experts are asking whether Putin’s invasion will convince other nations that they need nuclear weapons of their own to deter neighbors from doing what Russia did.
Sean Powers and Will Jarvis contributed reporting to this story.
Dina Temple-Raston is the Host and Managing Editor of the Click Here podcast as well as a senior correspondent at Recorded Future News. She previously served on NPR’s Investigations team, covering breaking news, national security, technology, and social justice, and she created and hosted the award-winning Audible podcast “What Were You Thinking.”