{"id":123002,"date":"2025-11-07T14:14:09","date_gmt":"2025-11-07T14:14:09","guid":{"rendered":"https:\/\/www.newsbeep.com\/nz\/123002\/"},"modified":"2025-11-07T14:14:09","modified_gmt":"2025-11-07T14:14:09","slug":"will-ai-start-nuclear-war-what-netflix-movie-a-house-of-dynamite-misses","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/nz\/123002\/","title":{"rendered":"Will AI start nuclear war? What Netflix movie A House of Dynamite misses."},"content":{"rendered":"<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1agbrixi lg8ac51 lg8ac50 xkp0cg1\">For as long as AI has existed, humans have had fears around AI and nuclear weapons. And movies are a great example of those fears. Skynet from the Terminator franchise becomes sentient and fires nuclear missiles at America. WOPR from WarGames nearly starts a nuclear war because of a miscommunication. Kathryn Bigelow\u2019s recent release, House of Dynamite, asks if AI is involved in a nuclear missile strike headed for Chicago.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1agbrixi lg8ac51 lg8ac50 xkp0cg1\">AI is already in our nuclear enterprise, Vox\u2019s Josh Keating tells Today, Explained co-host Noel King. \u201cComputers have been part of this <a href=\"https:\/\/ahf.nuclearmuseum.org\/ahf\/history\/computing-and-manhattan-project\/\" rel=\"nofollow noopener\" target=\"_blank\">from the beginning<\/a>,\u201d he says. \u201cSome of the first digital computers ever developed were used during the building of the atomic bomb in the Manhattan Project.\u201d But we don\u2019t know exactly where or how it\u2019s involved.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1agbrixi lg8ac51 lg8ac50 xkp0cg1\">So do we need to worry? Well, maybe, Keating argues. 
But not about AI turning on us.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1agbrixi lg8ac51 lg8ac50 xkp0cg1\">Below is an excerpt of their conversation, edited for length and clarity. There\u2019s much more in the full episode, so listen to Today, Explained wherever you get podcasts, including <a href=\"https:\/\/podcasts.apple.com\/us\/podcast\/today-explained\/id1346207297\" rel=\"nofollow noopener\" target=\"_blank\">Apple Podcasts<\/a>, <a href=\"https:\/\/www.pandora.com\/podcast\/today-explained\/PC:140\" rel=\"nofollow noopener\" target=\"_blank\">Pandora<\/a>, and <a href=\"https:\/\/open.spotify.com\/show\/3pXx5SXzXwJxnf4A5pWN2A\" rel=\"nofollow noopener\" target=\"_blank\">Spotify<\/a>.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1agbrixi lg8ac51 lg8ac50 xkp0cg1\">There\u2019s a part in A House of Dynamite where they\u2019re trying to figure out what happened and whether AI is involved. Are these movies with these fears onto something?<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1agbrixi lg8ac51 lg8ac50 xkp0cg1\">The interesting thing about movies, when it comes to nuclear war, is: This is a kind of war that\u2019s never been fought. There are no sort of veterans of nuclear wars other than the two bombs we dropped on Japan, which is a very different scenario. I think that movies have always played a kind of outsize role in debates over nuclear weapons. You can go back to the \u201960s when the Strategic Air Command actually produced its own rebuttal to <a href=\"https:\/\/www.vox.com\/culture\/2017\/2\/18\/14636354\/movie-of-the-week-strangelove-flynn\" rel=\"nofollow noopener\" target=\"_blank\">Dr. Strangelove<\/a> and <a href=\"https:\/\/www.vox.com\/culture\/23808552\/atomic-bomb-manhattan-strangelove-oppenheimer-pop-culture\" rel=\"nofollow noopener\" target=\"_blank\">Fail Safe<\/a>. 
In the \u201980s, that TV movie <a href=\"https:\/\/slate.com\/culture\/2016\/05\/on-the-americans-the-jennings-just-watched-the-80s-nuclear-war-movie-the-day-after-its-still-terrifying.html\" rel=\"nofollow noopener\" target=\"_blank\">The Day After<\/a> was kind of a galvanizing force for the nuclear freeze movement. President [Ronald] Reagan apparently was very disturbed when he watched it, and it influenced his thinking on arms control with the Soviet Union.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1agbrixi lg8ac51 lg8ac50 xkp0cg1\">On the specific topic I\u2019m looking at, which is AI and nuclear weapons, there\u2019s been a surprising number of movies that have that as the plot. And it comes up a lot in the policy debates over this. I\u2019ve had people who are advocates for integrating AI into the nuclear command system saying, \u201cLook, this isn\u2019t going to be Skynet.\u201d General Anthony Cotton, who\u2019s the current commander of Strategic Command \u2014 the branch of the military responsible for nuclear weapons \u2014 advocates for greater use of AI tools. He referred to the 1983 movie WarGames, saying, \u201cWe\u2019re going to have more AI, but there\u2019s not going to be a WOPR in Strategic Command.\u201d<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1agbrixi lg8ac51 lg8ac50 xkp0cg1\">Where I think [the movies] fall a little short is that the fear tends to be that a superintelligent AI is going to take over our nuclear weapons and use them to wipe us out. For now, that\u2019s a theoretical concern. What I think is the more real concern is that as AI gets into more and more parts of the command and control system, do the human beings in charge of the decisions to use nuclear weapons really understand how the AIs are working? 
And how is it going to affect the way they make these decisions, which could be \u2014 not exaggerating to say \u2014 some of the most important decisions ever made in human history.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1agbrixi lg8ac51 lg8ac50 xkp0cg1\">Do the human beings working on nukes understand the AI?<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1agbrixi lg8ac51 lg8ac50 xkp0cg1\">We don\u2019t know exactly where AI is in the nuclear enterprise. But people will be surprised to know how low-tech the nuclear command and control system really was. Up until 2019, they were using floppy disks for their communication systems. I\u2019m not even talking about the little plastic ones that look like your save icon on Windows. I mean, the old \u201980s bendy ones. They want these systems to be secure from outside cyber interference, so they don\u2019t want everything hooked up to the cloud.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1agbrixi lg8ac51 lg8ac50 xkp0cg1\">But there\u2019s an ongoing multibillion-dollar nuclear modernization process underway, and a big part of it is updating these systems. And multiple commanders of StratCom, including a couple I talked to, said they think AI should be part of this. What they all say is that AI should not be in charge of making the decision as to whether we launch nuclear weapons. They think that AI can just analyze massive amounts of information and do it much faster than people can. 
And if you\u2019ve seen A House of Dynamite, one thing that movie shows really well is how quickly the president and senior advisers are going to have to make some absolutely extraordinary, difficult decisions.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1agbrixi lg8ac51 lg8ac50 xkp0cg1\">What are the big arguments against getting AI and nukes in bed together?<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1agbrixi lg8ac51 lg8ac50 xkp0cg1\">Even the best AI models that we have available today are still prone to error. Another worry is that there could be outside interference with these systems. It could be hacking or a cyberattack, or foreign governments could come up with ways to sort of seed inaccurate information into the model. There has been reporting that Russian propaganda networks are actively trying to seed disinformation into the training data used by Western consumer AI chatbots. And another is just how people interact with these systems. There is a phenomenon that a lot of researchers have pointed out, called automation bias, which is just that people tend to trust the information that computer systems are giving them.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1agbrixi lg8ac51 lg8ac50 xkp0cg1\">There are abundant examples from history of times when technology has actually led to nuclear near-disasters, and it\u2019s been humans who\u2019ve stepped in to prevent escalation. There was a case in 1979 when Zbigniew Brzezinski, the US national security adviser, was actually woken up by a phone call in the middle of the night informing him that hundreds of missiles had just been launched from Soviet submarines off the coast of Oregon. And just as he was about to call President Jimmy Carter to tell him America was under attack, there was another call saying that [the first] had been a false alarm. 
A few years later, there was a very famous case in the Soviet Union. Colonel Stanislav Petrov, who was working in their missile detection infrastructure, was informed by the computer system that there had been a US nuclear launch. Under the protocols, he was supposed to then inform his superiors, who might\u2019ve ordered immediate retaliation. But it turned out the system had misinterpreted sunlight reflecting off clouds as a missile launch. So it\u2019s very good that Petrov made the decision to wait a few minutes before he called his superiors.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1agbrixi lg8ac51 lg8ac50 xkp0cg1\">I\u2019m listening to those examples, and the thing I might take away if I\u2019m thinking about it really simplistically is that human beings pull us back from the brink when technology screws up.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1agbrixi lg8ac51 lg8ac50 xkp0cg1\">It\u2019s true. And I think there are some really interesting recent tests of AI models given sort of military crisis scenarios, and they actually tend to be more hawkish than human decision makers are. We don\u2019t know exactly why that is. If we look at why we haven\u2019t fought a nuclear war \u2014 why, 80 years after Hiroshima, nobody\u2019s dropped another atomic bomb, why there\u2019s never been a nuclear exchange on the battlefield \u2014 I think part of it\u2019s just how terrifying it is: humans understand the destructive potential of these weapons, what escalation can lead to, and that certain steps may have unintended consequences. Fear is a big part of it.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1agbrixi lg8ac51 lg8ac50 xkp0cg1\">From my perspective, I think we want to make sure that there\u2019s fear built into the system. 
That entities that are capable of being absolutely freaked out by the destructive potential of nuclear weapons are the ones who are making the key decisions on whether to use them.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1agbrixi lg8ac51 lg8ac50 xkp0cg1\">It does sound like, watching A House of Dynamite, you could come away thinking that perhaps we should get all of the AI out of this entirely. It sounds like what you\u2019re saying is: AI is a part of nuclear infrastructure for us, for other nations, and it is likely to stay that way.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1agbrixi lg8ac51 lg8ac50 xkp0cg1\">One thing one advocate for more automation told me was, \u201cIf you don\u2019t think humans can build a trustworthy AI, then humans have no business with nuclear weapons.\u201d But the thing is, I think that\u2019s a statement that people who think we should eliminate all nuclear weapons entirely would also agree with.<br \/>I may have gotten into this worried that AI was going to take over nuclear weapons, but I realized right now I\u2019m worried enough about what people are going to do with nuclear weapons. It\u2019s not that AI is going to kill people with nuclear weapons. It\u2019s that AI might make it more likely that people kill each other with nuclear weapons. To a degree, the AI is the least of our worries. I think the movie shows well just how absurd the scenario in which we\u2019d have to decide whether or not to use them really is.<\/p>\n","protected":false},"excerpt":{"rendered":"For as long as AI has existed, humans have had fears around AI and nuclear weapons. 
And movies&hellip;\n","protected":false},"author":2,"featured_media":123003,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[365,363,364,2396,52126,374,409,111,139,69,49,135,145,1865,63652],"class_list":{"0":"post-123002","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-culture","12":"tag-explained-podcast","13":"tag-innovation","14":"tag-movies","15":"tag-new-zealand","16":"tag-newzealand","17":"tag-nz","18":"tag-podcasts","19":"tag-politics","20":"tag-technology","21":"tag-today","22":"tag-world-politics"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/posts\/123002","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/comments?post=123002"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/posts\/123002\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/media\/123003"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/media?parent=123002"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/categories?post=123002"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/tags?post=123002"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}