{"id":562536,"date":"2026-03-25T04:03:08","date_gmt":"2026-03-25T04:03:08","guid":{"rendered":"https:\/\/www.newsbeep.com\/au\/562536\/"},"modified":"2026-03-25T04:03:08","modified_gmt":"2026-03-25T04:03:08","slug":"australias-new-military-ai-policy-comes-at-a-crucial-time-the-challenge-is-turning-it-into-practice","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/au\/562536\/","title":{"rendered":"Australia\u2019s new military AI policy comes at a crucial time. The challenge is turning it into practice"},"content":{"rendered":"<p>Artificial intelligence (AI) is playing a central role in the ongoing Middle East war. The United States, for example, has <a href=\"https:\/\/x.com\/CENTCOM\/status\/2031700131687379148\" rel=\"nofollow\">confirmed<\/a> it is using the technology to identify potential targets and accelerate decision-making.  <\/p>\n<p>This is part of a <a href=\"https:\/\/theconversation.com\/israel-accused-of-using-ai-to-target-thousands-in-gaza-as-killer-algorithms-outpace-international-law-227453\" rel=\"nofollow noopener\" target=\"_blank\">growing trend<\/a>. And in some <a href=\"https:\/\/theconversation.com\/israel-accused-of-using-ai-to-target-thousands-in-gaza-as-killer-algorithms-outpace-international-law-227453\" rel=\"nofollow noopener\" target=\"_blank\">cases<\/a> it\u2019s leading to mounting <a href=\"https:\/\/www.nytimes.com\/2026\/03\/11\/us\/politics\/iran-school-missile-strike.html\" rel=\"nofollow noopener\" target=\"_blank\">civilian deaths<\/a>. <\/p>\n<p>Against this backdrop, Australia\u2019s Department of Defence has just released a new <a href=\"https:\/\/www.defence.gov.au\/sites\/default\/files\/2026-03\/Policy-Settings-for-Responsible-Use-of-Artificial-Intelligence-in-Defence-%5BOFFICIAL%5D.pdf\" rel=\"nofollow noopener\" target=\"_blank\">AI policy<\/a>.  <\/p>\n<p>The policy aims to govern the Australian military\u2019s use of AI. So what does it include? 
And how does it compare to the military AI policies of other countries? <\/p>\n<p>Three main requirements<\/p>\n<p>Australia\u2019s policy establishes three overarching requirements for the Department of Defence\u2019s use of AI. <\/p>\n<p>Firstly, the use of AI must comply with Australian law and international obligations. <\/p>\n<p>Secondly, the use of AI must be underpinned by individual accountability and bounded by consideration of impacts on people. It must also be explainable, reliable and secure, and designed to mitigate unintended bias and harm. <\/p>\n<p>Thirdly, any risks associated with the use of AI must be managed with proportionate control measures, such as testing, training and evaluation. <\/p>\n<p>The policy\u2019s emphasis on proportionate controls is notable.<\/p>\n<p>AI is not a standalone item. It is an enabling technology with <a href=\"https:\/\/unidir.org\/publication\/artificial-intelligence-beyond-weapons-application-and-impact-of-ai-in-the-military-domain\/\" rel=\"nofollow noopener\" target=\"_blank\">many applications<\/a> that can be embedded across a range of different military functions, such as targeting, logistics, training and maintenance \u2013 each raising different risks.  <\/p>\n<p>The policy aims to cover all AI technologies, from chatbots to the most advanced \u201cfrontier\u201d general-purpose AI models.<\/p>\n<p>The approach echoes the Australian government\u2019s <a href=\"https:\/\/www.digital.gov.au\/ai\/ai-in-government-policy\" rel=\"nofollow noopener\" target=\"_blank\">Policy for the Responsible Use of AI in Government<\/a>, which took effect in September 2024. <\/p>\n<p>That policy explicitly carves out the defence portfolio and national intelligence community. 
The new policy fills that gap.<\/p>\n<p>Thin on details<\/p>\n<p>The policy says little about how the Army, Navy and Air Force \u2013 or other defence entities such as the <a href=\"https:\/\/www.asca.gov.au\/\" rel=\"nofollow noopener\" target=\"_blank\">Australian Strategic Capabilities Accelerator<\/a> \u2013 will actually enact its requirements. <\/p>\n<p>It also says testing and evaluation of the defence department\u2019s use of AI will serve as a key control measure. But it offers no detail on how this will be conducted for military AI \u2013 a domain where testing poses <a href=\"https:\/\/www.cnas.org\/publications\/commentary\/military-artificial-intelligence-test-and-evaluation-model-practices\" rel=\"nofollow noopener\" target=\"_blank\">well-documented challenges<\/a> around unpredictable behaviours and unreliable performance in military operating environments. <\/p>\n<p><a href=\"https:\/\/www.apsc.gov.au\/initiatives-and-programs\/workforce-information\/research-analysis-and-publications\/state-service\/state-service-report-2024-25\/ways-working-ai-aps\/supporting-decision-advantage-defence-ai-centre\" rel=\"nofollow noopener\" target=\"_blank\">The Defence AI Centre<\/a>, established in 2024, is identified as the governance hub. But the policy is thin on implementation, compliance, monitoring, resourcing and reporting. <\/p>\n<p>How these settings will evolve, and whether guidance on their implementation will follow \u2013 and be made public \u2013 remains to be seen.<\/p>\n<p>Drawing on precedent<\/p>\n<p>Australia\u2019s policy draws on those of its closest allies. 
<\/p>\n<p>For example, the United Kingdom adopted its <a href=\"https:\/\/assets.publishing.service.gov.uk\/media\/62a7543ee90e070396c9f7d2\/Defence_Artificial_Intelligence_Strategy.pdf\" rel=\"nofollow noopener\" target=\"_blank\">Defence AI Strategy<\/a> in 2022 and issued the <a href=\"https:\/\/www.gov.uk\/government\/publications\/jsp-936-dependable-artificial-intelligence-ai-in-defence-part-1-directive\" rel=\"nofollow noopener\" target=\"_blank\">Dependable AI in Defence<\/a> directive in 2024. <\/p>\n<p>The UK has gone further, appointing \u201cresponsible AI\u201d officers within each Ministry of Defence component. It also published a <a href=\"https:\/\/www.gov.uk\/government\/publications\/laying-the-groundwork-responsible-ai-senior-officers-report-2025\" rel=\"nofollow noopener\" target=\"_blank\">progress report<\/a> in 2025. <\/p>\n<p>In 2020, the United States Department of Defense adopted AI <a href=\"https:\/\/www.war.gov\/News\/News-Stories\/article\/article\/2094085\/dod-adopts-5-principles-of-artificial-intelligence-ethics\/\" rel=\"nofollow noopener\" target=\"_blank\">ethics principles<\/a>. Two years later, it developed a detailed <a href=\"https:\/\/media.defense.gov\/2022\/Jun\/22\/2003022604\/-1\/-1\/0\/Department-of-Defense-Responsible-Artificial-Intelligence-Strategy-and-Implementation-Pathway.PDF\" rel=\"nofollow noopener\" target=\"_blank\">implementation strategy<\/a>. Then in January 2026, the current administration announced its <a href=\"https:\/\/media.defense.gov\/2026\/Jan\/12\/2003855671\/-1\/-1\/0\/ARTIFICIAL-INTELLIGENCE-STRATEGY-FOR-THE-DEPARTMENT-OF-WAR.PDF\" rel=\"nofollow noopener\" target=\"_blank\">AI Strategy for the Department of War<\/a>. 
This shifted emphasis toward speed and lethality, mandating \u201cany lawful use\u201d of AI (which doesn\u2019t always equal ethical use) and directing removal of barriers to rapid deployment.<\/p>\n<p>Australia\u2019s defence AI policy generally aligns with the core elements of these like-minded militaries\u2019 policies: AI must be used lawfully, humans must remain accountable, and risks must be anticipated, avoided and mitigated. <\/p>\n<p>One notable difference in Australia\u2019s policy is its reference to Article 36 of Additional Protocol I to the Geneva Conventions. The policy mandates legal reviews of AI in weapon systems \u2013 a meaningful commitment <a href=\"https:\/\/legalreviewportal.org\/\" rel=\"nofollow noopener\" target=\"_blank\">few states have enacted<\/a>. <\/p>\n<p>Another difference is that Australia\u2019s policy lacks the implementation roadmaps found in the US and UK policies. It reads more like a statement of intent. <\/p>\n<p>It is not clear what consequences, if any, this variation in policy and institutional depth may have for <a href=\"https:\/\/www.congress.gov\/crs-product\/R47599\" rel=\"nofollow noopener\" target=\"_blank\">AUKUS Pillar II<\/a>, which involves cooperation on the acceleration and rapid integration of AI and autonomous technologies. <\/p>\n<p>The heightened significance of national frameworks<\/p>\n<p><a href=\"https:\/\/www.justsecurity.org\/129936\/third-reaim-summit\/\" rel=\"nofollow noopener\" target=\"_blank\">International efforts<\/a> to govern military AI are potentially <a href=\"https:\/\/www.justsecurity.org\/132504\/ai-hype-2026-reaim-summit\/\" rel=\"nofollow noopener\" target=\"_blank\">losing momentum<\/a>. 
Multinational discussions on autonomous weapons are also <a href=\"https:\/\/www.reuters.com\/world\/progress-rules-lethal-autonomous-weapons-urgently-needed-says-chair-geneva-talks-2026-03-03\/\" rel=\"nofollow noopener\" target=\"_blank\">deadlocked<\/a>.<\/p>\n<p>This means national policy frameworks take on greater significance, shaping <a href=\"https:\/\/www.sipri.org\/commentary\/essay\/2025\/military-ai-responsible-procurement\" rel=\"nofollow noopener\" target=\"_blank\">procurement<\/a> and signalling to partners what a state considers acceptable practice.<\/p>\n<p>Contemporary uses of military AI in ongoing conflicts \u2013 in Iran, in Lebanon, in Gaza, in Ukraine \u2013 remind us that governance is not an abstract policy exercise. <\/p>\n<p>Australia\u2019s new policy settings are an important step. The test will be whether they are followed by implementation measures robust enough to effectively govern the development and use of military AI.<\/p>\n","protected":false},"excerpt":{"rendered":"Artificial intelligence (AI) is playing a central role in the ongoing Middle East war. 
The United States, for&hellip;\n","protected":false},"author":2,"featured_media":562537,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[4],"tags":[64,63,44],"class_list":{"0":"post-562536","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-australia","8":"tag-au","9":"tag-australia","10":"tag-news"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/562536","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/comments?post=562536"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/562536\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media\/562537"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media?parent=562536"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/categories?post=562536"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/tags?post=562536"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}