Red Lines & AI Warfare

It is yet to be seen whether Australia’s policy on weaponising AI will enshrine the moral red line against delegating life-and-death decisions to machines, writes Matilda Byrne.

Australia appears to be moving headfirst into unregulated AI warfare, yet this shocking possibility almost never makes the mainstream media.

In the last decade, there’s been a massive acceleration in the militarisation of artificial intelligence (AI).

Addressing the United Nations Security Council in late September, Australian Foreign Minister Penny Wong stated that, “decisions of life and death must never be delegated to machines, and together we must set the rules and establish the norms.” 

The advancement of AI in weapons is occurring across all areas: autonomous-piloted aircraft, drones, submarines, robot tanks and guns.

This has raised ethical, legal and humanitarian concerns, discussed by governments, industry, experts and civil society, yet these concerns remain to be fully addressed either internationally or domestically.

Wong’s remarks echo sentiment from a growing number of countries, U.N. officials, experts and the public in relation to autonomous weapons.

The U.N. secretary general has repeatedly called lethal autonomous weapons systems (LAWS) “politically unacceptable and morally repugnant.” However, highly militarised countries are pushing forward with advancement, in particular China, India, Israel, Russia and the U.S. 

Increasing autonomy in weapons threatens to erode human decision-making and poses security and humanitarian risks, issues that these countries and much of the mainstream debate are ignoring.

Removing humans from critical functions, in particular choosing whom or what to attack and if and when to attack, facilitates intensified humanitarian harm in conflict.

Furthermore, autonomous weapons used against humans carry major risks: escalation, increased speed and scale of attacks, and the exacerbation of discrimination through racial, gender and other biases.

This is why specific prohibitions and obligations are being called for in relation to autonomous weapons.

Australia has made huge investments in autonomy for defence, and yet there is barely any public or political debate. The Albanese government is resisting new legal regulations.       

Australia and Increasing Autonomy Globally

In Australia, development of AI and autonomy in the military has expanded with the government and private sector both charging ahead.

Deputy Prime Minister Richard Marles has said that “Australia must invest in the transition to new and innovative technologies for our Defence Force” in announcing the establishment of the Advanced Strategic Capabilities Accelerator, in which AI is a dedicated priority area. The accelerator is a principal funding mechanism for military AI projects as part of the wider agenda of advanced capabilities.

Autonomy is being introduced to functions such as self-piloting aircraft, submarines or tanks. Most alarming is the use of algorithms or other autonomous systems developed for targeting and attacking.

In an autonomous weapon without human control, an attack would be made based on sensor data, such as facial recognition, heat imprint or acoustic signature, rather than a human operator determining exactly when, where or against what an attack is made. 

Athena AI, a company founded in Australia, has pursued AI military targeting software. It was acquired last year by a U.S. company and now operates as Sightline Intelligence with an Australian office, advancing its AI-driven technologies for military platforms.

Its software is now publicised as being integrated into an armed autonomous land vehicle, the ‘Warfighter UGV,’ developed by another private Australian arms company, Cyborg Dynamics.

Australian arms companies including Cyborg Dynamics, C2 Robotics, Defend Tex and Skyborne Technologies are advancing autonomy in weapons systems on a slippery slope with no regulatory limits.

In September, Ukrainian President Volodymyr Zelensky warned that, “it’s only a matter of time … before drones are fighting drones, attacking critical infrastructure and targeting people … fully autonomous and no humans involved.” 

Ukraine is one of several conflict zones experiencing increasingly autonomous warfare at human cost.

AI tools used by the Israeli Defence Force in Gaza, specifically ‘Lavender’ and ‘the Gospel,’ which produce target lists of individuals en masse based on AI-generated ratings, are the first reported AI target-generation systems.

These uses of AI in conflict accelerate the speed and scale of killing, intensifying conflict and resulting in mass humanitarian harm.

Intertwined With the US


War Room, Publication of the US Army War College, August 2025.

Australia’s advancement of weapons development and increasing autonomy is intertwined with international partners, especially the U.S. Ties with the U.S. influence Australia’s activity in military AI.

This has increased with AUKUS, a security agreement between Australia, the U.K. and the U.S. that limits Australia’s independence and increases Canberra’s reliance on Washington.

In 2024, Australia hosted “Exercise Autonomous Warrior,” the first joint military exercise in an AUKUS series called “Maritime Big Play.”

The AUKUS partners announced the series with aims to “enhance capability development and improve interoperability between the partners,” focused on the maritime environment and AI and autonomy.

Maritime Big Play is associated with pillar II of AUKUS which promotes advanced technology sharing. AUKUS pillar II impacts military initiatives, government priorities and business by private sector companies. 

The government reported that during ‘Exercise Autonomous Warrior,’ “Australian-developed capabilities trialled included the long-range loitering strike glider OWL-B; the uncrewed surface vessel Bluebottle; the extra-large autonomous underwater vehicle Ghost Shark; and the large uncrewed underwater vehicle, Speartooth.”

Regarding the exercise, Minister for Defence Richard Marles emphasised cooperation, saying, “Pillar II is a generational opportunity for our three nations to harness and uplift our collective innovation enterprises and industrial bases.”

In terms of influence on the industry landscape, ‘Ghost Shark’ is the product of a contract with the American company Anduril. The project was central to Anduril’s expansion into Australia. 


Anduril Ghost-X UAS, used primarily for surveillance, being prepared for flight, Romania, November 2024.

Anduril specifically mentioned AUKUS and how “technologies like artificial intelligence, cost-effective autonomous unmanned systems, and next-generation networked weapons are among the top priorities for the Australian Defence Force.” 

Anduril is a controversial company, deeply embedded with the U.S. security state and complicit in war crimes across the globe including in Sudan, where the United Arab Emirates supplies weapons used in the civil war, some produced through the partnership between Anduril and the United Arab Emirates weapons conglomerate EDGE Group. 

A partnership between the Royal Australian Air Force and Boeing to develop an autonomous combat aircraft ‘Ghost Bat’ has seen Boeing establish its first manufacturing facility outside of the U.S. The facility is being built at the Wellcamp Aerospace and Defence Precinct outside of Toowoomba in Queensland.

The presence of multinational arms companies in Australia is growing with advanced technologies and U.S. collaboration, raising questions about who really benefits: Canberra or Washington? 

Other multinational arms companies operating in Australia, such as Lockheed Martin, BAE Systems and Thales, have projects involving autonomy. Advanced Systems and Technology (formerly STELaRLab) is Lockheed Martin’s first R&D facility outside of the U.S.

It partners with higher education institutions through scholarships and internships. In addition, collaborations between the Defence Science Technology Group, arms companies and universities see research projects on related technologies and components.

The lack of clarity in Australian policy is alarming given the acceleration of R&D and international collaboration. There must be adequate legal and ethical standards to act responsibly.

No Treaty Wanted 


Australian PM Anthony Albanese, then U.S. President Joe Biden and then British Prime Minister Rishi Sunak at a press event for AUKUS in San Diego, March 13, 2023.

All three AUKUS partners remain among the small handful of countries that reject the need for a new treaty to address autonomous weapons. 

AUKUS partners would benefit from new international law that clearly sets out prohibitions and obligations. This would translate into shared requirements that delineate what is acceptable, rather than undertaking joint initiatives in an unregulated environment with differing interpretations of international humanitarian law.

The ADF and Department of Defence use the ‘System of Control’ framework for weapons systems, which does not address requirements for human control in the design and use of weapons.

An ethical AI checklist has been created for Defence, but it is yet to be formally adopted. The government is also yet to acknowledge that fully autonomous weapons which target humans are morally and legally unacceptable.

Australia Resisting Diplomatic Progress

Australia continues to resist new rules to address autonomous weapons despite the fact that international momentum is building towards negotiations of new international law.

Countries have agreed to treaties on various weapons in many previous instances, including chemical weapons, biological weapons, landmines, cluster munitions and nuclear weapons.

In November, 44 countries expressed readiness “to move ahead towards negotiations” based on a ‘rolling text’ which has been developed and debated by the ‘Group of Governmental Experts’ this year. 

This was stated in a working paper to a meeting of the Convention on Certain Conventional Weapons (CCW) which hosts the Group of Governmental Experts, the dedicated U.N. meetings to discuss autonomous weapons.

Australia has repeatedly expressed dedication to the Group’s work and the CCW forum but failed to endorse the paper. This came after Australia had stated its desire to work with other countries to ‘contemplate next steps’ for the Group, in a statement at the U.N. General Assembly in October.  

Australia’s claim to support progress appears empty when it immediately shirks the growing calls and initiatives looking towards negotiations. In November, the U.N. General Assembly First Committee passed a resolution on autonomous weapons. A core aspect of the resolution supported work being done in the CCW to draft elements of an instrument with a view to negotiations. A total of 156 countries voted in favour, including Australia.

However, Australia was part of a group of states who underscored that “any future negotiations must take place within the framework of the Convention on Certain Conventional Weapons (CCW) and must not prejudge the nature of their outcome.” Yet, Australia is not supporting moves in the CCW to take next steps with other countries.

At the U.N. General Assembly, Australia also urged that “it is our solemn responsibility to uphold and promote IHL [international humanitarian law].” Regarding international humanitarian law and autonomous weapons, the International Committee of the Red Cross asserts that new legal rules are needed to strengthen the law, yet Australia rejects this. Australia repeatedly pushes back on the need for specific prohibitions that would require human control, a position shared by very few countries.

There was an increase in statements on autonomous weapons at the U.N. General Assembly in 2025, both in debates on security and disarmament as well as the general high-level meetings. 

In contrast, Australia was very brief in commenting on autonomous weapons, only talking about the Group’s work but not any aspects of the issue itself. This is a failure to acknowledge any moral, legal and humanitarian risks at such a crucial time. 

In its opening statement on security and disarmament, Australia affirmed it “is committed to the responsible use of AI in the military domain,” without mentioning autonomous weapons. It was a co-sponsor of the resolution on this issue.

Responsible use of AI in the military domain [REAIM] is an area recognised by all states, but has yet to develop more than general principles and does not take into account autonomous weapons.

Countries leading this initiative include the Netherlands, South Korea and the U.S. which initiated a Political Declaration in 2023 that Australia has signed. The international REAIM Summit on responsible use of AI in the military meets this week in Spain.

Australia also stated that “AI technologies must comply with international law, including IHL and international human rights law, throughout their life cycle.” Despite this, Australia is resistant to moving to negotiations on a new legal instrument addressing autonomous weapons.

Looking forward  

The commitment to establish an instrument on autonomous weapons is growing globally and 2026 is of great interest, with the end of the Group of Governmental Experts’ current mandate. 

Australia’s participation will demonstrate whether the government’s position aligns with the moral and legal stance expressed by Minister Wong.

Similarly, the development of capabilities is only advancing, as are collaborations through AUKUS. However, opposition to AUKUS is growing, with expert analysis and public opinion increasingly showing concern about the agreement. 

It is yet to be seen whether Australia’s policy, guiding how it engages in both domestic development and global collaboration, will enshrine the moral red line against delegating life-and-death decisions to machines.

Source: Consortium News.
