If you've seen the news lately about Russia's bombardment of Ukraine with drones, or the videos of the Boston Dynamics robot dog with a machine gun welded to its back, then you know the time of robots in warfare is upon us.
According to Dr. Maureen Hiebert, an associate professor in the Department of Political Science at the University of Calgary, many countries, including the United States, Russia, and China, have well-developed plans for the increased use of artificial intelligence, machine learning, robots, and other autonomous systems in combat.
"The debate isn't so much about whether this technology will be used militarily – that has already been effectively decided and planned for," says Hiebert, who is also the acting director and graduate program director for the Centre for Military, Security, and Strategic Studies.
"It's more a matter of the level of autonomy and how much human control there will be over these systems."
'Different form of warfare' than we've ever seen
Hiebert says there is reasonable concern about the development of technologies that use lethal force where the entire decision-making process about whether to use that force belongs to the machine.
"Then I think we're in a completely different form of warfare we've never seen before," she says.
You'd have a situation where, for the very first time, the tools of war become combatants themselves.
A secondary concern people have raised is that the decisions made by autonomous systems are built on complex data sets and machine learning, so humans can understand only the outcome, not the decision-making process behind it.
For these reasons, the International Committee of the Red Cross has pushed for states to impose legally binding rules on the use of autonomous weapons. However, Hiebert says these efforts have foundered, as war-fighting states are the least keen on having rules put in place regarding the level of human control.
This is because, in combat, any semi-autonomous system with greater human control requires communication between human and machine, and that link is vulnerable to enemy jamming or hacking.
If you're thinking that humanity itself could band together to prevent the rise of "killer robots," not so fast.
Autonomous technology development on parallel tracks
Hiebert says that because many of the advances in areas like AI and machine learning are coming from the civilian sector, we are already familiar with the technology and use it in our day-to-day lives.
In research she is working on, she imagines autonomous technology development along two parallel tracks. On one, there will be the civilian world and uses of technology we will be familiar with, and on the other track will be the military applications of the same technology.
"Already this kind of technology is so suffused in how we carry out our day-to-day lives that stigma won't be there because we're going to be using exactly the same kind of technology," Hiebert says.
While the combatants may be robotic, the determining factor in who wins will still be very human. Hiebert says the foundational idea behind winning a war is that you have eliminated your opponent's ability and willingness to keep fighting, so it will still ultimately come down to human commanders and politicians to decide whether the conflict continues.
"That dynamic and logic of warfare will probably remain the same," she says.
Use of force could be seen as 'clean and safe'
Instead, Hiebert says, the concern is that the prohibition on the use of force under international law, which permits exceptions only for individual and state self-defence, could become less binding. State and non-state actors may be more willing to use force against each other because the human cost to their own side is lessened when autonomous fighters do the fighting.
"There may be a greater willingness to resort to the use of force because it's not going to have the same bad outcomes as traditional warfare," she says.
Hiebert points out that proponents of autonomous weapons also argue these systems will only become more precise, meaning civilians will be better safeguarded on the battlefield.
However, she says this too is concerning: warfare that is perceived as clean and safe may further loosen the prohibition on the use of force.
"You could have civilians going about their daily lives and over in the corner a bunch of autonomous systems could be duking it out," says Hiebert.
"If we start thinking of warfare like that, it becomes the gamification of war. I think that's very troubling."