Marsden Fund grant supports UC study on ‘killer robots’ debate
First there was gunpowder, then nuclear weapons. Now another revolution in warfighting is underway with the rise of algorithmic warfare using robots and precision targeting technology. Their use is highly controversial, but to date there has been no systematic attempt to map the debate unfolding across academic, intergovernmental, corporate and social media domains. Internationally, the use of lethal autonomous weapon systems (LAWS), or so-called ‘killer robots’, is being debated at the highest levels.

A new University of Canterbury (UC) study, which has recently received a grant from the 2019 Marsden Fund/Te Pūtea Rangahau a Marsden, aims to map and analyse this fast-evolving debate. Its findings could help shape the development of effective and ethical regulation of these sophisticated weapons.

Dr Jeremy Moses and Associate Professor Amy Fletcher, from the Department of Political Science and International Relations at UC, and Dr Geoff Ford, a political scientist with the UC Arts Digital Lab, will delve deeply into this issue in a three-year project starting this year. Their research has attracted a prestigious 2019 Marsden Fund grant of $842,000.

Around 12 countries are known to be developing LAWS, among them the United States, China, Russia and Israel. While those nations see LAWS as necessary to maintain geopolitical balance, groups such as the Campaign to Stop Killer Robots and other international bodies are seeking a global ban on their use.

“Proponents argue that autonomous weapons could decrease civilian casualties through enhanced targeting precision and reduce the risks for human soldiers,” Associate Professor Fletcher says.

“Opponents fear a world of ‘algorithmic warfare’ in which robots can make decisions to kill in the absence of human oversight, and in which the speed and complexity of war accelerate to the point that international rules of conduct are rendered irrelevant.”

This project […]