(World Economic Forum) – Humanity is faced with a grave new reality – the rise of autonomous weapons. It may sound like a Hollywood script, but the risk is real: humans so far removed from wartime choices that life-and-death decision making is effectively left to sensors and software.
What sort of future looms if the taking of a human life is relegated to algorithms? Robots and computer systems lack the uniquely human ability to understand both the complex environment in which they operate and the weapons they are using.
Moreover, the more complex weapon systems become – for example, by incorporating machine learning – and the greater the freedom they are given to act without human intervention, the more unpredictable the consequences of their use.
We at the International Committee of the Red Cross believe that the responsibility for decisions to kill, injure and destroy must remain with humans. It’s the human soldier or fighter – not a machine – who understands the law and the consequences of violating it, and who is responsible for applying it. These obligations cannot be transferred to a computer program. Governments – with the input of civil society and the tech industry – must waste no time in agreeing to limits on autonomy in weapon systems.