
Robot killers: Qu’est-ce que c’est?

Robots are already widely used in war, but what happens when they start to think for themselves?

Yossi Mekelberg


Innovation is a quality usually praised and celebrated in modern societies. It is commonly associated with development, progress and improvement in the human condition. However, technological innovation has also been employed in war since antiquity to inflict casualties and destruction on enemies.

The combination of new and more powerful weaponry, strategies and organization has made the modern battlefield more lethal than ever. Following the introduction of nuclear weapons, humans possessed for the first time the capability to destroy the entire planet, reduce it to ashes and bring the existence of humankind to an end. Fortunately, this has not happened so far.

Nonetheless, unmanned weapon systems are increasingly being introduced into the modern theatre of war, and this presents profound ethical and legal issues. In their softer form, such weapons, drones for instance, are mainly operated by remote control. In their extreme and eerie form, future battlefields would be packed with fully autonomous weapons possessing almost a mind of their own - or at least the one they were programmed with. In other words, these systems would operate independently, making decisions of life and, mainly, death in the absence of human judgment.

‘Frankenstein-like military monsters’

A scenario that sounds like a nail-biting Hollywood thriller might turn into a ghastly reality. Frankenstein-like military monsters with no human accountability and no regard for international humanitarian law might go on the rampage, unless there is an international coalition to stop this from happening.


A new and alarming joint report by Human Rights Watch (HRW) and the International Human Rights Clinic (IHRC) at Harvard Law School, published last week, highlights some of the legal, moral and technological arguments against developing and operating fully autonomous weapons.

The authors of the report call for a preemptive ban on fully autonomous weapons, which would mark a radical shift in the way future wars are conducted and risk the proliferation of military systems whose impact on humanity is far from properly evaluated.

Among the most striking warnings the report presents is that, in the absence of a physical human presence operating these types of weapons, it would be almost impossible to hold anyone criminally accountable for committing war crimes or crimes against humanity.

‘No retribution for victims’

The report’s lead author Bonnie Docherty remarked that “No accountability means no deterrence of future crimes, no retribution for victims, no social condemnation of the responsible party.”

This can only lead to the undermining, if not the collapse, of the international human rights legal system that evolved in the decades following the atrocities of the Second World War. The philosophical underpinning of this legal-justice system is that every human being is responsible for his or her actions, and is accountable and punishable for violating international conventions on human rights.

In the post-1945 world, no one should be able to hide any longer behind the flimsy defenses that they were ‘only following orders,’ or that killing innocent people, or any other war crimes and crimes against humanity, were ‘unavoidable and inevitable’ in the heat of the battlefield. Who is going to explain this to a robot killer, or hold it accountable?

Would the programmers or the manufacturers be held legally and morally responsible? As the authors of the report outline very clearly, this would prove almost impossible under the current international legal system.

‘Slaughtering of people in horrific ways’

In conflicts in the Middle East we are witnessing the most sophisticated high-tech weaponry used side by side with very low-tech means. In the war against ISIS, armies equipped with the latest military technology confront not only more traditional weapons but also the slaughter of people in horrific ways, with knives or by setting them on fire.

It is difficult to imagine that the employment of technologically sophisticated weapons will completely replace such low-tech means. Nevertheless, the temptation to develop autonomous weapons is rather obvious, and not surprisingly the effort is led by the United States.

Its declared strategy is to possess a joint military force which is smaller and leaner, but whose “great strength will be that it will be more agile, more flexible, ready to deploy quickly, innovative, and technologically advanced.”

To a large extent fully autonomous weapons fit the bill. Unfortunately, the invention of robot killers gives additional weight to President Eisenhower’s farewell warning about the ‘…acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex.’

‘Power hunger and greed’

Robot killers merge the power hunger and greed of the arms industry with governments’ wish for a powerful army that has fewer personnel, is less vulnerable in terms of its own casualties and is legally unaccountable. It is estimated that the market for this type of weaponry will be worth around $100 billion over the next 10 years.


Although fully autonomous weapons do not yet exist, the foundations for developing them are being laid by weapons systems such as armed drones. Such weaponry has been developed and deployed not only by the United States but also by countries including Russia, China, the United Kingdom, Israel and South Korea.

The Israeli Iron Dome and the U.S. Phalanx and C-RAM, which are programmed to respond automatically to threats from incoming munitions, are cited in the joint HRW/IHRC report as examples of weapons systems in transition towards full autonomy.

‘We might be fascinated by robots’

More than thirty years ago the creators of the movie The Terminator envisaged a world controlled by a robot killer. Intentionally or not, they sent a warning which day by day seems less sci-fi and more realistic: “Listen, and understand. That terminator is out there. It can’t be bargained with. It can’t be reasoned with. It doesn’t feel pity, or remorse, or fear. And it absolutely will not stop, ever, until you are dead.”

We might be fascinated by robots because, as leading scientist and artist Ken Goldberg argues, they are reflections of ourselves; but this could be a very dark reflection of human nature when it comes to employing robot killers on the battlefield.

The commissioning, advocacy and development of these amoral and heartless machines reflects the dangerous desires of those who would like to acquire maximum power with nil responsibility or accountability. It points towards a world which divests international law of any meaning, where metal components and semiconductors are indicted for war crimes and crimes against humanity by the ICC while their human makers and commanders relinquish any responsibility.

Unless governments and manufacturers of killer robots are ready to be more transparent about the development of these military systems and how they intend to employ them, the world may face an unfamiliar battlefield that is even more inhumane than the current one.
--------------------------------------------------------
Yossi Mekelberg is an Associate Fellow at the Middle East and North Africa Program at the Royal Institute of International Affairs, Chatham House, where he is involved with projects and advisory work on conflict resolution, including Track II negotiations. He is also the Director of the International Relations and Social Sciences Program at Regent’s University in London, where he has taught since 1996. Previously, he taught at King’s College London and Tel Aviv University. Mekelberg’s fields of interest are international relations theory, international politics of the Middle East, human rights, and international relations and revolutions. He is a member of the London Committee of Human Rights Watch, serving on the Advocacy and Outreach committee. Mekelberg is a regular contributor to the international media on a wide range of international issues and you can find him on Twitter @YMekelberg.

Disclaimer: Views expressed by writers in this section are their own and do not reflect Al Arabiya English's point-of-view.