How will wars develop in the future?

Wars of the future: Killer robots on the rise

By dpa / Markus Brauer
 

Smart weapons - that sounds clever. But behind the term are "killer robots": weapons that select targets and open fire on their own. Shouldn't that be banned? Diplomats and experts are currently wrestling with the question in Geneva. The contracting states meet in November. But no agreement is in sight.

Geneva - Machines that are sent to war, choose their targets and kill on their own. Progress or horror? What sounds like a science fiction film has long been in development: "lethal autonomous weapons systems", also known as killer robots. These could be shooting robots, deadly drones or unmanned submarines. In combat they are not directed by humans but decide autonomously what counts as a legitimate target and fire lethal volleys.

Since the end of August, diplomats from dozens of countries have been struggling in Geneva to find common ground in the high-tech arms dispute. A decision will come in mid-November at the earliest, when the contracting states meet in the Swiss city.

"Weapons cannot distinguish between friend and foe"

Critics are deeply alarmed. "Weapons cannot distinguish between friend and foe and should be examined under international law," says Thomas Küchenmeister of the German organization Facing Finance, a member of the international Campaign to Stop Killer Robots. The decision to extinguish human life, he argues, should never be left to a machine.

Autonomous weapons are made possible by the rapid development of artificial intelligence. Fed with data, computers learn what a target looks like, how it moves and when it should be attacked, and they fire without any human being involved in the decision. They must be distinguished from automatic weapons such as Patriot missiles, which fire automatically but only at targets that humans have precisely programmed beforehand.

"There is a gray area between automatic and autonomous weapons," says Michael Biontino, until recently Germany's disarmament ambassador in Geneva. "Autonomous weapons carry out target recognition themselves; they have no stored target library."
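The distinction can be made concrete in a deliberately simplified sketch (illustrative Python only, with hypothetical names and toy data, not real weapons software): an automatic system merely matches sensor readings against a target library that humans programmed beforehand, while an autonomous system replaces that human pre-selection with a classifier trained on data.

```python
# Purely illustrative sketch of the article's distinction; all names and
# values are hypothetical toy data, not real weapons logic.
from dataclasses import dataclass

@dataclass
class SensorReading:
    radar_signature: str   # simplified stand-in for real sensor data
    speed_kmh: float

# --- Automatic system: the decision reduces to a lookup in a target library ---
TARGET_LIBRARY = {"incoming_missile"}   # programmed precisely by humans beforehand

def automatic_engage(reading: SensorReading) -> bool:
    # Fires only if the signature was explicitly pre-programmed by a human.
    return reading.radar_signature in TARGET_LIBRARY

# --- Autonomous system: a model learned from data decides what a target is ---
class LearnedClassifier:
    """Stand-in for a machine-learning model trained on example data."""
    def predict(self, reading: SensorReading) -> float:
        # A real system would run a trained model; here a toy rule fakes a score.
        return 0.9 if reading.speed_kmh > 800 else 0.1

def autonomous_engage(reading: SensorReading, model: LearnedClassifier) -> bool:
    # No target library: the model's learned judgement replaces human pre-selection.
    return model.predict(reading) > 0.5

if __name__ == "__main__":
    unknown = SensorReading(radar_signature="unknown_object", speed_kmh=900.0)
    print(automatic_engage(unknown))                        # False: not in the library
    print(autonomous_engage(unknown, LearnedClassifier()))  # True: the model decides alone
```

The gray area Biontino describes lies in between: the more the pre-programmed library is replaced by learned judgement, the less any human has specified in advance what the weapon may attack.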

"Strictly restrict attacks to military targets"

There is little doubt that the US, Russia, China, Israel, South Korea and the UK are working on such systems. Such weapons already exist, says Neil Davison of the International Committee of the Red Cross (ICRC), which monitors compliance with international humanitarian law and the internationally recognized Geneva Conventions and is concerned about the developments.

"Attacks are to be strictly limited to military targets," says the Geneva Conventions, for example. Can machines decide that? “People have to be in control to make legal decisions,” Davison says.

Weapons of this new kind raise a host of questions: Can they tell whether an enemy is about to surrender or is wounded? Whether a person they recognize is carrying a weapon but is a hunter rather than a soldier? Whether the recognized soldier is in fact a comrade from their own side? And who can be held responsible for crimes committed with weapons that are no longer controlled by humans?

Germany rejects autonomous weapon systems

"The line of the Federal Republic is clear: for us, the decision about life and death cannot be transferred to a machine," said Biontino in the spring. It even says in the coalition agreement: “We reject autonomous weapon systems that are beyond human control. We want to outlaw them worldwide. ”Together with France, Germany has proposed a code of conduct according to which all current and future weapon systems must be subject to human control.

But that is a toothless tiger, says Thomas Küchenmeister. "A code of conduct is not binding under international law." The campaign against killer robots is demanding a binding treaty. Yet many countries do not want their weapons development to be restricted and are in no hurry with the negotiations. "This playing for time is very risky when you see the degree of autonomy that has already been achieved."

More than 2,000 scientists working with artificial intelligence have condemned such weapons. "There is a moral component," they wrote in an appeal. "We must not allow machines to make decisions over life and death for which others - or no one - can be held criminally liable."

Futuristic visions are already a reality

Combat robots are unmanned, remotely controlled or semi-autonomous systems used for observation, reconnaissance, mine clearance and engaging military targets. The US conducted its first successful tests with armed drones in 1971, but they were not deployed until 2001, in Afghanistan. Today the US armed forces have the largest and most modern arsenal of robots, including remote-controlled aircraft and ground vehicles equipped with automatic weapons or missiles.

Unlike human soldiers, robots feel no fear, never tire, and fight without scruples or fear of their own death. The American political scientist Peter W. Singer, one of the leading experts on automated warfare, is convinced that humankind's 5,000-year-old monopoly on fighting wars is collapsing and that future wars will be determined by machines. "If people see war as something that costs them nothing, they are more willing to wage it."

New arms race

For the military, the vision of a war without losses among its own troops is tempting: machines can be deployed and replaced as needed, without the sight of body bags provoking public protest at home.

Peace researchers such as Noel Sharkey, an expert in artificial intelligence and robotics at Sheffield University, and Jürgen Altmann, a physicist at TU Dortmund University, fear that the growing use of robots could undermine international human rights, provoke new wars and fuel a new arms race.

Many countries already use military robots today. Most are unarmed systems: tracked vehicles hardly larger than a go-kart that defuse bombs or carry out reconnaissance, and unmanned aerial vehicles.

USA is a leader in military robotics

The US military has the largest and most modern arsenal of robots, including ground vehicles such as the Talon Swords and its successor MAARS, which can be equipped with automatic weapons or missiles. Walking robots like Spot, about the size of a Rottweiler, are intended for reconnaissance in dangerous terrain. Systems such as "Locust" and "Coyote" are mini-drones that descend on enemies like a swarm of locusts and engage them autonomously.

Alongside the USA, Russia, China and Israel are particularly ambitious in the field of automated warfare. Regardless of the technical difficulty of developing systems with artificial intelligence and a moral code, the US Department of Defense assumes that global arms dynamics will make autonomously firing robots necessary within the next 30 years.