Fully autonomous weapons raise serious questions of accountability because it is unclear who should be held responsible for any unlawful actions they commit. A still from the movie Terminator Salvation. Source: Kinopoisk
On May 17, Deputy Prime Minister Dmitry Rogozin, who oversees the defense industry, said Russian experts are developing robots designed to minimize casualties in terrorist attacks and neutralize terrorists. The robots will also help evacuate injured servicemen and civilians from the scene of a terrorist attack.
Rogozin added that other antiterror equipment Russia is developing includes systems that can see terrorists through obstacles and effectively engage them in a standoff mode at a long distance without injuring their hostages.
The equipment might be deployed by Russia’s security and intelligence services, but no deployment date has been announced.
Earlier this year, the Russian Ministry of Defence decided to begin the “robotization” of the most life-threatening military operations. This will not only help save the lives of soldiers, but will also save billions of rubles. Some of the projects may be used in both military and antiterror activities.
Testing of the multipurpose MRK-27 BT (“Point of Combat”) robotic soldier, which was developed by the Applied Robotics Laboratory of the Bauman Moscow State Technical University, has been underway since 2009.
The MRK-27 BT is similar to the American SWORDS robot. The Point of Combat is a mobile track-type chassis with an entire arsenal of assault weapons mounted on it: a Pecheneg machine gun, two RShG-2 grenade launchers, two Shmel flame-throwers and six smoke-screen grenades.
Russia’s Defence Ministry is highly interested in deploying robotic systems to conduct critical mine-clearance missions in Chechnya, Chief of General Staff Valery Gerasimov said.
Attention is also being paid to the development of small surveillance robots, which could be used, for example, to stake out a house seized by terrorists, by hovering outside a window.
Meanwhile, the use of fully autonomous weapons has been criticized by Human Rights Watch. According to the organization, a fully autonomous machine that could select and fire upon targets of its own volition could be available within 20 years, if not sooner.
Current semi-autonomous technology at least requires a human operator and, in theory, does not kill without proper authorization. As Human Rights Watch points out, the development of machines equipped with artificial intelligence raises a number of issues that existing law does not address, and might even lead to an international arms race not unlike the one currently underway in drone technology.
“These weapons would be incapable of meeting international humanitarian law standards, including the rules of distinction, proportionality, and military necessity. The weapons would not be constrained by the capacity for compassion, which can provide a key check on the killing of civilians,” the human rights watchdog said. “Fully autonomous weapons also raise serious questions of accountability because it is unclear who should be held responsible for any unlawful actions they commit.”
All rights reserved by Rossiyskaya Gazeta.