Russia has been getting a lot of bad press lately, much of it richly deserved, IMHO. Since this opinion is widely shared, it might be tempting to try to pin every kind of villainy on Vladimir Putin, especially if the goal is to vilify the villainy by associating it with a known villain, rather than the other way around.
So let me be clear: this isn’t about fairness, and it isn’t driven by any enthusiasm on my part for Russian aggression nor, lord knows, for autonomous weapon systems (AWS). But given the evidence that I have seen, I think it would be a bit premature to credit Russia, as David Hambling did in New Scientist, with “taking the lead in a new robotic arms race” while the “squeamish” West holds back, or to accuse Russia of “norm anti-preneurship” aimed at disrupting “international norm-building efforts to regulate the deployment of fully autonomous weapons,” as UMass Prof. Charli Carpenter did in her blog, citing Hambling’s report. [Note: After I emailed her about the unreliability of Hambling’s spin, Prof. Carpenter updated her post to reflect “uncertainty about the nature of Russia’s actual announcement”. In any case, she’s done great work on this issue.]
No, I’m not out to defend Russia, which if not actually taking the lead, is openly declaring its intent not to be left behind. Last June, Deputy Prime Minister Dmitry Rogozin, in charge of the defense industry, announced the decision to establish a national laboratory for military robotics, presumably similar to the US Navy’s Laboratory for Autonomous Systems Research, which opened in 2012. Rogozin was previously associated with the push to create a Russian version of DARPA, the agency that oversees a great deal of the Pentagon’s cutting-edge robotics research. In December, Rogozin announced plans to spend $640 billion on military modernization through 2020, with an emphasis on robotics. Most disturbingly, in a March 21, 2014 article in Rossiyskaya Gazeta, Rogozin summoned Russian industry to “create robotic systems that are fully integrated in the command and control system, capable not only of gathering intelligence and receiving [data] from the other components of the combat system, but also of striking on their own [emphasis added].”
The risk here is that tarring Russia as the Mordor of the robot orcs will contribute to the misperception that the US and its allies are any less bullish on AWS than Russia has lately advertised itself as being. This could be harmful in two ways: It could bolster America’s own full-speed-ahead policy for development and use of AWS – which, contrary to popular belief, does not slow the development of fully autonomous weapons – and it could push Russia into a defensive corner in diplomatic discussions, such as those upcoming next week in Geneva (which both Charli and I will be attending). Russia will be a participant in those discussions, and previous Russian statements at the UN have been supportive of emerging public concerns about the issue, though not any particular norms.
I can hear it already: “You want to ban killer robots? Tell that to Putin!” Of course, China also serves this rhetorical purpose, and those wishing to point the finger that way can cite Lijian, a stealth drone that looks a lot like the X-47B but as far as we know has not yet landed on an aircraft carrier, or Norinco’s new GS1-AT “smart” cluster munition, which resembles an artillery version of the sensor-fuzed weapon that the US has had in service since the 1990s.
In comparison, despite a history with drones dating to the 1950s, Russia “lags behind other militaries in building unmanned aerial combat vehicles, according to U.S. officials”—and when Bill Gertz reports that, it’s reliable. In June, Defense Minister Sergei Shoigu bemoaned Russia’s “technological backwardness” and “shortage of skilled personnel” and called Russian military robots “inferior to their foreign analogs.” Putting an optimistic spin on things, one Russian official told Pravda.ru “From the point of view of theory, engineering and design ideas, we are not in the last place in the world.” Not satisfied with that, Defense Minister Shoigu called for doubling the pace of “developing the combat robotic equipment.”
Russia is expected to soon begin testing an armed drone comparable to the Reaper, but it is not expected to be deployed before 2016. I have not found any reports of actual Russian unmanned maritime systems, only talk. In unmanned ground vehicles (UGVs, appropriately enough) the Soviets first experimented with “teletanks” in the early 1930s. The present story stems from Hambling’s account in New Scientist of reports that Russia may soon use armed UGVs to help guard its ICBM sites.
Мобильный робототехнический комплекс (Mobile robotic system)
On March 12, Maj. Dmitry Andreev, a spokesman for the Strategic Rocket Forces (RVSN), told RIA Novosti (in Russian) that
“In March, the Strategic Missile Forces began to explore issues of the application of the mobile robotic system (military) (MRK BH), created for the protection and defense of the Strategic Missile Forces facilities”
This was part of “retooling” planned for 2014 at five sites “for new types of security systems, including the use of modern technological advances in the development of robotic systems.” RIA’s English version of the story seemed less tentative, stating that “testing” began in March and that the bots “will be deployed at five ballistic missile launch sites.”
TASS (in English) quoted Andreev as saying the MRK BH is designed for
“reconnaissance, spotting and destroying stationary and moving targets, fire support of military units, patrolling and protection of important facilities.”
Furthermore, according to TASS,
It can provide an option to conduct combat actions in the nighttime without de-masking factors and an option of aiming weapons, tracking and hitting targets in automatic and semi-automatic control modes. The advanced combat system is equipped with optical electronic and radar reconnaissance stations.
TASS’s Russian version does not differ significantly in these details. A search in Russian and English did not turn up other independent reports of Andreev’s statement, and in an email to me, Hambling confirmed that he did not have additional details of the statement.
A month later, Novosti VPK (literally “Military Industrial Complex News”—Russians apparently use the term without irony) reported that tests of new security equipment, including the MRK BH as well as a larger manned system, would be conducted on April 21 and 22 at Serpukhov Military Institute, a branch of the RVSN Academy. Russian TV news reports posted that day, as well as a YouTube video posted on April 25, show an apparent demo-expo of the new security equipment. A man in a black suit grabs hold of a joystick and presses a red button. The robot’s engine starts and it drives off. A squad of soldiers, guns raised, pads behind the MRK. The MRK leads a convoy of transporter-erector-launchers (which shuttle the nuke-lobbing missiles around to frustrate targeting by you-know-who). The MRK looks up, down, left and right. At a target range, it fires its machine gun.
“Mobile robotechnic complex” or “mobile robotic system” (abbreviated variously as MRC, MRK, MPK or MRTK) is actually generic Russian terminology for mobile robots, mostly unarmed, made by several companies. MRK BH appears to be a variant of the MRK-002-BG-57, the sole robotics product listed on the website of the Izhevsk Radio Plant (IRZ), and demonstrated at the Russia Arms Expo last September. A few minor changes are visible: five small housings at the front and sides, possibly for cameras, have been removed and replaced by four forward-directed headlight batteries and a swivel-mounted camera just below the gun mount (which has its own optical and infrared sighting cameras). A padded skirt, possibly armored, has been fitted around the chassis. IRZ reports the system weighs approximately 1 ton and has a “cruising range” of 250 km at up to 35 km/h, implying it is powered by a gasoline or diesel engine, audible in some of the videos. It also has batteries that run down after 10 hours, or 7 days in “sleep mode.” A Russian Wikipedia page says the chassis measures 1.7 x 3.7 meters.
All of the videos show the remote control unit, which reportedly has 5 km range. There is no doubt that the system can be fully controlled by a human operator. The interesting question is, Does it also have capabilities for lethal autonomy? If so, how do they work, and how might Russia use them? How does this compare with things that the US and its allies are doing?
How scary is that bear?
Kevin Fogarty in Dice, citing the same RIA and TASS reports, wrote that the robots “are able to select and fire on targets automatically.” The word “select” here echoes the language of the Pentagon’s policy directive, which defines target selection, rather vaguely, as “The determination that an individual target or a specific group of targets is to be engaged.” Under the US policy, this is the key thing that a robot is forbidden to do – except when it is allowed to do it. Drawing the comparison more sharply, Hambling wrote:
These robots can detect and destroy targets, without human involvement…. Andreyev describes the robots as being able to engage targets in automatic as well as semi-automatic control mode. US policy, on the other hand, says a person has to authorise when weapons are fired. Drones don’t fire missiles on their own, but act as remote launch platforms for human operators.
First, while the statement about drones (today) is accurate, the one about US policy is not. The policy green-lights the development, acquisition and use of weapons which have every technical capability for autonomous target selection and engagement (which includes killing people). Certain general criteria have to be met, similar to criteria set forth for other high-tech weapon systems, and if the system is intended to be autonomously lethal, three senior officials must “ensure” that it meets those criteria. If it does, the system can be approved, and could be turned loose to hunt and kill humans.
In fact the US has a number of systems in use, such as Patriot, THAAD, C-RAM and Aegis, which can make engagement decisions autonomously. These systems are defensive, and do not specifically target humans, but Patriot killed pilots in friendly-fire incidents during the Iraq invasion, and Aegis, operating semi-autonomously, was involved in the Iran Air 655 tragedy. On the offensive side, missile systems are in development which are intended to identify their targets fully autonomously, and would kill people because they would target ships, tanks, planes, or… people. These missiles may be considered “semi-autonomous” under the policy, and thus exempt from senior review. Operators of these systems are supposed to apply tactics, techniques and procedures to ensure that the missiles don’t go after the wrong targets.
Furthermore, if a system is not intended to be used as a fully autonomous lethal weapon system, it can have every technical capability needed to do so, e.g. it can acquire, track, identify, and prioritize potential target groups, and control firing, and still be considered semi-autonomous provided it asks for permission before actually firing. Obviously, a trivial hack would make this a fully autonomous weapon system, and developing such systems means perfecting the technology of fully autonomous weapons.
OK, OK, but what about the bear?
Both Hambling and Fogarty appear to assume that the Russian statement about “automatic and semi-automatic control mode” means the same thing as the Pentagon’s “autonomous and semi-autonomous weapon systems.” But does it?
According to IRZ, the “Robotic system has an automatic capture and the ability to conduct up to ten goals in motion. The aim is held when moving the turntable by 360 degrees.” Another page states that it can “Automatically capture and manage objectives in motion (target is held while moving the turntable 360 degrees).” Together, these two statements indicate that “automatic mode” involves target tracking and automatic fire control while the robot is in motion, but not necessarily autonomous target acquisition and selection, in the Pentagon’s terms.
Further evidence can be extracted from the previously mentioned Russian TV reports of the April tests. There are actually two segments accessible at this URL, which run consecutively. In the first, the suited man, IRZ Deputy Director General Alexei Slugin, explains that “for controlling the combat module, we have a touchscreen installed here, where we can set up to 10 targets with our finger, and the targets are then automatically followed in automatic mode… the targets are captured and held.”
In the second segment, the reporter talks to Slugin, then explains that “This is a touchscreen, where you can choose up to 10 targets, then hit the button allowing fire, and the machine will commence destroying the enemy.” He emphasizes that the machine performs “under strict control of the operator. The final decision about the destruction of the target is made by the human.” A normally-covered toggle switch apparently must be uncovered and thrown for the destruction to commence.
However, following him, RVSN officer Sergei Kotlyar explains that “All automated, automatic systems, especially military systems, are piloted only by a human, only the human makes the decision. Other than that, it is fully robotic, and when performing a narrow task, e.g., if it is known that the enemy is firing or the enemy is present, there it will be using firearms itself, calculating targets and firing at them.” This suggests that the system may, indeed, have some capability to autonomously select and engage targets. But this capability is likely crude and indiscriminate. The Russian military, like the US, is sensitive about this, and maintains that fully autonomous lethality would be activated only in hot combat.
So does this mean Russia is the bad boy on the block?
Would fielding an armed UGV with a full lethal autonomy capability, even if that capability were normally not activated, be a significant step beyond the policies, practices and plans of the US and its allies? Not if the Pentagon policy directive is given a permissive reading – and note that, since the policy is only an internal DoD directive, the Pentagon is free to read it (or not read it) any way it chooses. It is also quite similar to what South Korea and Israel are doing with stationary sentry robots.
In fact, if there is any emerging international norm, it is towards what US Navy scientist John Canning called “dial-a-level autonomy,” i.e. systems that have both human control consoles and communication links for normal operation, plus full autonomous capability for when things get real. Thus Harpy, Israel’s loitering, fully autonomous, radar-hunting suicide drone, is being superseded by Harop, which adds a two-way radio control link and electro-optical system. Similarly, following the cancellation of the fully autonomous LOCAAS loitering missile, Lockheed Martin offered SMACM, with the same multimode sensors capable of autonomous target recognition, but adding a two-way radio control link.
Hambling argues that the US has not actually deployed armed UGVs, autonomous or otherwise, despite their having been in development for “decades.” He cites the 2007 trial of three SWORDS robots in Iraq, which was “cancelled,” according to Hambling, due to “uncommanded or unexpected movements.” However, Popular Mechanics, whose initial report was widely misrepresented as having said the bots were withdrawn after swinging their guns at soldiers or even injuring them, later ran statements from SWORDS manufacturer Foster-Miller/QinetiQ saying that SWORDS was not cancelled; the government had never funded more than three of them. Furthermore, the “unexpected movements” had occurred in earlier, stateside tests, and the problems had been fixed. According to a 2013 report from the National Defense Industrial Association, the SWORDSes were in Iraq for six years and “performed a combat role,” specifically “perimeter defense.” It was claimed that they had “saved lives.”
SWORDS was never deployed as envisioned. The robots carried M249 light machine guns, but were placed in fixed locations and did not move, according to reports in 2008. Operational concepts would have had them going around buildings to shoot at snipers or other enemy combatants, without exposing soldiers to deadly fire. However, senior military officials at the time did not feel comfortable using them in that manner, and they were placed behind sandbags. [NDIA report]
There could be any number of reasons why the Army chose not to drive SWORDS out on Iraqi streets, including the risk of bad PR both locally and globally – especially if they ended up killing someone accidentally – and because their tactical usefulness is probably very limited. They are heavy, easily damaged, and would need to be carried over obstacles. Lacking peripheral vision, directional hearing, or the instinct and ability to react quickly, they would be sitting ducks in urban combat.
And today, MAARS, son of SWORDS, lives on. In recent years, it has been tested by the Marine Corps and by the Army, although no procurement contracts have been announced. We know that QinetiQ has sold at least one to the US government, and the wording of its MAARS homepage and data sheet strongly suggests that US Special Operations Command (SOCOM) is a prime target customer, so it would not be surprising if some larger number have been acquired under the black budget.
In that regard, Hambling also pointed me to yet another Russian YouTube video, this one a promo for FSB Spetsnaz, Russia’s own version of SOCOM. It shows a series of staged vignettes, which don’t even look like serious training exercises, in which shock troops kick the butts of Chechen-looking types. In the first sequence, they storm a house with the help of a small MAARS-like robot, suggesting that such weapons might already be a standard feature of their operational arsenal (although this one may be a movie prop). However, it seems extremely unlikely that gunbots would be firing autonomously in such a scenario, with troopers running in and out and around, nicely choreographed in the video but more likely frantic in real life.
Is there any conclusion?
So now, we have a video which suggests that Russian special ops may have at their disposal small armed UGVs similar to the ones that American special ops may have had since at least 2008, and we have official reports that Russian missile troops are testing armed UGVs in a role similar to the one that American troops tested for six years in Iraq. Does that make Russia a world leader in killer robots?
Arguably it might, if in fact the Russian missile site robots are substantially larger in number or herald an imminent rapid expansion in the use of such systems, or if the RVSN intends to routinely turn the robots loose with full lethal autonomy.
But the latter seems very unlikely, both because the Russians deny it and because, if you did have MRKs patrolling missile bases in fully autonomous lethal mode, they would, with near certainty, kill people. They would kill drunken soldiers, officers whose drivers misread the map, and technicians sent to find out why an MRK was unresponsive. Sooner or later, and probably sooner, somebody would get shot. There is no good reason why the RSVN would do this. It would be insane.
It may also be premature to assume that current testing of the MRK BH will lead to its permanent adoption by the RVSN, let alone a major and rapid expansion of UGV deployment by the Russian military. The aforementioned Russian Wikipedia article (6 May 2014) is openly scornful of the MRK BH, saying it is poorly designed “to work as part of the division tactical units,” that IRZ lacks experience in military robotics, and that the design harks back to Russian and German remotely operated tanks of the 1940s. While this is only anonymous commentary, it suggests that at least some Russian military observers are skeptical and that the MRK BH may have drawbacks that will cause the RVSN to reject it.
It does appear that overall, Russian industry is still sadly mired in post-Soviet mediocrity, while Putiniks are now demanding modern technology. IRZ seems to have rushed a crude model into early production, and it has been put on display as a symbol of new progress, but what it mostly reveals is not so much Russian aggression (the military officers interviewed in the videos all seem ambivalent) but rather a grim determination to meet whatever challenges a new day brings. If there is going to be a robotic arms race, Russia will not take last place in the world.
Meanwhile, the MRK BH is already world famous, so it may as well become a source of pride for Russian military robot enthusiasts. For example, the weekly “Military Industrial Courier” (vpk-news.ru) reported that “Russia took the leading position in the world in the key area of advanced weapons – ground combat robots.” Its source? David Hambling’s story in “the influential British science weekly” New Scientist.
I am deeply indebted to Tanya Lokot for translation of the Russian videos.