This summer, the fatal shootings of unarmed black men (in St. Paul, Minnesota, Baton Rouge, and elsewhere) and of police officers (in Dallas and Baton Rouge) reenergized the national debate surrounding race relations and the use of force by police in the United States. Yet amid the now all-too-familiar characters (unarmed black men, armed police officers, bystanders with cameras) emerged a new one: a robot with the power to kill.
Micah Xavier Johnson, the Dallas shooter, will go down in history as a domestic terrorist who killed five officers and wounded nine others. He will also go down in history as the first person killed by an armed police robot. Johnson was killed in a standoff when Dallas police sent in a remotely operated Remotec bomb-disposal robot that had been jury-rigged to carry a pound of C-4 plastic explosive.
It is easy to Monday-morning-quarterback the SWAT team’s tactical judgment call in Dallas. On one hand, there was an active shooter who had just murdered five police officers protecting a peaceful protest. “Other options would have exposed our officers to grave danger,” Dallas Police Chief David Brown said at a subsequent news conference.
On the other hand, the shooter was contained and potentially could have been waited out or incapacitated, possibly even by a robot carrying a nonlethal weapon, such as a “flash bang” grenade.
But to get caught up in that narrow discussion about the tactics in one encounter is not just to second-guess those on the scene but to miss the bigger issue. Robots are here to stay, and the questions over their use will only grow more pressing. What’s more, the discussions we need to have tie back to the societal debates over policing and race relations that got us here in the first place.
Robocops aren’t the future; they’re here
Robocops and robosoldiers may seem like science fiction, but they are already a reality. Some 10,000 unmanned aerial systems (a.k.a. drones) and another 12,000 unmanned ground vehicles now serve in the US military. The US is hardly alone; more than 80 other countries also rely on robotics in their militaries.
Ground-based robots can be car-size, or they can be beer-can-size “throw bots” that can be tossed through a window; upon landing, these bots then crawl about, inspecting their surroundings. The system used in Dallas, a Remotec Mark V, weighs just under 800 pounds and is about the size of a large lawnmower.
Despite this diversity, these robots primarily take on two roles: surveillance and explosive-ordnance disposal. That is, they either gather information or help defeat bombs (or both). A system like the Remotec robot might creep up close to a suspicious package. If it turns out to be an explosive, the human operators might defeat the bomb by having the robot tear it apart, shoot high-pressure water into it, or even place a small explosive beside it to blow it up in a controlled manner.
As this tech proved itself in Iraq and Afghanistan, it also spread to police forces. Hundreds of robotic systems are now used in much the same way by bomb squads and SWAT teams in nearly every major city and many smaller police departments, just as these departments also use surveillance drones.
This technology is becoming more and more common in the civilian world outside of police departments, too. Driverless vehicles are under development at traditional car companies like Ford and Toyota as well as at tech companies like Google, Uber, and Tesla, while an autonomous delivery robot was just authorized to drive on Washington, DC’s sidewalks. Designed by the Estonian robotics company Starship Technologies, the robot can carry the equivalent of roughly two shopping bags full of anything you want.
As with drones, technology is outpacing public policy and the law
In all these areas, however, the tech has moved faster than the public policy. Without any congressional debate, we began using robotics to carry out strikes into countries with which we were not at war. This meant we conducted more than 500 drone strikes into Pakistan under rules of engagement that the White House released only years after the fact. So, too, in Dallas, the decision to use the robot to kill the gunman didn’t reflect any set policy or doctrine but was made in the midst of the standoff.
On the civilian side, we’ve seen our streets begin to be treated as sites for “beta tests,” where robotic tech gets pushed out for customers to vet and improve, just as with any other app. The difference in the case of robotics is that we’re talking about physical objects, not just software. If things go wrong, people can get hurt or even killed, as when a Tesla car driving in “autopilot” mode crashed into a truck in Florida. (The cause of that crash remains under investigation.)
Source: vox.com