Two weeks ago, I storified a short piece about Adam Elkus’ decision to stop referring to unmanned aerial vehicles/remotely piloted vehicles as “drones.” In it, Adam acknowledges that while it is usually better to use common language instead of jargon, “drones” is an inaccurate term, weirdly distorted from the reality of these machines.
This was highlighted especially well by a tweet from Teju Cole which read, simply, “Drones are guns too.” If this is where the common language takes us, it has become so overbroad as to be useless. Certainly, some drones deliver payloads, in much the same way that guns deliver bullets, but saying that “these are both weapons” is not a terribly useful point. At the very least, it misses the defining feature of drones, which is not that they are weapons (plenty aren’t; they perform surveillance tasks), but that they are unmanned.
This does not mean, at least not yet, that because the machines are unmanned they are undirected. The “p” in “RPV” is for “piloted,” and when drones aren’t assumed to be the apocalypse, this slips into the discourse, though often with an aside about piloting drones being like a video game, as though that were enough to rule it out as an actual, deliberate, human activity.
Take, for example, this cartoon by Matt Bors from early June:
The drone being portrayed here is clearly a caricature of what people think drones can do, but look how autonomous it is. The drone is free from what makes humans expensive, like union membership and pensions, and as an independent actor it has no accountability. Drones freely roam the skies, acting with impunity and reporting back only to their cold mechanical programming. The drone as portrayed may as well be a Terminator or a Cylon, for all the control humans have over it.
Drones may some day get to that point. But right now the robots we refer to as drones are piloted. There’s a pension for that pilot, and while there is some ambiguity over accountability when the pilot is a civilian, when they’re a uniformed soldier they fall under the Geneva Conventions. And despite critics deriding those pilots as desensitized, video-game killers, those assumptions are increasingly being proven wrong; drone centers are adapting to the same psychological challenges that are encountered in the field.
Although pilots speak glowingly of the good days, when they can look at a video feed and warn a ground patrol in Afghanistan about an ambush ahead, the Air Force is also moving chaplains and medics just outside drone operation centers to help pilots deal with the bad days — images of a child killed in error or a close-up of a Marine shot in a raid gone wrong.
Autonomous killing machines don’t need chaplains, but people tasked with watching and occasionally firing into compounds certainly do. By treating the machine as new and independent, we downplay the very real experiences of the pilots.* The robot discourse, much like the video game discourse, distracts from the unchanged nature of war. It obscures the humans actually executing policy. Most tellingly, it makes US warfighting seem detached from the reality in which it takes place. Sarah Wanenchak, at Cyborgology, highlights this quite well:
I think that Scarry and Bourke actually have it pretty much correct: the subtle dehumanizing effects of increasingly augmented warfare are not in the practice of the war-fighting itself but in the collection of official discourses that we construct around that warfare. We like to think of more highly technical warfare as cleaner, more controlled, less messy, less human – at least on our end. This kind of discourse is classically digital dualist; it assumes that the relationship between physical and digital – or between human and technological – is zero-sum in nature, and that less of one is necessarily more of the other. It rejects the notion that humanity and technology have been, are, and will be enmeshed, that the relationship between the two is complex and constantly evolving.
If we assume our weapons act of their own will, we’re free from thinking about the goals that violence is supposed to serve. This is a debate we’ve been meaning to have for a long time, but so long as we’re distracted by the newness of the tech and let science fiction stand in place of genuine understanding, we’ll continue to see wars fought without aims by weapons without controllers.
*Incidentally, this is something that has been done with air power for a long time:
Aerial bombardment of urban centers is credited with being a milestone in the history of direct civilian targeting. The guilt and psychological fallout that bomber pilots feel have been correlated with the altitude at which they were flying when they let their payload drop.