The Dangers of Drones

The concern about the growing impersonality of warfare is an old one. But now more than ever, critics worry about the distance from which war is conducted. Drones, increasingly prevalent on the battlefield, can save the lives of the soldiers they replace, but they also raise new safety and ethical issues.

Drones are prone to targeting errors, which have already caused hundreds of civilian deaths in US strikes in countries such as Pakistan and Afghanistan, in addition to the thousands killed by the Taliban.1 It is hard to say whether this surpasses the toll of human error, since the statistics are not directly comparable: humans and drones carry out different kinds of missions. But it is at the very least disconcerting that the nature of drone piloting can lead to miscommunication among the people involved and to misjudgments about who is being targeted. That was the case in a 2010 attack on Afghan civilians that resulted from disagreement among a Predator pilot, the crew, and field commanders over whether children and Taliban members were present (it turned out there were children in the targeted area but no Taliban members).2 To minimize unwanted damage by drones, the Department of Defense requires that drone operations proceed only under fully approved human supervision and in accordance with the laws of war.3

The use of drones also poses risks to military personnel. A hyperbolic illustration of the psychological effects of killing from a distance can be found in the new season of Arrested Development, in which the mentally challenged Buster thinks he is playing a video game for training purposes when he is actually piloting a drone. Even pilots who are fully informed about their situation may still feel as if they are playing a video game, taking their duties less seriously than they would in person, until the force of their impact hits them in hindsight. PTSD is as common among drone pilots as among other Air Force members, but it can arise for different reasons. Though physically more distant, the pilots are in some ways more connected to their target zones as a result of their constant exposure through the computer screen. “They witness the carnage,” said Lin Otto, an epidemiologist who studied PTSD in drone pilots. In a 2011 survey, about half of drone pilots reported high stress levels, citing “long hours and frequent shift changes.” Other stressors they face include “witnessing combat violence on live video feeds, working in isolation or under inflexible shift hours, juggling the simultaneous demands of home life with combat operations and dealing with intense stress because of crew shortages.” The Pentagon has created a new medal for drone pilots and extended psychological and religious counseling services to them.4

For an interesting presentation of the moral dilemmas raised by robotic warfare, see this TED Talk by political scientist P.W. Singer.

-Suzannah Weiss

1. “Report: US Drone Attacks Rapidly Increasing in Afghanistan (Wired UK).” Wired UK. Accessed June 14, 2013. http://www.wired.co.uk/news/archive/2013-02/20/un-afghanistan-drone-deaths.

2. Drew, Christopher. “Study Cites Drone Crew in Attack on Afghans.” The New York Times, September 10, 2010, sec. World / Asia Pacific. http://www.nytimes.com/2010/09/11/world/asia/11drone.html.

3. “A Roadmap for U.S. Robotics: From Internet to Robotics.” 2013 Edition. March 20, 2013. http://robotics-vo.us/sites/default/files/2013%20Robotics%20Roadmap-rs.pdf.

4. Dao, James. “Drone Pilots Found to Get Stress Disorders Much as Those in Combat Do.” The New York Times, February 22, 2013, sec. U.S. http://www.nytimes.com/2013/02/23/us/drone-pilots-found-to-get-stress-disorders-much-as-those-in-combat-do.html.

One thought on “The Dangers of Drones”

  • Nice summary. Another relevant example from fiction is the book Ender’s Game, which I will not summarize for fear of introducing spoilers.

    I think it’s also important to point out the difference between current drones, which are teleoperated, and the kinds of autonomous drones we can envision. Autonomous drones would need careful programming/training so that their decisions are consistent with ethical (?) combat. A challenge is that existing systems can’t really assess situations according to human standards. People have a difficult time distinguishing innocent from nefarious behavior, and machine vision is even less well calibrated.
