Who Will Drive the Revolution?


“I will remember that artificially intelligent machines are for the benefit of humanity and will strive to contribute to the human race through my creations.”

Lee McCauley’s proposed Hippocratic oath for roboticists (2007)

The field of robot ethics often focuses on ethical problems with creating robots. But because certain populations are in need of the extra care, protection, or labor that robots provide, it can be unethical not to create robots. The US Congress’s Robotics Caucus Advisory Committee’s 2013 Roadmap for US Robotics documents how robots can improve and even save lives. At work, they can “assist people with dirty, dull, and dangerous tasks;” at home, they can offer “domestic support to improve quality of life;” and in warzones, “robots have already proven their value in removing first-responders and soldiers from immediate danger.” The US government’s National Robotics Initiative (NRI) hopes to encourage robotics in order to boost American manufacturing, space exploration, medical discoveries, and food safety.

Many roboticists share the vision of using robots to solve some major problems the world currently faces. For example, an area receiving particular attention among researchers is elderly care. The number of people over 80 worldwide is expected to double by 2030, leading to increased demand for living assistance. Smart homes and robot caretakers can help these people stay independent longer (1). Even in military use, robots are not necessarily killing machines. iRobot CEO Colin Angle said that his company’s PackBot has saved the lives of EOD technicians by hunting IEDs in Afghanistan and Iraq. Furthermore, drones have proven applicable beyond the military: Association for Unmanned Vehicle Systems International CEO Michael Toscano estimated in July 2013 that agriculture would overtake defense as the most common use for drones within the next 10 years, noting that this advancement will help feed an ever-growing population.

Those who fund and use robots can ensure that robots’ potential to do good for the world does not go to waste. An active approach may be necessary because research may otherwise be driven simply by potential profits.


“A profitable market for expensive new technology is not of itself sufficient moral defense for allowing widespread sales.” -Whitby (2012)

The first ethical question in the development of robots is what kinds of robots to make. Are we creating the kinds of robots that our society needs? The answer to this question depends on the motivations not only of researchers but also of their sponsors. The main sources for robotics funding are the government, businesses, and research institutions.

Government funding

The government has played a particularly significant role in robotics funding since 2011, with the advent of the NRI, whose mission is to fund research that facilitates human-robot interactions. Its 2011 and 2012 solicitations were subtitled “The realization of co-robots acting in direct support of individuals and groups” and stated that proposals would be selected by a peer-review process. The National Science Foundation listed the recipients of $30 million of the $50 million awarded for this initiative (the destination of the other $20 million is unclear). The plurality of the projects listed (12 out of 31) named medicine as an application for their robots, including aid in rehabilitation and surgery, followed by domestic care — either general service robots or those geared toward the elderly or incapacitated — and manufacturing, such as industrial robots for use in factories (2).

As part of the NRI, the Defense Advanced Research Projects Agency (DARPA)’s FY2012 solicitation for the Defense University Research Instrumentation Program (DURIP) encouraged proposals for robotics research equipment applicable to the military (Army Research Office). Of the 190 recipients of DURIP grants, which totaled about $55 million, 22 projects involved robots, autonomous machines or vehicles, or artificial intelligence. Many robotics-related projects were not directly or solely applicable to the military, including 3 grants for general human-robot interaction research and 7 for ocean surveillance and research. Other objects of study included the response of miniature air drones to wind and water-splitting catalysts for autonomous energy generation.

DARPA will account for nearly half of the government’s research and development funding under Obama’s FY2014 plan. Though this suggests that research applicable to the military may have priority, it does not necessarily mean that the robotics projects DARPA sponsors are primarily for killing. For example, the agency is currently sponsoring a $2 million “Robotics Challenge” in which robots compete in a series of tasks simulating natural and man-made disasters. The press release cites robots that defuse explosives as an example of the type of innovation DARPA is looking for. The Robotics Challenge has sponsored robots made by universities, companies, and government organizations such as NASA.

However, government funding of robotics has its critics, perhaps the most outspoken being Oklahoma Senator Tom Coburn. Coburn criticized a $1.2 million NSF grant for a laundry-folding robot at UC Berkeley and a $6,000 grant for a “Robot Rodeo and Hoedown” at a computer science education symposium to introduce teachers to robotics (3). A researcher in the UC Berkeley group countered that the laundry-folding algorithm was a big step for robotics more broadly, and the organizers of the robot rodeo explained that they were trying to get more students interested in computer science.

Venture capital funding

Venture capital is another major source of robotics funding. In 2012, the 22 most funded robotics companies accrued almost $200 million total from VC funds (4). As with the NRI funds, the plurality went toward medicine — about $84 million went to companies that create robots for surgery (e.g., MedRobotics and Mazor Robotics), bionics (e.g., iWalk), and hospital service (e.g., Aethon). Another $30 million went to Rethink Robotics for the manufacturing robot Baxter.

Some have criticized excessive funding for entertainment robots, such as a robotic toy car largely funded by Andreessen Horowitz. The car’s maker, the new start-up Anki, raised $50 million total in 2012 and 2013. Anki’s founders believe this investment is justified because their technology can lead to more advanced and affordable robots down the line.

Research institution funding

Universities themselves sometimes fund robotics research, but these funds are often funneled from the sources above — in the United States, particularly the government. The Robotics Institute at Carnegie Mellon, the first and largest robotics department at any US university, lists its biggest sponsors as the Department of Defense, DARPA, the National Aeronautics and Space Administration, the National Institutes of Health, and the NSF. The Georgia Tech Center for Robotics and Intelligent Machines and the MIT Media Lab get sponsorship from companies in the robotics industry in exchange for developing their products. This model appears to be the most common one in Japan, China, and South Korea, according to a WTEC report. Google and other companies have grants specifically for funding technological research, along with private foundations such as the Keck Foundation and the Alfred P. Sloan Foundation.

While grants are usually given for specific projects, some departments retain control over which projects their funds go toward. The Robotics Institute at Carnegie Mellon has a budget of $65 million per year, which it has used to fund 30 start-up companies in addition to supporting its own institution. Carnegie Mellon also, along with companies like Google and Yahoo and private donors, sponsors a foundation called TechBridgeWorld that has aided technological progress in developing areas of the world by funding innovations such as an “automated tutor” to improve literacy. This exemplifies how academics and researchers can use their knowledge of what people need and how to most efficiently meet those needs to influence production.


Because robotics companies, like any company, strive to make money, the purchase of robots also affects which robots get made. The International Federation of Robotics estimated that 2.5 million personal and domestic robots (e.g., those that do household chores or assist people with medical problems) and about 16,000 professional robots for use in various workplaces were sold worldwide in 2011. The professional robots consisted mostly of military (40%) and agricultural (31%) robots, but also included robots for other uses such as medicine (6%) and manufacturing (13%). The Robotics Industries Association estimated in July 2013 that 230,000 robots were in use in United States factories, most often in the automotive industry.

Fifty years ago, we couldn’t imagine computers as part of our everyday lives. Now, we can’t imagine our lives without them. Robots are the new computers, in that they are capable of revolutionizing our society and economy. Whether this revolution is for better or for worse is up to all the aforementioned players.

-Suzannah Weiss

1 Whitby, B. (2012). Do You Want a Robot Lover? The Ethics of Caring Technologies. In P. Lin (Ed.), Robot Ethics: The Ethical and Social Implications of Robotics. Cambridge, MA: MIT Press.
2 National Science Foundation (2012). Press Release 12-166: Nearly $50 Million in Research Funding Awarded by NSF-Led National Robotics Initiative to Develop Next-Generation Robotics. Retrieved from http://www.nsf.gov/news/news_summ.jsp?cntn_id=125390.
3 Coburn, T., Oklahoma Senator (2011). The National Science Foundation: Under the Microscope.
4 Deyle, T. and Kandan, S. (2013). Venture Capital (VC) Funding for Robotics in 2012. Hizook: Robotics News for Academics and Professionals. Retrieved from http://www.hizook.com/blog/2013/06/10/venture-capital-vc-funding-robotics-2012.


The Dangers of Drones

The concern about the growing impersonality of warfare is an old one. But now more than ever, critics worry about the distance from which war is conducted. Drones, increasingly prevalent in war zones, can save the lives of the soldiers they replace, but they also present new safety and ethical issues.

Drones are prone to targeting errors, which have already caused hundreds of civilian deaths by the US in countries such as Pakistan and Afghanistan, in addition to thousands at the hands of the Taliban.1 It is hard to say whether this error rate surpasses that of humans, because humans and drones carry out different types of missions and the statistics are not directly comparable. But it is at the very least disconcerting that the nature of drone piloting can lead to miscommunications among correspondents and misjudgments of who is being targeted. In a 2010 attack on Afghan civilians, for example, a Predator pilot, the crew, and field commanders disagreed over whether children and Taliban members were present in the targeted area (it turned out there were children there but no Taliban members).2 To minimize unwanted damage by drones, the Department of Defense dictates that operations involving drones must proceed only under fully approved human supervision in accordance with the laws of war.3

The use of drones also poses risks to military personnel. A hyperbolic illustration of the psychological effects of killing from a distance can be found in the new season of Arrested Development, in which the mentally challenged Buster thinks he is playing a video game for training purposes when he is actually acting as a drone pilot. Even those informed about the situation may still feel as if they are playing a video game, viewing their duties less gravely than they would in person, until the force of their impact hits them in hindsight. PTSD is as common among drone pilots as among other air force members, but can arise for different reasons. Though more distant, the pilots are in some ways more connected to their target zones as a result of their constant exposure through the computer screen. “They witness the carnage,” said Lin Otto, an epidemiologist who studied PTSD in drone pilots. In a 2011 survey, about half of drone pilots reported high stress levels, citing “long hours and frequent shift changes.” Other stressors they face include “witnessing combat violence on live video feeds, working in isolation or under inflexible shift hours, juggling the simultaneous demands of home life with combat operations and dealing with intense stress because of crew shortages.” The Pentagon has created a new medal for drone pilots and extended psychological and religious counseling services to them.4

For an interesting presentation of moral dilemmas related to robotic warfare, see this TED Talk by political scientist P.W. Singer.

-Suzannah Weiss

1. “Report: US Drone Attacks Rapidly Increasing in Afghanistan (Wired UK).” Wired UK. Accessed June 14, 2013. http://www.wired.co.uk/news/archive/2013-02/20/un-afghanistan-drone-deaths.

2. Drew, Christopher. “Study Cites Drone Crew in Attack on Afghans.” The New York Times, September 10, 2010, sec. World / Asia Pacific. http://www.nytimes.com/2010/09/11/world/asia/11drone.html.

3. “A Roadmap for U.S. Robotics: From Internet to Robotics.” 2013 Edition. March 20, 2013. http://robotics-vo.us/sites/default/files/2013%20Robotics%20Roadmap-rs.pdf

4. Dao, James. “Drone Pilots Found to Get Stress Disorders Much as Those in Combat Do.” The New York Times, February 22, 2013, sec. U.S. http://www.nytimes.com/2013/02/23/us/drone-pilots-found-to-get-stress-disorders-much-as-those-in-combat-do.html.