Humanoids for Human-olds: Sociable Service Robots Could Be the First Wave of the Robotics Revolution

As computer and mechanical engineering become more advanced, we are drawing ever closer to a revolution in robotics, and it’s coming just in time. By 2030, the elderly populations of the U.S., Japan, and Europe will explode; worldwide, the number of people over the age of 80 will double. (1) Meanwhile, life expectancy continues to rise. As fertility rates in these developed countries fall below the replacement rate, (2) the workforce will not be able to support the aging population without dramatic improvements in productivity. Advances in the burgeoning field of medical robotics, however, may offer a way for the elderly to sustain their health and independence indefinitely. Robots such as the inMotion ARM Interactive Robot™ are on the market today, providing sensory-motor therapy for stroke patients who are re-learning how to move their limbs. It is not difficult to imagine sensing robots capable of detecting and recording vital signs, administering medicines, and assisting with the tasks of day-to-day living that become difficult at advanced ages. From this perspective, robots have several advantages even over human caregivers: more strength, more endurance, more consistency, and, importantly, less pity.

However, introducing medical and service robots to the elderly presents its own set of challenges. This model deviates from standard paradigms of technology adoption because the new technology will not be pitched to the younger, more adaptable generation. Because these robots may need to serve people relatively unfamiliar with computer interfaces and operating modes, they must be as understandable and intuitive as the human caregivers they could replace. In short, these service robots must be able to interact socially with their patients.

Fortunately, robotics is a field that has long ensnared our narcissistic, anthropomorphic tendencies. One dream of robotics engineers and researchers everywhere is to create an accurate emulation of a human. One lab recently founded with this goal is GENI Lab. Co-founded by RISD alumnus David Hanson, GENI Lab states as its central goal “the creation of a life-sized humanoid robot featuring a realistic, emotional face and personality.” (3) Hanson (who also took AI classes at Brown!) has gained fame in the robotics community for his daring and extraordinary attempts to overcome what is known in the industry as the Uncanny Valley. Coined by roboticist Masahiro Mori in 1970, the term refers to the tendency of realistic robots and animations to become suddenly creepy as they approach a human appearance. With Hanson at the helm, GENI Lab may be among the first to achieve this Holy Grail of robotics.

However, creating humanistic robots requires more than just a pretty face. A good human-robot interaction (HRI) relationship should foster emotion, engagement, and trust. The robots must be able to perceive and communicate intent, both verbally and nonverbally. Hanson’s heads might help with the latter requirement, but the former remains an open problem. Robots have an unparalleled ability to fuse data streams from multiple sources, and with the help of artificial intelligence, this signature skill lets them combine varied physiological data to interpret human emotions and actions. Imagine a robot capable of combining your medical history, current heart rate, recent interaction with your spouse, and what you had for breakfast to determine whether you are feeling angry – and then easing off on your prescribed exercise regimen for the day. The next big step in robotics is to create an all-purpose AI capable of making these inferential decisions while simultaneously learning and interacting with humans in a dynamically changing environment.
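As a toy illustration of the kind of data fusion described above (not any real caregiving system), consider the following sketch: a handful of hypothetical signals are combined into a crude "frustration" score, which then adjusts the day's exercise plan. All of the signal names, weights, and thresholds here are invented for the example.

```python
# A toy sketch of multi-signal fusion for emotion inference.
# Every threshold and weight below is a made-up assumption, not a real model.

def frustration_score(resting_hr, current_hr, argued_recently, skipped_breakfast):
    """Combine simple physiological and contextual signals into a 0-1 score."""
    score = 0.0
    # Elevated heart rate relative to the patient's own baseline.
    if current_hr > resting_hr * 1.2:
        score += 0.5
    # Contextual signals from earlier in the day.
    if argued_recently:
        score += 0.3
    if skipped_breakfast:
        score += 0.2
    return min(score, 1.0)

def adjust_exercise(planned_minutes, score, threshold=0.6):
    """Ease off the prescribed regimen when the fused score suggests a bad day."""
    return planned_minutes // 2 if score >= threshold else planned_minutes

score = frustration_score(resting_hr=70, current_hr=90,
                          argued_recently=True, skipped_breakfast=False)
print(adjust_exercise(30, score))  # prints 15: a tense day halves the session
```

A real system would of course replace these hand-tuned rules with learned models over far richer data streams, but the structure — many weak signals fused into one actionable inference — is the same.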

The use of social service and medical robots to assist the elderly could provide a foothold for robots to gain more widespread acceptance. By filling an otherwise unmet need for full-time caregivers, these robots could form attachments with their patients – who would be able to remain independent longer – and with their patients’ families. By reinventing the public image of a robot as an emotive assistant, this first wave of humanized robot helpers could pave the way for robot assistance in the rest of our society.



1. “A Roadmap for U.S. Robotics: From Internet to Robotics.” 2013 Edition. March 20, 2013.


2. U.N. Economic and Social Council. “World Population to 2300.” 2004


3. “About.” GENI LAB, 2013. Web. 21 June 2013.

The Dangers of Drones

The concern about the growing impersonality of warfare is an old one. But now more than ever, critics worry about the distance from which war is conducted. Drones, increasingly prevalent on battlefields, can save the lives of the soldiers they replace, but they also present new safety and ethical issues.

Drones are prone to targeting errors, which have already caused hundreds of civilian deaths by the U.S. in countries such as Pakistan and Afghanistan, in addition to thousands at the hands of the Taliban.1 It is hard to say whether this surpasses human error; the statistics on drones are not directly comparable to those on human error, because humans and drones carry out different types of missions. But it is at the very least disconcerting that the nature of drone piloting can lead to miscommunication among correspondents and misjudgment of who is being targeted. In a 2010 attack on Afghan civilians, for instance, a Predator pilot, the crew, and field commanders disagreed over the presence of children and of Taliban members in the targeted area (it turned out there were children there, but no Taliban members).2 To minimize unwanted damage by drones, the Department of Defense dictates that operations involving drones proceed only under fully approved human supervision, in accordance with the laws of war.3

The use of drones also poses risks to military personnel. A hyperbolic illustration of the psychological effects of killing from a distance appears in the new season of Arrested Development, in which the mentally challenged Buster thinks he is playing a video game for training purposes when he is actually piloting a drone. Even those fully informed about the situation may still feel as if they are playing a video game, viewing their duties less gravely than they would in person, until the force of their impact hits them in hindsight. PTSD is as common among drone pilots as among other Air Force members, but it can arise for different reasons. Though more distant, the pilots are in some ways more connected to their target zones through their constant exposure via the computer screen. “They witness the carnage,” said Lin Otto, an epidemiologist who studied PTSD in drone pilots. In a 2011 survey, about half of drone pilots reported high stress levels, citing “long hours and frequent shift changes.” Other stressors include “witnessing combat violence on live video feeds, working in isolation or under inflexible shift hours, juggling the simultaneous demands of home life with combat operations and dealing with intense stress because of crew shortages.” The Pentagon has created a new medal for drone pilots and extended psychological and religious counseling services to them.4

For an interesting presentation of the moral dilemmas of robotic warfare, see this TED Talk by political scientist P.W. Singer.

-Suzannah Weiss

1. “Report: US Drone Attacks Rapidly Increasing in Afghanistan (Wired UK).” Wired UK. Accessed June 14, 2013.

2. Drew, Christopher. “Study Cites Drone Crew in Attack on Afghans.” The New York Times, September 10, 2010, sec. World / Asia Pacific.

3. “A Roadmap for U.S. Robotics: From Internet to Robotics.” 2013 Edition. March 20, 2013.

4. Dao, James. “Drone Pilots Found to Get Stress Disorders Much as Those in Combat Do.” The New York Times, February 22, 2013, sec. U.S.