Raunchy robotics: the ethics of sexbots

You may have seen recent Kia Forte commercials featuring “Hotbots” — their version of a female sex robot, commonly known as a fembot.1 The ad implies that the car’s owner is using the robot for sex. Kia used a human actress for the robot,2 but such advanced technology is not far off, and robots* for such uses already exist, ranging in complexity from inflatable dolls to realistic-looking and -feeling machines with built-in motion. They are especially popular in Japan and South Korea, where individuals rarely own them because they sell at six-figure prices**, but rent-a-doll “escort” services make a profit by renting them out to customers.3 There is also a significant following among American men with “robot fetishes” who use fembots as sexual and/or romantic partners.4

Personally, I find the whole practice creepy. But as with any sexual practice, finding something distasteful is not in itself a reason to oppose others doing it. In my opinion, though, there are some justifiable reasons one might oppose (if not prohibit) sexbots:

They can promote unhealthy attitudes toward relationships. Those who market sex robots aim to produce, or at the very least encourage, desires for unbalanced relationships in which one partner has total control over the other (particularly, as the market now stands, relationships in which men have total control over manufactured women). Teaching people that sex with an unresponsive, undesiring partner is desirable could perpetuate rape culture. One male user raved that the dynamic was “as close to human slavery as you can get.”4

They could be considered prostitutes. South Korean authorities have debated whether rent-a-doll escort services, which were created to dodge prostitution laws, should also be illegal.3

They could encourage gender stereotypes (and possibly stereotypes of any kind). Manufacturers already ascribe gendered characteristics to other types of robots.5 Sex robots open the door for the promotion of stereotypes of women as submissive fulfillers of men’s desires or “men as perpetually-ready sexual performance machines.”6 A current line of sex robots offers different models embodying problematic stereotypes, such as Wild Wendy and Frigid Farrah (racializing as well as gendering robots).

They could promote misogyny. Fembots are significantly more popular than their malebot counterparts, and they possess traditionally “feminine” qualities: submissiveness, adherence to beauty ideals, and the desire (if one could call it that) to please men without having any desires of their own. Like the Stepford Wives, fembots allow men to fulfill patriarchal fantasies in which women exist solely to please them. One male user said “the whole idea of man taking charge” attracted him to fembots.4

They may replace real relationships and distance users from their partners or other potential partners. A plurality of Americans polled indicated that sex with a robot while in a relationship is cheating.5 Whether or not it is infidelity, robots designed to meet needs that humans cannot, such as “lack of complaints or constraints,”3 may set standards that human partners cannot realistically live up to in competing for attention. Moreover, having sexual and possibly even romantic interactions that require no effort may discourage users from meeting real people. And people with partners who are “immune … to their clients’ psychological inadequacies”3 may never address those inadequacies, making it even harder to form real relationships.

Users could develop unhealthy attachments to robots. Unless robots achieve consciousness (which merits a separate blog post), people could develop feelings for robots that don’t feel anything back. This may sound silly, but people already name and dress up their Roombas. It is natural for people to develop attachments to sexual partners — whether this holds when the partner is not human remains to be seen — and some users even hope for their sexbots to double as romantic partners.4 An asymmetrical relationship of this nature could end up unfulfilling, isolating, or depressing.

Whether sexbots are ethical is a separate issue from whether they should be legal. For countries and US states that ban sex toys altogether, the decision will be obvious. Countries that ban prostitution, like South Korea, may debate whether a machine can be considered a prostitute. Some laws banning sex toys in the US have been struck down on the grounds of privacy, and I suspect these same arguments will come up for sexbots. David Levy, a scholar on the topic, argues that sexbots are no more legally problematic than vibrators.3 But sexbots have raised and will further raise ethical, if not legal, issues specific to their goal of simulating human beings.

-Suzannah Weiss

* It is debatable whether the current generation of sex dolls can be considered robots, but they do seem to be headed in that direction.

**This information may be outdated, as there are currently sex robots on the market for four-figure prices.

1 “Kia Chalks up Another Ad as a Sexist Fail | About-Face.” About-Face. Accessed June 18, 2013. http://www.about-face.org/kia-chalks-up-another-ad-as-a-sexist-fail/.

2 “2014 Kia Forte Super Bowl Ad Features Sexy Robots | Edmunds.com.” Edmunds. Accessed June 18, 2013. http://www.edmunds.com/car-news/2014-kia-forte-super-bowl-ad-features-sexy-robots-disrespectful-reporter.html.

3 Levy, David. “The Ethics of Robot Prostitutes.” Robot Ethics: The Ethical and Social Implications of Robotics. Ed. Patrick Lin. Cambridge, Mass.: MIT Press, 2012. Print.

4 “Discovery Health ‘Sex Robot Roxy: Sex Robot’.” Discovery Fit and Health. Accessed June 18, 2013. http://health.discovery.com/tv-shows/specials/videos/sex-robot-sex-robot.htm.

5 “Robot Sex Poll Reveals Americans’ Attitudes About Robotic Lovers, Servants, Soldiers.” Huffington Post, April 10, 2013. http://www.huffingtonpost.com/2013/04/10/robot-sex-poll-americans-robotic-lovers-servants-soldiers_n_3037918.html.

6 Brod, Harry. “Pornography and the Alienation of Male Sexuality.” Social Theory and Practice 14.3 (1988): 265–84.

The Dangers of Drones

The concern about the growing impersonality of warfare is an old one. But now more than ever, critics worry about the distance from which war is conducted. Drones, increasingly prevalent on the battlefield, can save the lives of the soldiers they replace, but they also present new safety and ethical issues.

Drones are prone to targeting errors, which have already caused hundreds of civilian deaths in US strikes on countries such as Pakistan and Afghanistan, in addition to the thousands of civilians killed by the Taliban.1 It is hard to say whether this error rate surpasses that of humans — the statistics are not directly comparable because humans and drones carry out different types of missions. But it is at the very least disconcerting that the nature of drone piloting can lead to miscommunication among the people involved and misjudgments about who is being targeted. That was the case in a 2010 attack on Afghan civilians, which resulted from disagreement among a Predator pilot, the crew, and field commanders over whether children and Taliban members were present (it turned out there were children in the targeted area but no Taliban members).2 To minimize unwanted damage by drones, the Department of Defense dictates that operations involving them must proceed only under fully approved human supervision, in accordance with the laws of war.3

The use of drones also poses risks to military personnel. A hyperbolic illustration of the psychological effects of killing from a distance can be found in the new season of Arrested Development, in which the mentally challenged Buster thinks he is playing a video game for training purposes when he is actually acting as a drone pilot. Even pilots fully informed about their situation may feel as if they are playing a video game, viewing their duties less gravely than they would in person, until the weight of their actions hits them in hindsight. PTSD is as common among drone pilots as among other Air Force members, but it can arise for different reasons. Though more physically distant, the pilots are in some ways more connected to their target zones because of their constant exposure through the computer screen. “They witness the carnage,” said Jean Lin Otto, an epidemiologist who studied PTSD in drone pilots. In a 2011 survey, about half of drone pilots reported high stress levels, citing “long hours and frequent shift changes.” Other stressors they face include “witnessing combat violence on live video feeds, working in isolation or under inflexible shift hours, juggling the simultaneous demands of home life with combat operations and dealing with intense stress because of crew shortages.” The Pentagon has created a new medal for drone pilots and extended psychological and religious counseling services to them.4

For an interesting presentation of moral dilemmas related to robotic warfare, see this TED Talk by political scientist P.W. Singer.

-Suzannah Weiss

1. “Report: US Drone Attacks Rapidly Increasing in Afghanistan (Wired UK).” Wired UK. Accessed June 14, 2013. http://www.wired.co.uk/news/archive/2013-02/20/un-afghanistan-drone-deaths.

2. Drew, Christopher. “Study Cites Drone Crew in Attack on Afghans.” The New York Times, September 10, 2010, sec. World / Asia Pacific. http://www.nytimes.com/2010/09/11/world/asia/11drone.html.

3. “A Roadmap for U.S. Robotics: From Internet to Robotics.” 2013 Edition. March 20, 2013. http://robotics-vo.us/sites/default/files/2013%20Robotics%20Roadmap-rs.pdf.

4. Dao, James. “Drone Pilots Found to Get Stress Disorders Much as Those in Combat Do.” The New York Times, February 22, 2013, sec. U.S. http://www.nytimes.com/2013/02/23/us/drone-pilots-found-to-get-stress-disorders-much-as-those-in-combat-do.html.