
AI development in sex robots has the potential for manipulation and coercion.

Issue

Justice Area(s): Gender Justice
AI Impact(s): Sexualization
Location(s): Global
Year: 2018

"AI development in sex robots requires immediate policy attention, presents special risks and opportunities to develop manipulative/coercive AI. Advanced machine learning may allow robots to cultivate love and devotion, the ability to elicit personal information or to manipulate and influence behavior. These capabilities are all theoretically possible, and perhaps more importantly, they are profitable for AI sex robots to cultivate. With the worldwide sex technology reportedly worth 30 billion USD (Kleeman, 2017), the market may incentivize the development of AI capabilities that may be vastly more consequential than blinking silicon sex dolls." Quoted from "Sex Robots — A Harbinger for Emerging AI Risk." See also: "Sexbots: The Ethical Ramifications of Social Robotics' Dark Side."


Related Issues by Justice Area

Issue 2017

Predictive AI can mistakenly out LGBTQIA+ people.

A Stanford University study demonstrated an AI model that categorizes people as gay or straight based on their faces. It quickly sparked a backlash from LGBTQIA+ rights activists, who pointed out that this technology could be employed to intentionally out and harm queer people.

Gender Justice
Issue 2020

Ignoring the richness and complexity of gender identities.

The Transgender Rights Movement pushes back against technological systems that force people into gender binaries. From Facebook's "sex" field to AI systems like Genderify that classify people as male or female based on their name, face, or other traits, machine decision-making systems have routinely ignored the complicated realities of being human that don't fit neatly into poorly conceived computer systems.

Gender Justice
Issue 2024

Some schools in the UK have installed AI-driven software that “actively listens” to students in the bathrooms.

Administrators say they want to use this software to address vaping, bullying, and other misbehavior in the bathrooms. Phrases that can trigger assistance include "help me" and "stop it." One of the companies behind the sensors, Triton, argues that it seeks to “provide an additional layer of security against threats like bullying or sexual assault in these areas, reinforcing a safe school environment … to enhance safety, not monitor everyday conversations.” But digital rights advocates say that “secretly monitoring school bathrooms is a gross violation of children’s privacy and would make pupils and parents deeply uncomfortable." This technology could also be used to discriminate against students based on gender.

Community Health and Collective Security
Gender Justice
Human Rights
Issue 2021

LGBTQIA+ hate and intolerance are accelerating in digital spaces for queer Africans.

In 2021, Access Now partnered with organizations in Ghana, Kenya, and Uganda to track legislation that accelerates "the digital repression of LGBTQ+ rights in Africa."

Gender Justice


Mozilla is a global non-profit dedicated to putting you in control of your online experience and shaping the future of the web for the public good. Visit us at foundation.mozilla.org. Most content available under a Creative Commons license.