Predictive AI can mistakenly out LGBTQIA+ people.
A Stanford University study demonstrated an AI model that categorizes people as gay or straight based on photos of their faces. The research quickly sparked a backlash from LGBTQIA+ rights activists, who pointed out that such technology could be used to intentionally out and harm queer people.
Amazon's AI recruitment tool discriminated against women.
The online retail giant built an internal recruitment tool to help evaluate resumes from applicants for software developer positions. It was soon discovered that the algorithm downgraded applications from people it assumed were women. This is because it was trained on a dataset consisting primarily of men's resumes submitted to the company over the previous decade. As a result, the system learned to penalize any resume that included the word "women's" or listed that the candidate had graduated from certain all-women's colleges. Amazon has since discontinued the tool.
Deepfakes overwhelmingly target people who identify as women.
A 2019 study from cybersecurity company DeepTrace found that 96 percent of the AI-powered deepfake videos circulating on the internet feature pornographic content. Much of it depicts women in the nude or performing sex acts without their consent. While this technology is frequently used to denigrate celebrities like rapper Megan Thee Stallion or singer Taylor Swift, it's also used to manipulate and coerce everyday people.
Some schools in the UK have installed AI-driven software that "actively listens" to students in the bathrooms.
Administrators say they want to use this software to address vaping, bullying, and other misbehavior in the bathrooms. Phrases that can trigger assistance include "help me" and "stop it." One of the companies behind the sensors, Triton, argues that it seeks to "provide an additional layer of security against threats like bullying or sexual assault in these areas, reinforcing a safe school environment … to enhance safety, not monitor everyday conversations." But digital rights advocates say that "secretly monitoring school bathrooms is a gross violation of children's privacy and would make pupils and parents deeply uncomfortable." This technology could also be used to discriminate against students based on gender.
LGBTQIA+ hate and intolerance are accelerating in digital spaces for queer Africans.
In 2021, Access Now partnered with organizations in Ghana, Kenya, and Uganda to track legislation that accelerates "the digital repression of LGBTQ+ rights in Africa."