
LGBTIQA+ hate and intolerance are accelerating in digital spaces for queer Africans.

Issue

Justice Area(s): Gender Justice
AI Impact(s): LGBTIQA+ Discrimination
Location(s): Africa (Ghana, Kenya, Uganda)
Year: 2021

In 2021, Access Now partnered with organizations in Ghana, Kenya, and Uganda to track legislation that accelerates "the digital repression of LGBTQ+ rights in Africa."


Related Issues by Justice Area

Issue 2014

Amazon's AI recruitment tool discriminated against women.

The online giant built its own recruitment tool to help evaluate the resumes of people applying for software developer positions. It was soon discovered that the algorithm ranked applications poorly when it assumed the applicants were women. This is because it was trained on a dataset consisting primarily of men's resumes submitted to the company over the preceding decade. The system learned to penalize any resume that included the word "women's" or indicated that the candidate had graduated from one of two specific women's-only universities. Use of the software has since been discontinued.
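A minimal, hypothetical sketch of how this failure mode arises, using scikit-learn and fabricated resumes and labels (this is not Amazon's system, dataset, or code): a classifier fit to historically skewed hire/reject labels ends up learning a negative weight for a gendered token.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Fabricated training data standing in for a decade of skewed hiring outcomes.
resumes = [
    "software engineer hackathon winner chess club captain",
    "software engineer open source contributor",
    "software engineer women's chess club captain",
    "software engineer women's coding society lead",
]
hired = [1, 1, 0, 0]  # historical labels reflect bias, not merit

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The token "women" (CountVectorizer drops the trailing "'s") gets a negative
# coefficient: the model has encoded the historical bias as a penalty.
idx = vectorizer.vocabulary_["women"]
print("learned weight for 'women':", model.coef_[0, idx])
```

Because "women's" appears only in resumes labeled as rejections, the model treats the word itself as evidence against a candidate, which is exactly the behavior reported in Amazon's tool.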

Economic Justice, Gender Justice
Issue 2019

Deepfakes overwhelmingly target people who identify as women.

A 2019 study from the cybersecurity company DeepTrace found that 96 percent of the AI-powered deepfake videos circulating on the internet featured pornographic content. Much of it depicts women in the nude or performing sex acts without their consent. While this technology is frequently used to denigrate celebrities like rapper Megan Thee Stallion and singer Taylor Swift, it's also used to manipulate and coerce everyday people.

Gender Justice
Issue 2024

Some schools in the UK have installed AI-driven software that “actively listens” to students in the bathrooms.

Administrators say they want to use the software to address vaping, bullying, and other misbehavior in bathrooms. Phrases that can trigger assistance include "help me" and "stop it." One of the companies behind the sensors, Triton, argues that it seeks to "provide an additional layer of security against threats like bullying or sexual assault in these areas, reinforcing a safe school environment … to enhance safety, not monitor everyday conversations." But digital rights advocates say that "secretly monitoring school bathrooms is a gross violation of children's privacy and would make pupils and parents deeply uncomfortable." The technology could also be used to discriminate against students based on gender.

Community Health and Collective Security, Gender Justice, Human Rights
Issue 2023

Companies are rushing to integrate LLMs into products even though the output of these tools cannot be fully controlled.

The debate over Google Gemini's image generation is one example. As one commentator put it, "A system that you cannot debug through a logical, Socratic process is a vulnerability that exploitative tech tycoons will use to do what they always do, undermine the vulnerable."

Community Health and Collective Security, Economic Justice, Environmental Justice, Gender Justice, Racial Justice

