
AI might be deployed to help in the fight for social change.

Issue

Justice Area(s)

Community Health and Collective Security

Disability Justice

Economic Justice

Environmental Justice

Gender Justice

Human Rights

Racial Justice

AI Impact(s)

Public Interest Tech

Location(s)

Global

Year

2024

Social movements don't just suffer from AI's harms; they also employ it toward their own missions. Public interest tech AI projects leverage emerging technology to boost their work on existing challenges, including in the areas of Racial Justice, Gender Justice, Environmental Justice, Economic Justice, and Community Health and Collective Security.


Related Issues by Justice Area

Issue 2017

Predictive AI can mistakenly out LGBTQIA+ people.

A Stanford University study demonstrated an AI model that categorizes people as gay or straight based on their faces. It quickly sparked a backlash from LGBTQIA+ rights activists, who pointed out that this technology could be used to intentionally out and harm queer people.

Gender Justice
Issue 2020

Ignoring the richness and complexity of gender identities.

The Transgender Rights Movement pushes back against technological systems that force people into gender binaries. From Facebook's "sex" field to AI systems like Genderify that sort people into a male/female binary based on their name, face, or other traits, machine decision-making systems have routinely ignored the more complicated realities of being human when those realities don't fit neatly into poorly conceived computer systems.

Gender Justice
Issue 2018

Discriminatory targeted ads focusing on housing, jobs, and credit persist at Facebook, even after changes.

Facebook was allowing advertisers to exclude certain groups when advertising within federally regulated markets like housing and jobs. Facebook often sidestepped the concerns using technicalities, such as classifying users not by race but by so-called "ethnic" or "multicultural" affinities: self-selected identity groups consistent with what a majority of Black, Latine, or other minoritized people might statistically like on the platform. Only after rigorous coverage from media outlets, most prominently ProPublica, did Facebook eventually disable ad-targeting options for housing, job, and credit ads as part of a legal settlement with civil rights groups (the National Fair Housing Alliance, the American Civil Liberties Union, and the Communication Workers of America). However, even after Facebook changed its ad platform to prevent advertisers from selecting attributes like "ethnic affinity," the platform was found to still enable discrimination by allowing advertisers to target users through proxy attributes.

Racial Justice
Issue 2017

AI systems reflect the culture's bias against disabled people.

The Allegheny County Department of Human Services in Pennsylvania, United States, uses an AI system that residents allege incorrectly flags disabled parents as neglectful, leading to children being removed from their homes without actual evidence of neglect. The system is currently under investigation by the United States Department of Justice.

Community Health and Collective Security Disability Justice Human Rights


Mozilla is a global non-profit dedicated to putting you in control of your online experience and shaping the future of the web for the public good. Visit us at foundation.mozilla.org. Most content available under a Creative Commons license.