
Discrimination related to "edge cases."

Issue

AI Impact(s): Bias and Discrimination

AI design choices that tolerate a margin of error for so-called "edge cases" can endanger and discriminate against the communities those cases represent. In 2020, Facebook’s automated content moderation system flagged posts from Nigerian activists protesting the Special Anti-Robbery Squad (SARS), a controversial police unit that activists say routinely carries out extrajudicial killings of young Nigerians, because Facebook’s algorithm had listed the acronym “SARS” as a marker of misinformation about the COVID-19 virus.




Mozilla is a global non-profit dedicated to putting you in control of your online experience and shaping the future of the web for the public good. Visit us at foundation.mozilla.org. Most content available under a Creative Commons license.