
AI opacity can affect societal processes such as elections and school placement.


AI Impact(s)

Opaque Systems Prevent Accountability

AI systems are rarely transparent. Developers keep their inputs and methods secret for competitive reasons, to prevent gaming of the algorithm (as with Google Search), and often because they themselves aren't entirely sure how the AI arrives at its outcomes. This is a concern for commercial products, but it's even more problematic when those products affect societal issues. For example, The Markup found that Gmail's black-box algorithm controls which political emails arrive in subscribers' main inboxes, and different 2020 presidential candidates had starkly different deliverability rates. Opacity is even more troubling when governments use AI to determine which school a child attends or whether someone qualifies for social benefits. Major resource-allocation decisions like these need to be auditable and accountable, and secret algorithms don't meet those standards.






Mozilla is a global non-profit dedicated to putting you in control of your online experience and shaping the future of the web for the public good. Most content is available under a Creative Commons license.