Predictive AI reinforces stereotypes of poor girls and women, stripping autonomy.
AI Impact(s)

Bias and discrimination

Microsoft Azure built the machine learning platform for the Plataforma Tecnológica de Intervención Social in Salta, Argentina, which relied on biased, incomplete data in service of an anti-rights agenda. According to the commissioning party's narrative, if the Ministry gathers enough information about poor families, it can deploy conservative public policies to predict and prevent abortions by poor women. The public's persistent belief in the infallibility of mathematically derived algorithms poses a further challenge to growing digital rights movements.

Mozilla is a global non-profit dedicated to putting you in control of your online experience and shaping the future of the web for the public good. Most content is available under a Creative Commons license.