
In June 2020, the north Indian state of Haryana began using a new algorithmic system, called the Family Identity Data Repository or the Parivar Pehchan Patra (PPP) database, to determine eligibility for residents requesting financial assistance because they are unable to earn enough to survive.

Issue

Justice Area(s): Disability Justice, Economic Justice

AI Impact(s): Economic Harm and Inequity, Opaque Systems Prevent Accountability, Poor Quality Data

Location(s): Asia, India

Year: 2020

It works by pulling demographic and socioeconomic information about families from several linked government databases, drawing on records of births, deaths, marriages, employment, property ownership, income tax payments, and more. But the system has been incorrectly marking residents as dead and blocking them from much-needed funding, and human workers do not have the power to override its bad decisions. A 2023 government study of the system found that “it stopped the pensions of 277,115 elderly citizens and 52,479 widows in a span of three years because they were 'dead.' However, several thousands of these beneficiaries were actually alive and had been wrongfully declared dead either due to incorrect data fed into the PPP database or wrong predictions made by the algorithm.”


Related Issues by Justice Area

Issue 2014

Amazon's AI recruitment tool discriminated against women.

The online giant created its own recruitment tool to help evaluate the resumes of people applying for software developer positions. It was soon discovered that the algorithm was ranking applications from people it assumed were women poorly, because it had been trained on a dataset consisting primarily of men’s resumes submitted to the company over the previous decade. The system learned to penalize any resume that included the word “women’s” or listed that the candidate had graduated from certain women’s-only colleges. Use of the software has since been discontinued.

Economic Justice, Gender Justice
Issue 2013

The Michigan Unemployment Insurance Agency’s Michigan Integrated Data Automated System (MiDAS) incorrectly flagged about 40,000 people as committing unemployment fraud between 2013 and 2015.

The state worked to fix the problems caused by the AI system, but the damage was major: the wrongly accused Michigan residents were subjected to denied unemployment benefits, fines, repossessions, and even bankruptcy.

Economic Justice
Issue 2023

AI is poised to exacerbate the Black-white wealth gap in the United States.

The median wealth of Black households in the United States is just 15 percent of that of white households: Black families hold about $44,900 USD compared with white families’ $285,000 USD. This gap is the result of many systemic factors that stretch all the way back to the time of chattel slavery. The McKinsey Institute for Black Economic Mobility predicts that AI will add about $7 trillion USD to the global economy each year, with nearly $2 trillion USD of that concentrated in the U.S. But it also warns that if generative AI development continues on its current trajectory, it will widen that wealth gap; the prediction is that by 2045, the gap will grow by a whopping $43 billion USD every year.

Economic Justice, Racial Justice
Issue 2023

AI could be used to better meet the needs of disabled people, but there are currently many instances where it actively works against the disabled community.

In 2023, researchers at Pennsylvania State University published “Automated Ableism: An Exploration of Explicit Disability Biases in Artificial Intelligence as a Service (AIaaS) Sentiment and Toxicity Analysis Models,” which explores the bias embedded in several natural language processing (NLP) algorithms and models. They found that every single public model they tested “exhibited significant bias against disability,” classifying sentences as negative and toxic simply because they contained references to disability, ignoring context and the actual lived experiences of disabled people.

Community Health and Collective Security, Disability Justice

