
Big tech companies like Google and Microsoft are almost certainly using the personal data of people who use their online services to build, train, and improve their LLMs.

Issue

Justice Area(s): Economic Justice

AI Impact(s): Consumer Rights Violation, Privacy Violation

Location(s): Global

Year: 2023

If you use Gemini, Google says it can use your conversations with the chatbot, along with other sensitive personal data like your location, to improve its AI products. Google also stores any personal data it uses to improve its AI outside of its users' control: they can't review or delete it, and they have no way of knowing what personal content Google's human reviewers might be seeing. Meanwhile, Microsoft refuses to disclose whether it does the same; the company's services agreement and other key documentation are extremely unclear. This is especially troubling as Google and Microsoft rush to incorporate LLMs into their online services, including web browsers and desktops, with potential consequences for the privacy of the tens of millions of people who use those services.


Related Issues by Justice Area

Issue 2023

Medicare Advantage insurance plans use AI to determine what care they will cover for their 31 million elderly subscribers in the United States.

Journalists at Stat found that companies are specifically using AI systems to deny coverage for care. The massive problem: the algorithms are a black box, making it nearly impossible for patients to fight for health care coverage when they don't know why they were denied it in the first place.

Community Health and Collective Security, Disability Justice, Economic Justice
Issue 2024

AI hiring algorithms come complete with dangerous bias.

About 70 percent of companies (and 99 percent of Fortune 500 companies) around the world use AI-powered software to make hiring decisions and track employee productivity. The problem? The tools work by identifying and replicating patterns around who was previously hired, which means they perpetuate the bias embedded in the system, locking marginalized populations out of employment. This is particularly tough for disabled people, people of color, and disabled people of color, who are often subject to employment discrimination.

Disability Justice, Economic Justice
Issue 2020

In June 2020, the north Indian state of Haryana began using a new algorithmic system, the Family Identity Data Repository, or Parivar Pehchan Patra (PPP) database, to determine eligibility for residents seeking financial assistance because they are unable to earn enough to survive.

It works by pulling demographic and socioeconomic information for families via several linked government databases, using data on birth, death, marriage, employment, property ownership, income tax payments, and more. But it is incorrectly marking residents as dead and blocking them from much-needed funding. Meanwhile, human workers don't have the power to override bad decisions made by the system. A 2023 government study on the system found that "it stopped the pensions of 277,115 elderly citizens and 52,479 widows in a span of three years because they were 'dead.' However, several thousands of these beneficiaries were actually alive and had been wrongfully declared dead either due to incorrect data fed into the PPP database or wrong predictions made by the algorithm."

Disability Justice, Economic Justice
Issue 2023

The release of large language models (LLMs) to the public in 2023 reinvigorated a debate about the use of copyrighted data for model training and questions of appropriate credit and compensation.

Copyright concerns centered on appropriating creators' work without compensating them for its use in training models, which might generate art that competes with theirs and threatens their livelihoods. Meanwhile, some argue that AI could boost creativity. The debate about the appropriate way to pay and credit creators for derivatives of their work centers on the interpretation of fair use. Three artists filed a class-action copyright lawsuit against Stability AI (maker of Stable Diffusion) and Midjourney, as did Getty Images.

Economic Justice


Mozilla is a global non-profit dedicated to putting you in control of your online experience and shaping the future of the web for the public good. Visit us at foundation.mozilla.org. Most content available under a Creative Commons license.