Some schools in the UK have installed AI-driven software that “actively listens” to students in the bathrooms.
Administrators say they want to use the software to address vaping, bullying, and other misbehavior in the bathrooms. Phrases that can trigger assistance include "help me" and "stop it." One of the companies behind the sensors, Triton, argues that it seeks to “provide an additional layer of security against threats like bullying or sexual assault in these areas, reinforcing a safe school environment … to enhance safety, not monitor everyday conversations.” But digital rights advocates counter that “secretly monitoring school bathrooms is a gross violation of children’s privacy and would make pupils and parents deeply uncomfortable.” The technology could also be used to discriminate against students on the basis of gender.
Related Issues by Justice Area
AI could be used to better meet the needs of disabled people, but there are currently many instances where it actively works against the disabled community.
In 2023, researchers at Pennsylvania State University published “Automated Ableism: An Exploration of Explicit Disability Biases in Artificial Intelligence as a Service (AIaaS) Sentiment and Toxicity Analysis Models,” which explores the bias embedded in several natural language processing (NLP) algorithms and models. They found that every single public model they tested “exhibited significant bias against disability,” classifying sentences as negative and toxic simply because they contained references to disability, ignoring context and the actual lived experiences of disabled people.
Community Health and Collective Security · Disability Justice · Issue · 2023
AI models, without ecological awareness, can perpetuate and amplify environmentally damaging narratives, exacerbating ecological crises.
The integration—or lack thereof—of ecological awareness in AI systems manifests significantly in how AI influences public and private sector decisions. For instance, without ecological consideration, AI-driven recommendations in urban planning and resource management could prioritize economic gains over sustainability, leading to increased carbon footprints and depletion of natural resources. The H4rmony Project addresses this by embedding ecolinguistic principles into AI to ensure its outputs promote sustainability.
Environmental Justice · Human Rights · Issue · 2017
AI systems reflect the culture's bias against disabled people.
The Allegheny County Department of Human Services in Pennsylvania, United States, uses an AI system that residents allege incorrectly flags disabled parents as neglectful, leading to children being removed from their homes with no actual evidence of neglect. The system is currently under investigation by the United States Department of Justice.
Community Health and Collective Security · Disability Justice · Human Rights · Issue · 2023
Medicare Advantage insurance plans use AI to determine what care they will cover for their 31 million elderly subscribers in the United States.
Journalists at Stat found that companies are specifically using AI systems to deny coverage for care. The core problem: the algorithms are a black box that cannot be examined, making it nearly impossible for patients to contest a denial when they don’t know why they were denied coverage in the first place.
Community Health and Collective Security Disability Justice Economic Justice