AI models, without ecological awareness, can perpetuate and amplify environmentally damaging narratives, exacerbating ecological crises.
The integration—or lack thereof—of ecological awareness in AI systems manifests significantly in how AI influences public and private sector decisions. For instance, without ecological consideration, AI-driven recommendations in urban planning and resource management could prioritize economic gains over sustainability, leading to increased carbon footprints and depletion of natural resources. The H4rmony Project addresses this by embedding ecolinguistic principles into AI to ensure its outputs promote sustainability.
Related Issues by Justice Area
AI systems reflect cultural bias against disabled people.
The Allegheny County Department of Human Services in Pennsylvania, United States, uses an AI system that residents allege incorrectly flags disabled parents as neglectful, leading to children being removed from their homes without actual evidence of neglect. The system is currently under investigation by the United States Department of Justice.
Community Health and Collective Security Disability Justice Human Rights Issue 2022

Multilingual inconsistencies in AI systems impact linguistic hegemony.
AI can create language hegemony, where some languages are granted superior status and others are deemed inferior. Studies show that "language modeling bias can result in systems that, while being precise regarding languages and cultures of dominant powers, are limited in the expression of socio-culturally relevant notions of other communities." In extreme cases, this can even violate people's right to practice and preserve their native non-English languages.
Human Rights Racial Justice Issue 2023

The European Commission has budgeted €47m to build an “automated border surveillance system” at Greece’s borders with North Macedonia and Albania.
It is modeled on a system already in use at the land border with Turkey. The reported aim of the new automated border surveillance system is to stop people from leaving Greece: "Many people holding Syrian, Afghan, Somalian, Bangladeshi or Pakistani passports seeking asylum in the European Union move out of Greece when they have the feeling that their administrative situation will not improve there," heading through northern Greece en route to North Macedonia and Albania, per AlgorithmWatch. Studies of similar surveillance systems show that when migrants know they will be surveilled, they opt for longer, more dangerous routes to avoid detection. “People are dying outside the visual range of the towers,” said one expert.
Human Rights Issue 2024

Some schools in the UK have installed AI-driven software that “actively listens” to students in the bathrooms.
Administrators say they want to use this software to address vaping, bullying, and other misbehavior in the bathrooms. Phrases that can trigger assistance include "help me" and "stop it." One of the companies behind the sensors, Triton, argues that it seeks to “provide an additional layer of security against threats like bullying or sexual assault in these areas, reinforcing a safe school environment … to enhance safety, not monitor everyday conversations.” But digital advocates say that “secretly monitoring school bathrooms is a gross violation of children’s privacy and would make pupils and parents deeply uncomfortable." The technology could also be used to discriminate against students on the basis of gender.
Community Health and Collective Security Gender Justice Human Rights