Discrimination related to "edge cases"
AI design choices that tolerate a margin of error around so-called "edge cases" can endanger and discriminate against the communities those cases represent. In 2020, Facebook's automated content moderation system flagged posts from Nigerian activists protesting the Special Anti-Robbery Squad (SARS), a controversial police unit that activists say routinely carries out extrajudicial killings of young Nigerians. The posts were flagged because Facebook's algorithm had listed the acronym "SARS" as a marker of misinformation about the COVID-19 virus.