Biased or incomplete training datasets can produce racial discrimination in search engine results, perpetuating stereotypes and marginalization.
Safiya Umoja Noble has documented how a Google search for "professional hairstyles" returned images of white, blonde women, while a search for "unprofessional hairstyles" returned images of Black women. AlgorithmWatch has reported on machine learning systems at Google and Instagram that consistently assigned negative and harmful labels to images of people with dark skin. For example, Google's image-labeling system correctly identified a handheld thermometer when it was held by a light-skinned hand; when a dark-skinned hand held the same thermometer, the system "saw" a gun.