Machine learning systems trained on historical data often cement existing inequalities into place.
Those who are most in need of a loan, job, university place, or housing, for example, may be exactly the groups unfairly locked out by inaccurate and biased training data. In professional recruiting, for instance, matching algorithms that draw on large, unstructured databases risk generating biased profiles of candidates. Certain candidates on gig or recruiting platforms may receive fewer employment opportunities or degraded working conditions, perpetuating pre-existing injustices. Litigation and regulation have been slow to reform such algorithmic labor platforms, and proving a case and obtaining redress for unfair treatment remains difficult.
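One common way to surface the kind of disparity described above is to compare selection rates across demographic groups, a gap often called demographic parity difference. The sketch below is purely illustrative: the group names, outcome lists, and helper functions are hypothetical, not drawn from any real platform.

```python
# Illustrative sketch: quantifying a selection-rate gap between groups.
# All data here is hypothetical; 1 = candidate advanced, 0 = rejected.

def selection_rate(decisions):
    """Fraction of candidates who received a positive decision."""
    return sum(decisions) / len(decisions)

def demographic_parity_gap(decisions_by_group):
    """Difference between the highest and lowest group selection rates.

    A gap of 0 means all groups are selected at the same rate; larger
    values indicate greater disparity in outcomes.
    """
    rates = [selection_rate(d) for d in decisions_by_group.values()]
    return max(rates) - min(rates)

# Hypothetical screening outcomes for two candidate groups.
outcomes = {
    "group_a": [1, 1, 0, 1, 1, 0, 1, 1],  # 6 of 8 advanced
    "group_b": [0, 1, 0, 0, 1, 0, 0, 1],  # 3 of 8 advanced
}

gap = demographic_parity_gap(outcomes)
print(f"Selection-rate gap: {gap:.3f}")  # 0.750 - 0.375 = 0.375
```

A large gap does not by itself prove unlawful discrimination, but metrics like this are a starting point for the kind of evidence that litigation over algorithmic hiring currently struggles to assemble.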