The sins of the parents are to be laid upon the children: biased humans, biased data, biased models

Published in Perspectives on Psychological Science, 2023

Technological innovations have become a key driver of societal advancement. Nowhere is this more evident than in the field of machine learning (ML), which has produced algorithmic models that shape our decisions, behaviors, and outcomes. These tools are widely used, in part, because they can synthesize massive amounts of data to make seemingly objective recommendations. Yet, in the past few years, the ML community has been raising the alarm about why we should be cautious in interpreting and using these models: they are created by humans, from data generated by humans, whose psychology allows for various biases that affect how the models are developed, trained, tested, and interpreted. As psychologists, we thus face a fork in the road. Down the first path, we can continue to use these models without examining and addressing these critical flaws, relying on computer scientists to try to mitigate them. Down the second path, we can turn our expertise in bias toward this growing field, collaborating with computer scientists to mitigate the deleterious outcomes associated with these models. This paper serves to light the way down the second path by identifying how extant psychological research can help examine and mitigate bias in ML models.

Download paper here

Recommended citation: Osborne, Merrick, Ali Omrani, and Morteza Dehghani. "The sins of the parents are to be laid upon the children: biased humans, biased data, biased models." (2022).