Who the Computer Sees: Computing Culture, Computing Bias

Carla Fehr

Dr. Joy Buolamwini, an MIT computer scientist and self-described ‘poet of code,’ is the founder of the Algorithmic Justice League, an organization committed to addressing bias in machine learning (Buolamwini 2016). Buolamwini, a dark-skinned Black woman, found that she had to wear a white Halloween mask to interact successfully with the commercially available facial recognition systems she studied (Hardesty 2018). This experience moved her to test the ability of IBM’s, Microsoft’s, and Face++’s facial recognition systems to determine the gender of people with a range of skin tones. Although these systems successfully identified White men, boasting error rates of only 0.8 percent, when they turned their gaze to Black women, error rates rose to 20 percent, and for dark-skinned Black women they jumped as high as 46 percent, barely better than a coin toss. How could systems marketed by some of the world’s leading technology companies be so profoundly flawed?

The systematic exclusion of Black women from computer science schools and the technology workforce provides insight into this problem. In the US, Black women receive only three percent of computing degrees. While Black women comprise 17 percent of the women who do earn computer science degrees, they comprise only seven percent of women employed as computer scientists (McAlear et al. 2018). In Canada, Black people comprise 2.6 percent of the technology workforce, and Black women participate at lower rates than Black men (Vu et al. 2019).

It is not a coincidence that the computer scientist who noticed facial recognition software’s failure to identify Black women, and who was motivated to address this problem, is a dark-skinned Black woman. In fact, this is precisely what feminist philosophers of science such as Helen Longino (2002) and Carla Fehr (2007; 2011), who explore the epistemic benefits of diversity, would predict, and it aligns with what has since been labelled the business case for diversity.

In addition to extending arguments about the epistemic benefits of diversity to AI, I argue for the necessity of developing anti-racist and anti-sexist cultures within computer science, the technology workforce, and broader social contexts. The failure of facial recognition software to recognize Black women parallels the observations of scholars such as Lorde, Collins, Lugones, and Ortega about how women of colour are ignored, denied uptake, and sometimes, literally, unseen by White feminists. For example, María Lugones (1987) writes that US Anglo women “ignore us, ostracize us, render us invisible, stereotype us, leave us completely alone, interpret us as crazy.” I use Buolamwini’s research to show that these facial recognition systems reproduce and extend the erasure of Black women (Buolamwini and Gebru 2018). When Buolamwini wore her white mask, she could interact with early facial recognition systems. Her mask obscured the systems’ biases: they could see her only when they could see her as White. The ‘obscuring mask’ is a metaphor for how the epistemic violence of being unseen or unheard (Dotson 2011; 2014) undermines a community’s ability to recognize and ameliorate harmful biases. The epistemic benefits of diversity depend on cultures that respect diverse practitioners and give their identity-laden perspectives uptake.

Works Cited

Buolamwini, Joy. 2016. “The Algorithmic Justice League: Unmasking Bias.” MIT Media Lab. https://medium.com/mit-media-lab/the-algorithmic-justice-league-3cc4131c5148

Buolamwini, Joy and Timnit Gebru. 2018. “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification.” Proceedings of Machine Learning Research 81:1–15.

Dotson, Kristie. 2011. “Tracking Epistemic Violence, Tracking Practices of Silencing.” Hypatia 26(2): 236–257.

Dotson, Kristie. 2014. “Conceptualizing Epistemic Oppression.” Social Epistemology 28(2): 115–138.

Fehr, Carla. 2007. “Are Smart Men Smarter than Smart Women? The Epistemology of Ignorance, Women, and the Production of Knowledge.” In The ‘Woman Question’ and Higher Education: Perspectives on Gender and Knowledge Production in America, edited by Ann Mari May, 102–116. Northampton, MA: Edward Elgar.

Fehr, Carla. 2011. “What Is in It for Me? The Benefits of Diversity in Scientific Communities.” In Feminist Epistemology and Philosophy of Science: Power in Knowledge, edited by Heidi Grasswick, 133–155. Dordrecht: Springer.

Hardesty, Larry. 2018. “Study Finds Gender and Skin-type Bias in Commercial Artificial Intelligence Systems.” MIT News Office. http://news.mit.edu/2018/study-finds-gender-skin-type-bias-artificial-intelligence-systems-0212

Longino, Helen. 2002. The Fate of Knowledge. Princeton, NJ: Princeton University Press.

Lugones, María. 1987. “Playfulness, ‘World’-Travelling, and Loving Perception.” Hypatia 2(2): 3–19.

McAlear, F., A. Scott, K. Scott, and S. Weiss. 2018. “Data Brief: Women of Color in Computing.” Kapor Center / ASU CGEST.

Vu, Viet, Creig Lamb, and Asher Zafar. 2019. “Who Are Canada’s Tech Workers?” Brookfield Institute.
