Tempest M. Henning
Social media justice advocates strongly discourage tapping the epistemic resources of Black, Indigenous, and People of Color (BIPOC). The barrage of questions, requests for more resources, or demands that BIPOC individuals explain their lived experiences can be a drain on one’s mental and emotional health. Allies and critics are often directed to do their own research regarding colonially fueled racist biases. However, this paper will explore the problematic implications of ‘just google it’ not in terms of argumentative practices, but in terms of the construction of racially biased search engine algorithms. Search engines have recently been exposed for returning racist, and especially misogynoiristic, results when users search for terms as seemingly mundane as ‘professional women’s hair styles,’ ‘clean kitchens,’ or even ‘angry women.’ Noble (2018) as well as D’Ignazio and Klein (2020) have uncovered several ways in which search engines (and the artificial intelligence used to construct them) perpetuate racism, specifically against Black and Indigenous women and girls. What I seek to examine within this project are the argumentative implications of this AI mode of oppression, given that Google and other search engines have been touted as a way to relieve some of the epistemic burdens that we face.
It is not the stance of this project to propose that racially marginalized peoples explain their oppression; rather, I aim to argue that society is so imbued with systemic oppression, through the utilization of racially problematic algorithms, that directing individuals to Google or other search engines for research purposes runs a high risk of compounding the problem of racial oppression. So, what was conceived as a small liberatory measure for the racially marginalized is instead another tool that perpetuates oppression. Search engine results perpetuate oppressive narratives that reflect historically uneven distributions of power in society.
The paper proceeds as follows: first, I give an account of search engines and the artificial intelligence systems that have masqueraded as big data problem solvers but are rather discrimination disguised as math. From here, I review the literature on epistemic exploitation (both academic and outside the ivory tower) not only to give a genealogy of the phrase ‘just Google it,’ but also to highlight the power and liberatory potential that this phrase possesses. I briefly delve into argumentation literature that has debated the issue of passing off the labor of giving reasons to one’s interlocutors. My objection to the ‘just Google it’ argumentative tactic is not that it is bad argumentation; rather, I object to it on the grounds that the artificial intelligence systems underlying search engines are racist. I conclude the paper with a few argumentative suggestions alongside a more active call for decolonized and anti-racist search engine algorithms. Until the artificial intelligence used within search engines is more than just opinions expressed in code, there must be some other mechanism in place to divert epistemic labor away from BIPOC individuals.