Algorithmic Intimacy

Susan Castro

Discussions of the ethics and social justice of deep learning, data analytics, and smart technologies often represent privacy as a core concern. I argue in this paper that the current rhetoric of data privacy omits and obscures the nature and depth of the wrongfulness of some data use practices.

Consider the common practice of mining user-generated data to improve or personalize products. If the product is a video adventure game, then the collection, analytics, and sale of user data without genuine informed consent constitute a breach of privacy. If the product is a smart vibrator, the data is both private and intimate (Reynolds). Given that sexuality-shaming is prevalent and causally intertwined with violence in a variety of ways, the rhetoric of privacy and transparency fails to capture the psychology, the stakes, and the nature of the ethical problem. Intimate data use practices are in fact ubiquitous: the intimacy of general-use smart devices like Alexa is considerable, though masked.

One reason for the inadequacy of privacy arguments in intimate contexts, I argue, is that the conceptual framework employed in these contexts is a legalistic, civil rights framework that obscures vulnerability. If persons are presumed to be independent, autonomous (quasi-Kantian) individuals who have the power to freely consent or withdraw without risk to their identity, integrity, or well-being, then it may seem morally adequate to formally disclose policy and obtain pro forma consent via contracts of adhesion to transfer data property and use rights. If we instead acknowledge the real vulnerability of most persons to sexual, racial, or class-based violence and other harms, then data practices that impose additional risk of trolling, doxing, adverse employment decisions, eviction, etc., must be treated as high risk (Benjamin; Eubanks; Noble; Castro). These high-risk practices entail a fiduciary duty that may be better understood through an ethics of care framework than through obvious alternatives like liberal feminism or critical race studies. In the first part of the paper, I sketch why intimacy is a necessary parameter in some easily identifiable contexts and how including this parameter reveals that increasing transparency and requiring formal consent cannot wholly solve the core ethical problem that the privacy discourse aims to solve.

Second, I briefly consider the implications of recognizing that relationships between humans and artificial intelligences (AI) are on course to become intimate in an important sense (Lauer; Borenstein and Arkin; Devlin; Young; Turkle; Jang and Lee), and that AI will, problematically, have come to be through the exploitation of user data that includes unethically intimate access. To wit, the development of healthy and well human persons requires considerable intimacy: we are social beings who must learn even basic self-care through the help of others. The development of healthy and well AI seems to require considerable intimacy as well, at least insofar as their generating algorithms require intimate ‘knowledge’ of our data. If the algorithmic intimacy of personalization and AI development remains voyeuristic, the development of AI faces a tainted origins problem.