JANUARY 23, 2019
Dreams spring from reality and are realized in it.
— Ivan Chtcheglov (1953)
IN 2017, with an almost imperceptible ripple, a story broke in the French media reporting on the bizarre case of a young Danish proto-fascist who had become a kind of “alt-right” data miner. The misguided racist was using information harvested from OkCupid, one of the biggest relationship websites owned by IAC’s Match Group, to support a fake-science argument in favor of an apparently new form of white supremacist eugenics. The story attracted little attention, and those who commented on it — in the progressive media at least — saw the attempt as a deliberate and gratuitous misuse of data to support a dangerous political agenda, which in some ways it was.
Although part of a wider rise in dangerous right-wing “race science,” of which Jordan Peterson can be seen as the soft edge, there is something particularly important about data drawn from the realm of love. In this case, over 70,000 profiles had been scraped in an attempt to show a so-called “cognitive dysgenics,” ultimately aiming to produce a data-validated hierarchy of humans (through various methods, including correlating intelligence with religion) with white people at the top. Other related arguments made via dating data include the supposed proof that users are attracted to those of the same race, which is then used to support the logic of dating those just like you (upon which eugenics relies).
The importance of OkCupid as the data source in all this is not merely coincidental. Dating sites offer an unprecedented sample size of detailed information for miners, but in this case they are also seen as the means through which our species meets with a view to procreation: the place where dangerous eugenicist ideas could even be put into practice.
As such, what seemed to go strangely unnoticed in the OkCupid story was a more significant and infinitely more frightening possibility. The question which threatened yet failed to arise was whether the data itself, and therefore the structure and algorithms of a major online dating platform (and perhaps of others, since Match Group owns dozens of sites which reach 190 countries in 42 languages), were in part responsible for the way the results had seemed to come out. The racial biases of dating sites have been commented on before, but only in terms of how they reflect or reveal existing social biases. Instead, might it be possible that there is a more complex ideological code written into the history of the connection between love and the digital world — a connection already biased in favor of what might even be seen as right-wing and patriarchal identitarian politics? OkCupid itself has commented on the racial biases of its data, claiming that “daters are no more open-minded than they used to be” and thereby circumventing the possibility that its own algorithms might be at least partially responsible. In fact, the logic of matching users only with those just like themselves might be something shared between the dating algorithms and this extreme right-wing rhetoric.
Far from ratifying bizarre alt-right pseudoscience, if this were shown to be the case, it would present the claims of the “scientific” alt-right as just part of a wider and more concerning digital architecture that has become structural to relationships themselves, propped up by liberal assumptions about data’s neutrality and innocence. In broader terms, the concern raised by the incident is that if relationships, desire, and friendships are mutating and transforming in the age of AI, big data, and social media, then there might be a problem with the politics they are mutating in the service of.
Ours is the data-driven age. Arguments and claims made in the media and in the academy are backed up with harvested “empirical” information drawn from data collection technologies that make dystopian cybernetic dreams seem like relics from the ancient past. Data sets control what we see on our internet searches, social media feeds, and television screens, yet that process of selection remains deliberately obscured. Data as a methodology percolates through every area of the university, and new appointments (even in the so-called “arts”) reward scholars who apply data analytics software to understand everything from gender to poetry. Mobile apps manage everything from eating habits to menstrual cycles using data-formatted algorithms, while “smart condoms” collect sexual movements into large aggregated sets which set a new blueprint for the sexual future.
Data now determines what is normal. It aggregates us into patterns and serves as the dominant way in which we understand the world around us. The case of the smart condom says it all: it can collect sexual data for comparison with that of others in order to “improve” sexual performance by directing it in line with the norm (or even toward bettering the norm at its own game and turning its users into some kind of data-ratified heteronormative stallions). But what does it mean for queer and non-normative sexual practices, for example, if new norms are created out of a data logic which by its structural nature rules non-conformist units anachronistic and out of time, determining them as “not part of” or “other to” its pleasure-logic?
Could such practices even be rendered anachronistic in the truest sense of the word (which contains the idea of backward in time), banished to the past or as belonging to the past by the normalizing drive of data society which eradicates anachronisms from its data set? Are we perhaps in danger of entering a new iteration of Foucault’s “great confinement” in which great swathes of data automate our very desires, bringing them into aggregating patterns which threaten to reduce “madnesses” to digital silence? Could such technologies be banishing misfit “data points” from the “data sets” that are used to construct the applications that go on to become the technological norms through which we govern and organize our lives?
If any of these suggestions proves even partially true, it would take us into a future in which our “artificial limbs” (to quote Freud), the technologies that have become inseparable from consciousness itself, are constructed only on the basis of the so-called “typical,” with all its misogyny, racial prejudice, and heteronormativity inherited by the technologies themselves. It was Henri Lefebvre who wrote that “computerized daily life risks assuming a form that certain ideologues find interesting and seductive.” One need only think of the racist facial recognition software deployed last year, which its creators did not design to be racist but which inherited bias from its data. Dating algorithms may similarly assume certain ideological traits.
Data claims to show us what is typical, but it also constructs the typical and makes it visible to us in a flash of understanding, where what is perceived appears to have been waiting patiently for our “visualization” to make plain. The very language of data, then, codes it as a mode of seeing and perceiving reality. We could playfully put this in the Kantian terms of “transcendental schematism,” which describes the process of translating an empty universal concept into something that relates “factually” to our everyday lives. With all data, an abstract concept of what it might show predates the visualization, after which it appears to relate to our existence in a factual way. Using the idea in a way somewhat removed from the sphere of relationships and sexuality, Slavoj Žižek writes that “it is at this level that ideological battles are won and lost — the moment we perceive [something] as ‘typical’ […] the perspective changes radically.” “The universal acquires concrete existence when some particular content begins to function as its stand-in,” he argues, and data can be seen as precisely such a stand-in for the search for the universal, ultimate truth, operating ideologically to make us believe. Even if we doubt the data, it is already too late, because the typical has come into being, caught in the gap between doubting what we read and already having read it.
In the realm of love, this is a particularly vital point, and the smart condoms already mentioned are the tip of a very large iceberg. We don’t need to see the code designed by IAC’s technology to know that the logic is one of reflection or mirroring, connecting people on the basis of similarity and likeness (data has even “proven” that people are attracted to those who look similar). The language of the “match” is almost enough to make the point on its own. When Chris McKinlay hacked OkCupid to circumvent the logic of matching him only with those with shared interests, it was received as a charming tale of finding love outside of the algorithm, but was treated as a quirky anachronism rather than as something that called the whole logic of the site into question. Had McKinlay not been strictly heterosexual, his act would have shown the data-driven approach for what it is. Such matching is only doing for relationships what Facebook does for friendships, connecting those considered likely to like each other (and also to be like each other) on the basis of correspondence (in the original sense of the word). We act as if we would like to meet and converse with only those who are like us, because it has already become the typical way to relate to others.
What is important here is that this is not, as it might appear, the logic of culture nor of nature (whatever those terms are taken to mean) but of data. It is not sufficient to say that it is because we are a narcissistic society, for example, that our culture produces apps and websites that match us with those who are like us. Nor is it because of an imaginary natural human drive that the data reflects racial separatism, the argument used in neo-fascist interpretations of such data. It is data whose laws our society and our connections, friendships, and lovers now obey. Data is neither culture nor nature, and it does not reflect or reveal the truth of either. Instead, it is its own force, propelling us toward a continuation of the existing relationship between people and things, since it can only agree with the pattern and exclude the anomalous. Further, data not only feeds and extends existing norms but produces new ones, which are not visible as norms until the helpful data apparently assists us in seeing them. Data establishes and then extends or proliferates the typical, while also making it appear to have always-already been there.
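The structural point can be made concrete with a toy model. The sketch below is an assumption for illustration only (dating platforms’ actual matching algorithms are proprietary and far more complex): if a system scores candidate pairs purely by similarity of answers and recommends the closest matches, then a profile unlike everyone else can never rank highly for anyone. The exclusion of the anomalous is not a bias added on top of the matching logic; it is the matching logic.

```python
# Toy sketch of similarity-based matching (an illustration, not any
# platform's real algorithm). Profiles are dicts of question -> answer;
# the "match score" is the fraction of shared answers, so the system
# can only ever recommend likeness.

def match_score(a, b):
    """Fraction of questions on which two profiles give the same answer."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    return sum(a[q] == b[q] for q in shared) / len(shared)

def best_matches(user, pool, top_n=2):
    """Rank the pool by similarity to the user, highest first."""
    ranked = sorted(pool.items(),
                    key=lambda kv: match_score(user, kv[1]),
                    reverse=True)
    return [name for name, _ in ranked[:top_n]]

profiles = {
    "A": {"religion": "none",  "politics": "left",  "smokes": "no"},
    "B": {"religion": "none",  "politics": "left",  "smokes": "yes"},
    "C": {"religion": "none",  "politics": "right", "smokes": "no"},
    "D": {"religion": "other", "politics": "other", "smokes": "other"},
}

user = {"religion": "none", "politics": "left", "smokes": "no"}
# The anomalous profile "D" scores 0.0 against this user and so is
# structurally pushed to the bottom of every ranking.
print(best_matches(user, profiles))
```

However the pool is composed, the anomalous profile is ranked last for every user who resembles the majority: the toy system does not “discover” that people prefer those like themselves, it can produce nothing else.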
At the same time as data-driven relationships become the norm, technology infiltrates sexuality in other comparable ways that at first appear subcultural, niche, or non-normative. Often we give cultural or natural explanations for such bizarre occurrences, but they are also the product of data as well as of culture. Sex robots and Virtual Reality relationships, for instance, reflect misogynistic and patriarchal traditions of domination and subservience (the cultural explanation), or else they gratify the infinite libido of the hungry sexual animal that is the virile male (the natural explanation).
In fact, both politically motivated explanations of data-driven projects (left = cultural, right = natural) seem to neglect the role of data as a third force between culture and nature, one that operates as an active agent in the process. The robots (and even dolls like those at LumiDolls in Moscow or PlayMate Dolls in Toronto) are constructed in relation to data patterns of what has been deemed to reflect the market. If Japanese men are our main audience and they statistically prefer younger Western women, that’s what is built, or so the reasoning goes. Such is also the logic of Silicon Valley liberals who defend their work with the claim that their technology is neither political nor patriarchal but an innocent and apolitical result of data that indicates what people want, echoing OkCupid’s circumvention of the role of its own algorithm discussed above. On the contrary, data does not simply show what is already there in nature or in culture.
Instead, such data-oriented developments do more than reflect what is already desired. For one thing, they code desire differently, presenting particular instances of desire as typical or universal and constructing desire itself (including that of those who deviate) in relation to those data-established norms. For another, they iron out inconsistencies or remove elements coded as unwanted or unpleasurable. The logic of data-driven dating sites, as well as sex robots like those designed by Realbotix and virtual relationships like those in Summer Lesson (now available on the PlayStation Store in the United States and United Kingdom) is not only to give the user what they want but also to exclude what is unwanted in the other. With Realbotix products, the user can even “customize” to remove any unwanted features of their sex robot and choose what “personality traits” they want the avatar to have (not unlike an opening Plenty of Fish questionnaire). Thus, the process of giving the user what they desire is constructed as uniquely personalized when, in fact, it is deeply aggregated and based on excluding diversity in the sphere of relationships (from both each single relationship and from the wider data set) rather than proliferating it.
In 1953, this ability of modern technology to transform desire while claiming only to respond to it was anticipated by the important situationist Ivan Chtcheglov, a close ally and collaborator of Guy Debord. In his seminal essay “Formulary for a New Urbanism,” Chtcheglov pointed out that new technologies idealize classically desirable situations (things already desired — such as gazing at the stars, watching the rain, et cetera) but exclude the “unpleasant” within the older relationship to those pleasures: “The latest technological developments would make possible the individual’s unbroken contact with cosmic reality while eliminating its disagreeable aspects. Stars and rain can be seen through glass ceilings. The mobile house turns with the sun.”
As such, he argues, the population has entered a kind of bored malaise in which our desires have become more homogeneous, predictable, and automated by a process of cleaning or sanitization. I’ve argued elsewhere that situationist ideas of “psychogeography” might explain more broadly how we are organized at the level of desire by mobile technologies (rather than architecture) today, and here we can see a more specific way in which the situationist attention to a revolution in desire reveals a key organizing feature of modern capitalism. “Dreams spring from reality and are realized in it,” writes Chtcheglov, arguing that technological changes do not reflect existing desires but construct the future of desire. If our dreams become reality, we need to be even more suspicious of the data-driven patterns discussed here, which do not merely reflect what we want but take us into the future of desire. On the other hand, there is also a space for hope, and Chtcheglov argues that “[i]t has become essential to provoke a complete spiritual transformation by bringing to light forgotten desires and by creating entirely new ones. And by carrying out an intensive propaganda in favor of these desires.”
What is needed, then, is to take control over the process, asking what kinds of desires we might want to save from being banished into anachronism and which new forms of desire we might need to construct. The excuse of simply following the data provides a means for powerful actors to circumvent their involvement in such questions. Since data at the very least amplifies and extends normative trends (banishing anomalous points to anachronism) and at least potentially establishes new norms and patterns, data-driven projects are as political as they come.
Last week, a New York–based cult announced plans to open another sex robot brothel in West Hollywood, only this time with a difference: it will be a queer and non-conformist attempt to subvert what have already become “traditional” human-robot relations (those the data has encouraged). Some of their practices might strike most as bizarre, but at least this logic acknowledges that the robots are implicated in political patterns of desire and might even play a role in shaping such things. In the future of love technologies, for couples and groups of all sexualities, data-driven approaches that obscure the powerful politics of their logic should be replaced by systems that admit that their algorithms are implicated in politics — sexual and otherwise.
Alfie Bown teaches Media Arts at Royal Holloway University London and has written several books on digital media and philosophy with Zero, Polity and Bloomsbury, as well as journalism for the Guardian and the Paris Review.