Computation de Texte

By James Edward Draney | November 25, 2017


Plain Text by Dennis Tenen

WHEN DOES TECHNOLOGY cease to seem like magic, and become more like a dark art? As our dependency on new media grows, our fascination wanes, and a collective anxiety begins to take hold. In mass culture, it’s easy to spot the symptoms: high-tech horror shows like Black Mirror or the emergence of “digital detox” centers for the email-addicted executives of Silicon Valley. Even up here, in the cloistered realm of academic literary studies, a nascent technophobia is emerging, disguised as a crisis (or resurgent interest) in humanism. Take Edward Mendelson, who, when called to write about new media for The New York Review of Books in 2016, chose to resurrect Virginia Woolf’s “serious joke” that on or around December 1910, human character changed. For Mendelson, our technological revolution (which he dates about a century after Woolf’s, in 2010, when everyone took up a smartphone) was obscuring, or at least altering, human nature. “Every technological revolution coincides with changes in what it means to be a human being,” he writes, “in the kinds of psychological borders that divide the inner life from the world outside.”

In his piece, Mendelson gives us an eloquent — if highly familiar — account of our communications society and its political pitfalls: mass surveillance, a disappearing inner life, increasing social anomie, and so on. But he — like most other humanists — tends to miss the far more consequential shift in human relations that is happening right before our very eyes, namely, that our relationship to writing is changing. This, after all, was what Woolf’s “serious joke” about human character was actually about. Woolf’s essay “Mr. Bennett and Mrs. Brown” was not about history, politics, or the character of one’s cook. It was about writing — all the effort of the novelists and poets to analyze, arrest, and make beautiful the series of chaotic accelerations that we now call modernism.

Little did Woolf know that human character would change once again — not in 2010, as Mendelson has it, but a mere 47 years after Woolf’s diagnosis (her essay appeared in 1924). This second change was far subtler, some would even say banal, almost imperceptible, and it was diagnosed not by a novelist but by a critic. By 1971, the metaphysical niceties on which Woolf based her half-serious assessment had imploded. Writing — that most consequential form of human communication — gave way to something else, according to media theorist Friedrich Kittler. And unlike in 1910, human character didn’t merely change. It was obliterated.

The last historical act of writing may well have been the moment when, in the early seventies, the Intel engineers laid out some dozen square meters of blueprint paper (64 square meters in the case of the later 8086) in order to design the hardware architecture of their first integrated microprocessor.


According to Kittler, writing, under the dominance of new silicon-based technologies, became hidden from view, never to return to human perception. With his usual blend of gleeful cynicism, Kittler noted that “we simply do not know what our writing does.” With the advent of the microprocessor, humans no longer left metaphysical traces on paper — they simply rearranged lines of code on a word processor. The production of language, once seen as the very essence of human activity, was relegated to the automatic processes of a machine. And under this new technological regime, human character began to disappear, like a face drawn in sand at the edge of a silicon sea.

“Finished, it’s finished, nearly finished, it must be nearly finished,” says Beckett’s Clov. Perhaps we can say the same thing of writing: nearly finished, but not quite. Kittler’s pronouncement was meant as a warning shot rather than a prediction. Thankfully, the critic clarified himself, however subtly, at the end of his magisterial study of 19th-century analog technology, Gramophone, Film, Typewriter. Writing (or at least literature) was not dead yet. It simply had nothing more to say:

[Literature] ends in cryptograms that defy interpretation and only permit interception. Of all long-distance connections on this planet today […] 0.1 percent flow through the transmission, storage and decoding machines of the National Security Agency (NSA) […] and hence the end of history, like nothing else. Automatic discourse analysis has taken command.


For Kittler, the end of metaphysical traces — writing — meant the end of literary study and hermeneutics. With the advent of the microprocessor, what we used to call writing becomes nothing more than information processing: “Trenches, flashes of lightning, stars — storage, transmission, the laying of cables.”

Dennis Tenen’s Plain Text: The Poetics of Computation is written against Kittler’s bleak assertion. Like any good retort, though, it takes the notion that “we simply do not know what our writing does” with the utmost seriousness. With this gnomic pronouncement, Kittler meant that we’ve lost sight of the material contexts of our knowledge production. And this is exactly what Tenen argues, with brilliant clarity, in the introduction to his new monograph. Our new media have become like “a stage disappear[ing] from view”: “Our challenge today is to uproot ourselves from the comforts that rapidly descend on the dwellings of our intellectual life.” If literary scholars have any hope of understanding human character, we have to proceed through writing technologies themselves: in our case, through keyboard, copper, and silicon, to liquid crystal, and the floating gate.

Tenen, a literary critic who, like Kittler, trained in computer programming, understands what is at stake in our evolving forms of reading. We are far too comfortable among our devices: smartphones, smart toasters, smart refrigerators, PDFs, ebooks, and Fitbits. We have traded “critical understanding for comfort.” These devices have become not so much vehicles or tools for reading and writing as inscription surfaces. They are black boxes, opaque, which we accept into our homes as the Trojans accepted their horse. As a result, what used to be solitary activities have become public in ways we cannot always perceive. Tenen aims to reposition us readers in a world where the word has moved from the page to the wires. “Where do words reside?” he asks. This is the question at the heart of Plain Text.

¤


The trick, for Tenen, is to avoid Kittler’s tendency to mistake things — computers — for animate actors. Kittler afforded technology too much agency. Rather than stand in awe of our devices, we must stare them down. On the other hand, to Kittler’s credit, our machines are not merely inanimate. For the computer is a different kind of device from a car or a spoon or a hammer: it is epistemic. “[Computers] do not just get us from point A to point B; they augment thought itself, therefore transforming what it means to be human.” Unlike Kittler, Tenen — a self-proclaimed humanist — believes that we can take hold of our technological environment.

Oddly enough, the key tool for rewiring our relationship to computation (and the various inscription devices that we’ve installed in our lives) is not politics or cultural theory, but good old-fashioned literary criticism. This is where Plain Text begins to diverge from the litany of other books about our massive breach of privacy, such as Bernard Harcourt’s Exposed and Frank Pasquale’s The Black Box Society. For Tenen, the techniques of literary analysis are far more effective as political tools than any banal, agit-prop list of digital privacy breaches. For our panoptic condition is hardly a secret. Mass surveillance is common knowledge in our post-Snowden world, yet we have not banished our devices en masse. In the age of ubiquitous computing, users need to adopt what Tenen calls a “systematic minimalism.” Hence his title: plain text is not just a file format — it is a mode of relation, an interpretive stance that makes sure we understand what our own writing is doing.
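What does “plain text” mean at the level of the file itself? A minimal sketch in Python (my own illustration, not Tenen’s; the filename is hypothetical) shows the format’s whole appeal: a plain text file stores nothing but the bytes of its characters, so what is read back is exactly what was written.

    # A plain text file contains nothing but the bytes of its characters:
    # no hidden styling, no embedded scripts, no terms and conditions.
    text = "Mr. Bennett and Mrs. Brown"

    with open("note.txt", "w", encoding="utf-8") as f:
        f.write(text)

    with open("note.txt", "rb") as f:  # reread the raw, uninterpreted bytes
        raw = f.read()

    print(raw)                          # b'Mr. Bennett and Mrs. Brown'
    print(raw.decode("utf-8") == text)  # True: what you see is what you get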

And Tenen might be right: literary criticism could be just the ticket for our high technological age. Our writing may have disappeared from view, but that doesn’t mean that we’re not living in a frenzy of inscription. Contrary to Kittler’s diagnosis, we are always writing, whether we know it or not. A life story can be written with a series of clicks. A digital pacemaker keeps the heart beating, but it also stores and transmits its data, inscribing our personal information in some distant database, half a world away. In short, Alexa is always listening. Mallarmé’s contention that everything in the world exists to end up in a book has never made as much sense as it does in our digital-archival age. If the world has become a book, it makes sense to argue that a literary scholar should be the one to try and save it.

Take the problem of metaphor. “Structures of digital control often advance by metaphoric substitution,” writes Tenen. But metaphor, in the case of the computer, obscures more than it reveals. We’re fed these alien control systems with the promise of familiarity: we “flip pages” in ebooks, we place “files” in “folders” on our “desktops.” Such metaphoric forms of organization obscure as much as they facilitate ease. In order to disrupt the nefarious ease with which we adapt to our machines, Tenen turns to the Russian Formalists. Like the formalist critic Viktor Shklovsky, Tenen argues that our experience of our machines has become automatized. This is why Shklovsky’s notion of estrangement proves to be an important technique for our digital age. What we need is radical defamiliarization, and such is the project outlined in Plain Text. Rather than swim in a sea of dead metaphors — desktops, files, docks, trash cans — we must make computation strange again. “Habit dulls the instrument,” Tenen notes. “The tool recedes from view.” Make strange, or an automatic discourse will take command.

And who better to make strange than the poets? As Percy Bysshe Shelley would say, new poets rise in order to create fresh associations between things. When the territory of human character changes, poets rewrite the map. Or else we await a horrific fate: “language,” so says Shelley, “will be dead to all the nobler purposes of human intercourse.” (Recall Heidegger’s remark that cybernetics transforms language into an exchange of news.) Poetry disrupts our systems of understanding. And today, our age is distinguished by the fact that such systems of understanding are no longer abstract, airy “discourses” but are reified into actual machines. We carry our systems around in our pockets. In this familiarity lies a danger. In our computer age, Tenen writes, we risk becoming alienated from the actual hardware and material contingencies of information storage, gaining access to “metaphor alone.” This is why Tenen calls for a “poetics” of computation — close reading of underlying levels of code. We must lay bare the device.

Indeed, our relationship to writing has changed, but it’s not all the computer’s fault. In a sense, the computer only reified a more immaterial historical shift. Or, as Deleuze notes, machines don’t determine our situation, but “express the social forms capable of producing them and making use of them.” So it’s no surprise to find that the literary-critical notions of “technique” and “device” grew up alongside the ultimate symbol-manipulating machine, the computer. Explication de texte gave way to computation de texte. “Materialist poetics rise concomitantly alongside a mechanistic, rule-based view of language,” as Tenen notes. This insight comes in the middle of the book’s most interesting chapter, in which Tenen historicizes the emergence of the formal mechanisms of computation with a series of thought experiments posed by Wittgenstein and Alan Turing. By blending media archeology, the history of ideas, and literary criticism, Tenen shows us that “computation” belongs to a much broader category of thought than the actions of one particular type of machine. In this view, Alan Turing was a literary theorist just as much as he was a mathematician. His famous idea for a computer, like the old medium of the book, is merely a generalized machine for symbolic manipulation.
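The claim is easy to make concrete. Here is a minimal sketch of a Turing-style machine in Python (my own illustration, not an example from the book): a table of rules, a tape of symbols, and a head that reads, writes, and moves. Everything the machine “computes” is literal symbol manipulation.

    # A toy Turing machine: rules map (state, symbol) to (write, move, state).
    def run_turing_machine(rules, tape, state="start", blank="_", max_steps=100):
        cells = dict(enumerate(tape))  # sparse tape: position -> symbol
        head = 0
        for _ in range(max_steps):
            if state == "halt":
                break
            symbol = cells.get(head, blank)
            write, move, state = rules[(state, symbol)]
            cells[head] = write
            head += 1 if move == "R" else -1
        return "".join(cells[i] for i in sorted(cells))

    # Example: flip every bit on the tape, then halt at the first blank.
    rules = {
        ("start", "0"): ("1", "R", "start"),
        ("start", "1"): ("0", "R", "start"),
        ("start", "_"): ("_", "R", "halt"),
    }
    print(run_turing_machine(rules, "10110"))  # prints "01001_"

That a dozen lines suffice is precisely the point: like the book before it, the machine is a generalized surface for reading and rewriting marks.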

For Tenen, this mechanistic conception of reading and writing has radical implications for the future of human understanding. Studia humanitatis was built on the ancient mode of analysis known as textual exegesis. But how can we continue the practice of interpretation — indeed, how can we practice the humanities at all — if we no longer understand what our writing does? At the heart of Tenen’s book is the surreal realization that our modern reading and writing systems — word processors, PDFs, ebooks — have more in common with digital smoke detectors than they do with leather-bound books, which is to say that they are all governed by invisible lines of code. Unlike words written in ink on paper (where what you see is what you get), the digital stage of textuality is unstable, beholden to the laws and constraints of computer code, legal contracts, and encrypted protections. For this reason, says Tenen, we must go beyond mere reading to computational poetics — that is, an awareness of the infrastructures that “stage the construction of meaning.” We must treat our new devices differently, not simply as neutral platforms on which we perform the old style of literary analysis. “Consider the possibility of interpretation as we know it being a historical anomaly, connected to the contingencies of print,” Tenen notes.

If this is so, the literary-critical study of “format” gains a newfound urgency. “Formats shape the very structure of interpretation,” Tenen writes. Everything might exist in order to end up in a book, but a book is no longer necessarily the same object that we used to stack on our bedside tables. When books become electronic, they become devices, with terms and conditions attached. A million lines of code control the conditions of our consumption. “The literature device adapts itself to the situation — to the needs of both the owner and the user of the book — by hidden logics,” writes Tenen. In the midst of all these codes, we are in danger of succumbing to Kittler’s dire warning: “we simply do not know what our writing does.”
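The claim about format can be checked directly. An ebook in the EPUB format, for instance, is literally an archive of code: open one as the zip file it is and you find markup, stylesheets, and metadata rather than pages. A short sketch in Python (the filename is a stand-in for whatever EPUB is on hand):

    # An EPUB "book" is a zip archive of code and markup, not a stack of pages.
    import zipfile

    with zipfile.ZipFile("novel.epub") as book:  # hypothetical filename
        for name in book.namelist():
            print(name)

    # Typical output: mimetype, META-INF/container.xml, XHTML chapter files,
    # CSS stylesheets: the hidden layers that stage what the reader sees.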

¤


Tenen comes down somewhere in the middle of the long-running debate in textual scholarship between so-called depth and surface. Famously, scholars such as Sharon Marcus and Stephen Best advocated for an analysis of literature that dealt with surface forms (“just reading”) as opposed to the “deep” archeological exercises of Marxist or Freudian critics, who plumb the depths of textual artifacts in order to reveal some latent or hidden ideological meaning. One of the key points of Marcus and Best’s call for surface reading is that it no longer takes a PhD in critical theory — that is, the sophisticated techniques of ideological analysis — to know that the world is fucked. Tenen echoes this in his introduction when he asks, “Who shares the page?” But for Tenen, Best and Marcus’s argument has a different valence when applied to the mystical shells that we call computers. He is not interested in calling out or extracting some hidden, nefarious ideological content. We quite literally accepted the imposition of control when we clicked “ACCEPT” on the terms and conditions page. Rather, Tenen’s position is something akin to a surface reading of material depth. He is less concerned with sociology — that is, the political context that grounds the devices — than with unraveling or understanding the form, format, and methodologies of the things he analyzes.

It is important to note that Tenen is no technophobe, and that one won’t find the usual arguments “against” the use of computer technology in the pages of Plain Text. The phrase “digital humanities” occurs a mere six times throughout the book. In fact, Tenen treats the digital humanities not as a threat to some other, older form of exegesis but as a passing fad, hardly a threat to anything at all. “Digital photography, digital clocks, and digital humanities already ring archaic in their futuristic ambition, going the way of e- or i- anything, the way of retro suffixes such as -bot, -mat, -lux, and -tron.” Human character isn’t threatened by “being digital.” In a certain sense, written language has always been digital: “A text that can be copied and preserved is more digital, in a sense, than one limited in its circulation, whether by nature or design.” Rather, it is threatened by an increasingly selective illiteracy.

This is what I mean when I say that human character changed in 1971. But contrary to what Kittler says, this change did not arise from some apocalyptic End of Literature. Rather, it arose because a selective illiteracy was taking hold. “We — readers, writers, interpreters — find ourselves today in an unprecedented, since the Middle Ages, position of selective asemiosis: the loss of signification.” By this, Tenen means that we must not be naïve or willfully ignorant about the material conditions of information exchange. Despite the general public’s familiarity with computing interfaces, it makes little sense to call that public “digital natives” if it remains ignorant of the deep levels of code governing such inscription technologies. In this sense, the real shift in human relations occurring at present is not between husbands and wives, or parents and children, but between the Intel engineers and the rest of us mere “computer users.” Rather than digital natives, many literary scholars are exiles, living in the Intel engineers’ world while operating according to the logic of the previous century, of Mr. Bennett and Mrs. Brown.

Ask yourself: When I write on a computer, where does that writing reside? Is it there, in the Word document, or is it buried deep inside the hard drive? Does literature actually end in cryptograms, in the layers of encoding deep beneath the liquid light of a laptop screen? These are the questions at the heart of Dennis Tenen’s important and engaging book. We’re lost among the copper, the cables, the formats, the terms and conditions. Tenen’s Plain Text provides a lucid and legible map to our often vertiginous computational climate. We do not know what our writing does — at least not until we convert our files into plain text.

¤


James Edward Draney is a writer and critic. He holds a BA in History of Art from University College London and an MA in Literature, Culture, and Theory from King’s College London. He currently studies literature and philosophy at Duke University.
