Computation de Texte

NOVEMBER 25, 2017

WHEN DOES TECHNOLOGY cease to seem like magic, and become more like a dark art? As our dependency on new media grows, our fascination wanes, and a collective anxiety begins to take hold. In mass culture, it’s easy to spot the symptoms: high-tech horror shows like Black Mirror or the emergence of “digital detox” centers for the email-addicted executives of Silicon Valley. Even up here, in the cloistered realm of academic literary studies, a nascent technophobia is emerging, disguised as a crisis (or resurgent interest) in humanism.

Take Edward Mendelson, who, when called to write about new media for The New York Review of Books in 2016, chose to resurrect Virginia Woolf’s “serious joke” that on or around December 1910, human character changed. “Every technological revolution coincides with changes in what it means to be a human being,” he writes, “in the kinds of psychological borders that divide the inner life from the world outside.” For Mendelson, our technological revolution (which he dates about a century after Woolf’s, in 2010, when everyone took up a smartphone) was obscuring, or at least altering, human nature. In his piece, Mendelson gives us an eloquent — if highly familiar — account of our communications society and its political pitfalls: mass surveillance, a disappearing inner life, increasing social anomie, and so on. But he — and most other humanists — tend to miss the far more consequential shift in human relations that is happening right before our very eyes, namely, that our relationship to writing is changing. This, after all, was what Woolf’s “serious joke” about human character was actually about. Woolf’s essay on Mr. Bennett and Mrs. Brown was not about history, politics, or the character of one’s cook. It was about writing — all the effort of the novelists and poets to analyze, arrest, and make beautiful the series of chaotic accelerations that we now call modernism.

Little did Woolf know that human character would change once again — not in 2010, as Mendelson has it, but a mere 47 years after Woolf’s diagnosis (her essay appeared in 1924). This second change was far subtler, some would even say banal, almost imperceptible, and it was diagnosed not by a novelist but by a critic. Writing — that most consequential form of human communication — gave way to something else, according to media theorist Friedrich Kittler:

The last historical act of writing may well have been the moment when, in the early seventies, the Intel engineers laid out some dozen square meters of blueprint paper (64 square meters in the case of the later 8086) in order to design the hardware architecture of their first integrated microprocessor.

This is what I mean when I say that human character changed in 1971. With the advent of the microprocessor, humans no longer left metaphysical traces on paper — they simply rearranged lines of code on a word processor. Explication de texte gave way to computation de texte. Automatic discourse analysis has taken command. By 1971, the metaphysical niceties on which Woolf based her half-serious assessment had imploded. The production of language, once seen as the very essence of human activity, became relegated to the automatic process of a machine. According to Kittler, writing, under the dominance of new silicon-based technologies, became hidden from view, never to return to human perception. Under this new technological regime, human character began to disappear, like a face drawn in sand at the edge of a silicon sea. And unlike in 1910, human character didn’t merely change. It was obliterated.

For Kittler, the end of metaphysical traces — writing — meant the end of literary study and hermeneutics. With the advent of the microprocessor, what we used to call writing becomes nothing more than information processing: “Trenches, flashes of lightning, stars — storage, transmission, the laying of cables.” Writing (or at least literature) was not dead yet. It simply had nothing more to say:

[Literature] ends in cryptograms that defy interpretation and only permit interception. […] Of all long-distance connections on this planet today […] 0.1 percent flow through the transmission, storage and decoding machines of the National Security Agency (NSA) […] and hence the end of history, like nothing else.

Does literature actually end in cryptograms, in the layers of encoding deep beneath the liquid light of a laptop screen? Kittler’s cries were meant as a warning shot, rather than a prediction. Thankfully, the critic clarified himself, however subtly, at the end of his magisterial study of 19th-century analog technology, Gramophone, Film, Typewriter. With his usual blend of gleeful cynicism, Kittler noted that “we simply do not know what our writing does.” With this gnomic pronouncement, Kittler meant that we’ve lost sight of the material contexts of our knowledge production. We’re lost among the copper, the cables, the formats, the terms and conditions. But how can we continue the practice of interpretation — indeed, how can we practice the humanities at all — if we no longer understand what our writing does? These are the questions at the heart of Dennis Tenen’s important and engaging book.

¤
“Finished, it’s finished, nearly finished, it must be nearly finished,” says Beckett’s Clov. Perhaps we can say the same thing of writing: nearly finished, but not quite. Dennis Tenen’s Plain Text: The Poetics of Computation is written against Kittler’s bleak assertion. Though like any good retort, it treats the notion that “we simply do not know what our writing does” with the utmost seriousness. Tenen, a literary critic who, like Kittler, also trained in computer programming, understands what is at stake in our evolving forms of reading. Unlike Kittler, Tenen — a self-proclaimed humanist — believes that we can take hold of our technological environment.

Contrary to Kittler’s diagnosis, we are always writing, whether we know it or not. Our writing may have disappeared from view, but that doesn’t mean that we’re not living in a frenzy of inscription. A life story can be written with a series of clicks. A digital pacemaker keeps the heart beating, but it also stores and transmits its data, inscribing our personal information in some distant database, half a world away. As a result, what used to be solitary activities have become public in ways we cannot always perceive. In short, Alexa is always listening. Tenen echoes this in his introduction, when he asks the question “who shares the page?”

The trick, for Tenen, is to avoid Kittler’s tendency of mistaking things — computers — for animate actors. Kittler afforded technology too much agency. Indeed, our relationship to writing has changed, but it’s not all the computer’s fault. In a sense, the computer only reified a more immaterial historical shift. Or, as Deleuze notes, machines don’t determine our situation, but “express the social forms capable of producing them and making use of them.” Unlike what Kittler says, this change did not arise because of some apocalyptic End of Literature. Rather, it was because a selective illiteracy was taking place. But on the other hand, to Kittler’s credit, our machines are not merely inanimate. For the computer is a different kind of device, unlike a car or a spoon or a hammer, in that it is epistemic. “[Computers] do not just get us from point A to point B; they augment thought itself, therefore transforming what it means to be human.”

By blending media archeology, the history of ideas, and literary criticism, Tenen shows us that “computation” belongs to a much broader category of thought than the actions of one particular type of machine. This insight comes in the middle of the book’s most interesting chapter, in which Tenen historicizes the emergence of the formal mechanisms of computation with a series of thought experiments posed by Wittgenstein and Alan Turing. In this view, Alan Turing was a literary theorist just as much as he was a mathematician. His famous idea for a computer, like the old medium of the book, is merely a generalized machine for symbolic manipulation. So it’s no surprise to find that the literary-critical notions of “technique” and “device” grew up alongside the ultimate symbol-manipulating machine, the computer. “Materialist poetics rise concomitantly alongside a mechanistic, rule-based view of language,” as Tenen notes. For Tenen, this mechanistic conception of reading and writing has radical implications for the future of human understanding.

Mallarmé’s contention that everything in the world exists to end up in a book has never made as much sense as it does in our digital-archival age. Everything might exist in order to end up in a book, but a book is no longer necessarily the same object that we used to stack on our bedside tables. When books become electronic, they become devices, with terms and conditions attached. “The literature device adapts itself to the situation — to the needs of both the owner and the user of the book — by hidden logics,” writes Tenen. A million lines of code control the conditions of our consumption. These devices, rather than vehicles or tools for reading and writing, have become inscription surfaces. At the heart of Tenen’s book is the surreal realization that our modern reading and writing systems — word processors, PDFs, ebooks — have more in common with digital smoke detectors than they do with leather-bound books, which is to say that they are all governed by invisible lines of code. If the world has become a book, it makes sense to argue that a literary scholar should be the one to try and save it. And this is exactly what Tenen argues, with brilliant clarity, in the introduction to his new monograph.

“Where do words reside?” he asks. Ask yourself: When I write on a computer, where does that writing reside? Is it there, on the word document, or is it buried deep inside the hard drive? This is the question at the heart of Plain Text. Unlike words written in ink on paper (where what you see is what you get), the digital stage of textuality is unstable, beholden to the laws and constraints of computer code, legal contracts, and encrypted protections. Tenen aims to reposition us readers in a world where the word has moved from the page to the wires. In our computer age, Tenen writes, we risk becoming alienated from the actual hardware and material contingencies of information storage, gaining access to “metaphor alone.” This is why Tenen calls for a “poetics” of computation — close reading of underlying levels of code.

Studia humanitatis was built on the ancient mode of analysis known as textual exegesis. “Consider the possibility of interpretation as we know it being a historical anomaly, connected to the contingencies of print,” Tenen notes. “Formats shape the very structure of interpretation,” he writes. If this is so, the literary-critical study of “format” gains a newfound urgency. For this reason, we must go beyond mere reading, says Tenen, to computational poetics — that is, an awareness of the infrastructures that “stage the construction of meaning.” We must treat our new devices differently, not simply as neutral platforms on which we perform the old style of literary analysis.

Tenen comes down somewhere in the middle of the long-running debate in textual scholarship between so-called depth and surface. Famously, scholars such as Sharon Marcus and Stephen Best advocated for an analysis of literature that dealt with surface forms (“just reading”) as opposed to the “deep” archeological exercises of Marxist or Freudian critics, who plumb the depths of textual artifacts in order to reveal some latent or hidden ideological meaning. One of the key points of Marcus and Best’s call for surface reading is that it no longer takes a PhD in critical theory — that is, the sophisticated techniques of ideological analysis — to know that the world is fucked. But for Tenen, Best and Marcus’s argument has a different valence when applied to the mystical shells that we call computers. He is not interested in calling out or extracting some hidden, nefarious ideological content. Rather, Tenen’s position is something akin to a surface reading of material depth. He is less concerned with sociology — that is, the political context that grounds the devices — than he is with unraveling or understanding the form, format, and methodologies of the things he analyzes. This is where Plain Text begins to diverge from the litany of other books about our massive breach of privacy, such as Bernard Harcourt’s Exposed and Frank Pasquale’s The Black Box Society.

For our panoptic condition is hardly a secret. Mass surveillance is common knowledge in our post-Snowden world, and yet we have yet to banish our devices en masse. We’ve quite literally accepted the imposition of control when we clicked “ACCEPT” on the terms and conditions page. We have traded “critical understanding for comfort.” For Tenen, the techniques of literary analysis are far more effective as political tools than any banal, agit-prop list of digital privacy breaches. Oddly enough, the key tool for rewiring our relationship to computation (and the various inscription devices that we’ve installed in our lives) is not politics or cultural theory, but good old-fashioned literary criticism. And Tenen might be right: literary criticism could be just the ticket for our high technological age.

¤
It is important to note that Tenen is no technophobe, and that one won’t find the usual arguments “against” the use of computer technology in the pages of Plain Text. The phrase “digital humanities” occurs a mere six times throughout the book. In fact, Tenen treats the digital humanities not as a rival to older forms of exegesis but as a passing fad, hardly a threat to anything at all. “Digital photography, digital clocks, and digital humanities already ring archaic in their futuristic ambition, going the way of e- or i- anything, the way of retro suffixes such as -bot, -mat, -lux, and -tron.” In a certain sense, written language has always been digital: “A text that can be copied and preserved is more digital, in a sense, than one limited in its circulation, whether by nature or design.” Human character isn’t threatened by “being digital.” Rather, it is threatened by an increasingly selective illiteracy. “We — readers, writers, interpreters — find ourselves today in an unprecedented, since the Middle Ages, position of selective asemiosis: the loss of signification.” By this, Tenen means that we must not be naïve or willfully ignorant about the material conditions of information exchange.

Take the problem of metaphor. We’re fed these alien control systems with the promise of familiarity: we “flip pages” in ebooks, we place “files” in “folders” on our “desktops.” “Structures of digital control often advance by metaphoric substitution,” writes Tenen. In this familiarity lies a danger. Such metaphoric forms of organization serve to obscure as much as they facilitate ease. But metaphor, in the case of the computer, obscures more than it reveals. Our computers are black boxes, opaque, which we accept into our homes as the Trojans accepted their horse.

Despite the general public’s familiarity with computing interfaces, it still seems illogical to refer to the public as “digital natives” if they remain ignorant of the deep levels of code governing such inscription technologies. In this sense, the real shift in human relations occurring at present is not between husbands and wives, or parents and children, but between the Intel engineers and the rest of us mere “computer users.” Rather than digital natives, many literary scholars are exiles living in the Intel engineers’ world while operating according to the logic of the previous century, of Mr. Bennett and Mrs. Brown.

We are far too comfortable among our devices: smartphones, smart toasters, smart refrigerators, PDFs, ebooks, and Fitbits. Tenen, like the formalist critic Viktor Shklovsky, argues that we have become far too automated in our experience. “Habit dulls the instrument,” Tenen notes. “The tool recedes from view.” Our new media have become like “a stage disappear[ing] from view”: “Our challenge today is to uproot ourselves from the comforts that rapidly descend on the dwellings of our intellectual life.” In order to disrupt the nefarious ease with which we adapt to our machines, Tenen turns to the Russian Formalists. This is why Shklovsky’s notion of estrangement proves to be an important technique for our digital age. What we need is radical defamiliarization, and such is the project outlined in Plain Text. We must lay bare the device. Rather than swim in a sea of dead metaphors — desktops, files, docks, trash cans — we must make computation strange again.

And who better to make strange than the poets? When the territory of human character changes, poets rewrite the map. As Percy Bysshe Shelley would say, new poets rise in order to create fresh associations between things. Or else we await a horrific fate: “language,” so says Shelley, “will be dead to all the nobler purposes of human intercourse.” (Recall Heidegger’s remark that cybernetics transforms language into an exchange of news.) Poetry disrupts our systems of understanding. And today, our age is distinguished by the fact that such systems of understanding are no longer abstract, airy “discourses” but are reified into actual machines. We carry our systems around in our pockets. Make strange, or an automatic discourse will take command.

In the midst of all these codes, we are in danger of succumbing to Kittler’s dire warning: “we simply do not know what our writing does.” If literary scholars have any hope of understanding human character, we have to proceed through writing technologies themselves: in our case, through keyboard, copper, and silicon, to liquid crystal, and the floating gate. In the age of ubiquitous computing, users need to adopt what Tenen calls a “systematic minimalism.” Hence his title: plain text is not just a file format — it is a mode of relation, an interpretive stance that makes sure we understand what our own writing is doing. We do not know what our writing does — at least not until we convert our files into plain text. Rather than stand in awe of our devices, we must stare them down. Tenen’s Plain Text provides a lucid and legible map to our often vertiginous computational climate.
¤

James Edward Draney is a writer and critic. He holds a BA in History of Art from University College London and an MA in Literature, Culture, and Theory from King’s College London. He currently studies literature and philosophy at Duke University.