JUNE 7, 2011
Illustration: M. C. Escher, Man With Cuboid
EVERY SO OFTEN SOMETHING will break through the stimulus shield I hold up whenever I go online, which I do far too often these days, we all do, and for various reasons, one being, I’m sure, that the existence of the medium has created an unremitting low-intensity neural disquiet that we feel only the medium can allay — even though it cannot, never has. But it is an attribute of the Internet to activate in me, and maybe in all its users, a persistent sense of deferred expectancy, as if that thing that I might be looking for, that I couldn’t name but would know if I saw, were at every moment a finger tap away. That is the root of the addiction right there — and it is an addiction, sure, if only a lower-case one. To bear all this, therefore, to proof myself against the unstanchable flow of unnecessary information and peripheral sensation, I make use of this shield, which is really just an attention-averting reflex, a way of filtering almost everything away, leaving just the barest bones of whatever I happen to be looking at, and these only in case some tell-tale name or expression requires me to peer a bit more closely.
I practice this defensive, exclusionary scanning not only with the incidental flotsam I encounter — the inescapable digests of happenings in the world, celebrity divorces, killer storms, and so on — but also, more and more, with texts about subjects that ostensibly concern me. A recent case in point — I have it handy now because I finally printed it out — is an article I found online at The Awl called “Wikipedia and the Death of the Expert” by Maria Bustillos (posted on May 17, 2011). It came to me via several clicks at one of the so-called “aggregate” sites I sometimes visit to keep myself “informed.” I scan a great many articles in the course of my daily tours, but I am not avid. More often I scroll my eyes down the screen with a preemptive weariness — which is an angry and defensive posture, I agree — as if nothing truly worthy could ever be found online (I know this is not true), as if I will have conceded something to the opposition if I were to fully engage the Internet and profit from the engagement.
Reading online is, we know, a keyword-driven process, and the reader (this reader) has to exert near-constant mental counter-pressure — drive with his foot on the brakes, as it were — if he is to read words on screen in the way that he once, when younger and more assiduous, read words in books. The editors of Bustillos’ article clearly understood this, and so, instead of engineering anything that would work as a speed-bump, they laid the piece out for us fast-lane drivers, with short paragraphs and a way of link-highlighting whatever sense-nuggets appeared, so that one could either click and delve, or hasten forward. For instance, a reference to the journal Nature did not merely underline that one word, but also the phrase beside it — Nature stood by its methods and results — so that the eye almost irresistibly vaulted to the next nearby link, giving, at least in the opening pages, a choppy précis of what the writer was saying. Gist: Wikipedia is the one-stop reference du jour, an excellent tool, well-supervised, accountable, offering three main advantages over the print “ancestors,” these being — and on she went at bullet speed.
After this, however, there comes an easing of the incessant hyperlinking and a sense that the main discussion is being engaged, which is good for Bustillos, as my lateral propulsion had me all but lifting off the page. The Wikipedia pitch was obvious enough, but now it appears that more than the efficacy of Wikipedia is at issue. With minimal transition, Bustillos introduces cyber-theorist Bob Stein, identified as the founder and co-director of the Institute for the Future of the Book, who then serves a kind of emcee function, invoking Marshall McLuhan as the thinker to reckon with on the question of digital collaboration. All sorts of potential links have been left unhighlighted, and I find myself decelerating, getting more engaged. Marshall McLuhan. Here is the old master, the original media pundit, whose name I can never encounter without flashing back fifty years to dinner parties my parents hosted. I would eavesdrop: all this talk about the “global village” and “the medium is the message.” I remember the vibration in the air — this was something new, this had people going.
Now Bustillos has Stein recalling McLuhan, adapting him to his particular purpose — or is it Bustillos adapting Stein? — working the article to build toward his dramatic assertion, which I will get to, which I want to put across as the culmination it was, it being the sentence that lanced through my screen and got me so agitated. But there is something like a world-view that needs to be set out first.
Once McLuhan has been introduced, Bustillos enlarges the field of action, quoting the man from his 1969 Playboy interview:
The computer [thus] holds out the promise of a technologically engendered state of universal understanding and unity, a state of absorption in the Logos that could knit mankind into one family and create a perpetuity of harmony and peace.
I can’t but feel some dissonance here. How not? I’m reading these words in May of 2011, when it takes the fingers of both hands to count up the world’s conflicts and hot spots of dissent, sites where “understanding and unity” have completely collapsed. But this is not the real point. And because the topic is, nominally, Wikipedia, and because Bob Stein has been brought in to raise the stakes, and because there is a technological vision being put forward that can no longer be discounted, I read. I can ignore the harmony and peace part of the formulation, but the other, the idea that we are being almost irresistibly gathered into a “technologically engendered state,” that has a renewed heat.
This “state,” for me, is the issue, the situation, and the crisis, and something about this casually strung-together article — its references, assumptions and attitudes — incites me in ways that other, possibly more reasoned and articulate presentations have not done.
Having claimed McLuhan for her side (the sides originally set up as Wikipedia versus Britannica), Bustillos moves to take on the idea of the expert, and, implicitly, authority itself. She starts by referencing McLuhan’s years at Cambridge, when he was studying to be a literary critic (!), how he fell under the influence of the New Critics. “Before these rationalists came on the scene,” she writes, “literary criticism had a mystical character rooted in the Romantic ideas of guys like Walter Pater…” That “guys” says a great deal about the status of that cultural legacy for Bustillos (or else about me and my sensitivity to insult). But this is not her point. Her point is that McLuhan, through his exposure to the “English Department renegade” F. R. Leavis, “developed the beginnings of the life-long distaste for ‘expertise’ and ‘authority’ that would come to characterize his work.”
Keeping in mind that this is an article promoting Wikipedia not only as the new authority-less authoritative reference source, but as the paradigm of the rapidly changing way of things, the McLuhan invocation makes more sense. After sketching in McLuhan’s early development as a media theorist, his recognition about how technologies alter the structure of our thinking, and the powerful influence on his thought of Canadian thinker Harold Innis, who was known less for his reasoned argumentation than for offering his reader a kind of mosaic of facts, perspectives and surmises, Bustillos proposes that the various elements of McLuhan’s approach,
the abandonment of “point of view,” the willingness to consider the present with the same urgency as the past, the borrowing “of wit and wisdom from any man who is capable of lending us either,” the desire to understand the mechanisms by which we are made to understand — are also cornerstones of intellectual innovation in the Internet age.
She then asks and simultaneously asserts “How neatly does this dovetail into a subtle and surprising new appreciation of the communal knowledge-making at Wikipedia?!” McLuhan, so long mothballed as a Founding Father, has been re-purposed as the patron saint of this anti-expert “communal knowledge-making.”
At this point Bustillos makes an abrupt shift of focus. She brings her lens to rest on a contemporary media thinker, Jaron Lanier, and his recent critique of what he calls “the hive mind,” which is in essence the collective process she has been celebrating. Lanier, in his important 2006 article “Digital Maoism,” and much more recently in his book-length manifesto You Are Not a Gadget, has stood up for both individuality and authorship. I had not, until recently, been aware that these were demon-concepts. But in some quarters they clearly are. Of his position, Bustillos quips: “…reading his stuff is like watching a guy lose his shirt at the roulette wheel and still he keeps putting everything on the same number.” Here I am far less interested in Bustillos’ reasoning, which is mainly that of the leap-frogging enthusiast, than I am in the assumptive tone, the manner, the confidence. It is a tone we often hear in the voices of those who believe their historical moment has come. It dares a mocking intonation, a casual dismissiveness: Pater is a “guy” and Lanier, standing up for individuality and authorship, well, he too is a “guy.”
I would not fasten so readily on Bustillos’ breezily casual mode if it did not seem to be, beyond a baiting tactic, a harbinger of new attitudes. How better to get after any stance or idea than by first de-dignifying its proponents? Tolstoy — who cyber-agitator Clay Shirky dismissed as unreadable two years ago — becomes the guy who wrote all those long battle scenes, Michelangelo the guy who painted all those great abs on people…
But here the man under attack is one who has ventured to question the ultimate value of Internet group-think, and who has the temerity to speak for the importance of the individual subject. Why does she choose Lanier? Because he has for years been a Silicon Valley insider, one of its central thinkers, and heretical assertions like his — from within the fold — cannot be countenanced. The speed with which various bedrock human assumptions — the value of the individual self, e.g. — are lapel-flicked away I find breathtaking. Bustillos finds it “difficult to see how Lanier … will be able to keep this sort of thing up for much longer. Michael Agger tore Lanier’s book to ribbons in Slate: ‘[Lanier’s] critique is ultimately just a particular brand of snobbery. [He] is a Romantic snob. He believes in individual genius and creativity…'” Again, we need to note which terms are being tagged derisively.
Bustillos is not done. She now again quotes Bob Stein, this time his review of Lanier’s “Digital Maoism” article:
At its core, Jaron’s piece defends the traditional role of the individual author, particularly the hierarchy that renders readers as passive recipients of an author’s wisdom. Jaron is fundamentally resistant to the new emerging sense of the author as moderator — someone able to marshal “the wisdom of the network.”
The author rushes in behind Stein: “Events have long ago overtaken the small matter of ‘the independent author,'” she writes. “The question that counts now is: the line between author and reader is blurring, whether we like it or not.” I can’t quite parse out the question there, but I see that Bustillos believes that the traditional place of the author is being superseded.
In the world according to 2.0, these are deemed to be some of the big changes of our moment. Expertise, authorship, individual creativity: out. Team collaborations, Wikipedia: in. Inevitably: “Knowledge is growing more broadly and immediately participatory and collaborative by the moment.”
And now I come to it, without as much of a drum roll as I’d hoped, the last word, the formulation that penetrated my stimulus screen. Bob Stein gets the last important words, but what an epigram they make. Simply: “The sadness of our age is characterized by the shackles of individualism.”
I leave a space here — a moment in which to re-read and ponder those words.
It may seem odd that I’ve spent so much time summarizing and quoting from a web-posted pro-Wikipedia polemic, but this piece struck me, got down under my skin more irritatingly than most, and I needed to understand why. I think I do now. Here, in one place, I find not only a number of the issues I have been worrying for some time, but also, as I’ve suggested, some of the attitudes and assumptions that inform the situation, compose the climate in which the transformations are taking place. This “climate” has been the hardest thing to isolate for reflection, for it is a totality, an environment, a cultural Zeitgeist. I have tagged it for myself with a re-phrasing of the outworn idiom. “What if the elephant in the room,” I ask, “is the room itself?” Reading Bustillos’ article was the closest I’ve come to identifying that vast intangible for myself. It was there, as much in her definitional jockeying, her style of differentiating between worldviews, as in the thematic implication of the ideas themselves. Reading it, I thought: here are the pieces I need.
“Wikipedia and the Death of the Expert.” The title is as good a place to begin as any other. I note the immediate polarization of concepts — Wikipedia and expert — and the arresting announcement of the death of the latter. And though Bustillos does not establish causality — she has not called it “How Wikipedia killed the expert” — some of that implication inevitably attaches. “The king died and then the queen died is a story,” wrote E.M. Forster in his well-known distinction between a story and a plot. “The king died, and then the queen died of grief is a plot.” What Bustillos’ title offers as a story, is, for me, a plot. A causal narrative. Whatever term we decide on, it is a serious matter, one that fills me with some of the queen’s grief.
Let me address the main business straight on. Big as the Wikipedia question is — the question of the collaborative production of information — there are deeper issues still, issues for which Wikipedia versus Britannica, Bustillos’s comparative point of departure, is only the outer sign. And indeed, Wikipedia versus Britannica is not really even a viable polarization. After all, both are, though in clearly differing ways, collectivized enterprises looking to deliver accessible expertise to users. Bustillos’s real agenda, which she gets at by way of issues of said expertise and of collaboration, is to lay out two diametrically opposed conceptions of the human and then, in effect, to cast her vote. Here we have the split, the road-fork issuing in two paths that would with every step take the pilgrim on one further from his counterpart on the other. There is no eventual convergence. The one is the path — the ideal — of the individualized self, the other is the path of the socially and neurally collectivized self, along which, at some undetermined point, the idea of “self” itself must blur away, become a term no longer applicable.
There have been various iterations of this latter idea, starting perhaps with theologian Teilhard de Chardin’s spiritualized imagining that there will one day exist what he called a noosphere, a kind of rapture belt of merged human identity girdling the planet. Then there was Forster’s prescient depiction, in his story “The Machine Stops,” of a world of beings living in isolated cells, interconnected by a communications network that is uncannily like the Internet. And, much more recently, we have media theorist Kevin Kelly’s variously expressed ideas about the “hive,” a world in which the electronic connections between people have fused to become a quasi-nervous system, bringing about a kind of cognitive collectivism. It was Kelly, too, who has been most vigorous in advancing the idea of the universalized book: all the world’s texts and data scanned into one vast keyword-searchable database.
Kelly was most certainly the thinker Lanier had in mind when he spoke out against the dangers of the hive mentality. As he put it in “Digital Maoism”: “the hive mind is for the most part stupid and boring.” And: “The beauty of the Internet is that it connects people. The value is in the other people. If we start to believe that the Internet is itself an entity that has something to say, we’re devaluing those people and making ourselves into idiots.” Kant this is not, granted, but Bustillos exhibits nothing but sneering contempt for the expression, responding: “I guess if we started to believe that the Internet itself were writing Wikipedia we would be in some trouble, or maybe we would be Rod Serling, I don’t know.”
Electronic collectivism has very quickly gone from being a sci-fi imagining to being a plausible scenario that more and more people, at least those active in the computer-culture, would endorse for us all. It will be objected that the ambitions of the cyber-sector don’t have that much to do with the life of the culture at large. But one could similarly say that the decisions made by a few thousand members of the investment banking community don’t affect us either. In fact, there is a connection between the ideas held by that minority and the lives that the rest of us live. The connection is technology, and McLuhan framed it early on:
It is not only our material environment that is transformed by our machinery. We take our technology into the deepest recesses of our souls. Our view of reality, our structures of meaning, our sense of identity — all are touched and transformed by the technologies which we have allowed to mediate between ourselves and the world. We create machines in our own image and they, in turn, recreate us in theirs.
The sage of Toronto, quoted by Bustillos, encapsulates a great deal here, and it is exactly to the point. The cyber-sector, numerically a minority, could be said in important ways to have a majority voting interest so far as the development, promotion and implementation of technology goes. These are the engineers and marketers behind the enormously influential I-technologies, all of the daily-more-sophisticated screen devices that have come to seem indispensable to people the world over — from cell phones to tablets to reading devices of all descriptions. Backed by huge corporate interests, marketed through the global media, these interactive devices (and their consumer images) exert massive collective — and collectivizing — effects, and for the very reasons McLuhan hypothesized. We use them in prescribed ways and they determine not just our obvious external reflexes, our ways of doing business, but they also seep into our deeper selves, what McLuhan quite surprisingly calls our “souls.” And in this way, without even officially signing on to hive-oriented behavior and thinking, we begin to manifest it.
The point is that these technologies are not used in instrumentally isolated ways. Rather, they create a community of users and a complexly self-reinforcing culture of expectations. This culture, this environment — how well we know it — becomes ever more difficult to step away from; and it has various socially coercive implications. Consider the obvious case in point: the cell phone in its current incarnation. What was at point of origin a distinct-use, one-to-one voice-transmitting device has become a mind-boggling locus of seemingly indispensable functions, allowing multiple lines of connection between users, giving access to the great data-stream of the Internet, and subtly and not-so-subtly creating new expectations, like that of reachability, of locatability. To own a cell phone is to register yourself in the family of all users; it is to take your place in a network of indeterminate complexity, announcing, in effect, that you are in technological “range” 24/7. Hardly 1984, I agree. But it’s worth considering the ways in which we are conditioned by our systems interactions. The cell phone is only one such system, the Internet is another, and the economy, as we enter it through our credit-cards and ATM transactions, is a third. These are just salient instances. Each of these systems, though we rarely think of it this way, yields itself to us by way of numbers and codes; and each by way of other codes dictates the sequence of our actions, the levels of access we are allowed, and whether or not it will function for us.
Letters, numbers, codes — the new coins of the realm. Of course there are effects and consequences. Engaging these systems, we learn right away that the codes and numbers — our identity proxies — facilitate our movement through the electronic slipstream. We don’t really believe that we become some string of digits or password; we accept this as part of how the system works, just as we accepted long ago our personal all-purpose social security number. Possibly, though, we have noticed that the more we transact our activities via Internet circuits, the less we ever interact with a person, even electronically. Almost all “situations,” be they purchases, reservations, account inquiries, trouble-shooting needs, or whatever, are dealt with by our filling in mandatory fields, delivering data and, again, codes.
And, of course, every such delivery of data further adumbrates one’s virtual profile — describes more precisely one’s preferences, habits, and ailments, so that, as users of the Internet (and this is old news) we are addressed ever more often, and with more honed-in specificity, by way of this pseudo-personhood. How many times daily am I now addressed as “Dear Sven” by software programs behind which, I am certain, no intending individual lurks? This is all such a familiar scenario, much of it long ago imagined for us — and thereby, curiously, not intensified but defanged — by various anxious dystopian intellects. So that instead of worrying, “Oh dear, we were warned of a Brave New World by Aldous Huxley,” we will tune into more or less the same scenario served up as spoof on a cable channel, and say, “Well, if television thinks this is a laugh …” Never mind that television thinks almost everything is a laugh and in the space of a 24-hour news cycle can turn most any crisis or catastrophe into skit material or, in the adjacent gaming world, into a first-person shooter program. How do we cope with this growing alienation, this intensifying dissociation from what was formerly the human sphere? Human nature, like Nature itself, abhors a vacuum — so quite understandably we respond by reaching more avidly toward the people we know, or at least have some proxy contact with, and — here’s the rub — we often do so by way of the screen. Thus we deepen and extend the circuits.
Obviously I’m speaking in vastly generalized hypothetical terms, trying to get a bead on what I am calling “the room that is itself the elephant in the room.” Maybe I should say a bit more about that little koan-like formulation. What I am getting at is my sense that the biggest transformation that has befallen us — that we have ourselves helped bring about — is so total in its nature, and so driven by unseen elements (circuits and signals), that we have no point of purchase for talking about it, and therefore don’t; or else, no less distressingly, we imagine that because we cannot see it, cannot put a finger directly on it, nothing much has changed, and that we are still moving about in the status quo ante. (Or status quo eleph-ante.)
An elephant should be identifiable, even if it is by way of its various isolated attributes, as happens in the famous cartoon with the group of blindfolded men. One of these is the change in the status of the “expert.” Bustillos argues that knowledge, and the presentation of knowledge — its argumentation — is shifting away from the hierarchical and analytic mode, and becoming more and more an open field of discrete elements, which can be searched and assembled according to need and use. She harkens to the modernists, to McLuhan and his mentor Harold Innis, and ends up espousing the Wikipedia model, a group-generated assemblage that can be updated and modified at any time. From the vantage of the Wikipedia project, the idea of the lone individual — thinker, scholar — is quixotic at best, readily mocked as a hubristic hold-over from a bygone era, as “old school.”
There’s no question that the world is much changed. If the exceptional scholar could once hold great quantities of the world’s information comprehensively in mind, she does so with less and less ease and confidence today. The computer, meanwhile, has enabled new levels of interactive collaboration, and Wikipedia is one calling card. But to suggest from that instance that the era of individual initiative in all fields has passed, that the notion of expertise has become laughable — well, this is another of those intellectual climate-transformations, those Zeitgeist effects, that I’m talking about here. Bustillos seems to be not so much advancing her own argument here as giving voice to assumptions now abroad in the land, at least certain zip code areas of it. Her assertion is tinted with, if not yet saturated by, the collectivist premise by which the private self is an outmoded concept, perhaps elitist, threateningly counter-collective. In this vision, to quote Bob Stein again, “the sadness of our age is characterized by the shackles of individualism.”
What has happened? Let’s not forget that we were all very recently immersed in the opposite, in the culture of self-actualization. The period leading up to the millennium was, among other things, the Therapeutic Age, millions of us bent on finding and empowering the sovereign “I.” It can’t be that in a few short years we flipped from heads to tails. One could make the argument that even as actualization fever carried the day, the cyber-grid was being assembled, its new procedures gaining ground, and that the implications, though latent, were registered. One could even argue, granting that that latency was broadly intuited, that the great push toward “self” was to some degree exacerbated by the specter of mass existence, and represented a preemptive surge. Possibly it was not preemptive enough. About forces and effects so widespread and inchoate we can only speculate.
Though there are any number of shifts and transformations to be reckoned, all having to do with why the room feels so very different, I will have to let the some stand for the many. But one more element, no less significant, needs to be discussed. It was not a part of Bustillos’ broad strokes picture, but I will make it part of mine. I mean the arrival of the “cloud,” that epic new development in computing.
We hear about cloud computing everywhere these days. It represents the next great leap forward. Premised on a dispersed systems network, cloud computing liberates digital material from hard-drive transmission and creates a free-floating collectively-accessible data saturation. Among other things, this digital saturation hastens the across-the-board obsolescence of a whole class of physical item. We are all witnessing the sudden dematerialization of our arts and entertainment, their transfer from unique artifact source to universally on-demand screen availability. Walk down Main Street. The video and DVD emporia are gone — it happened in a year or two. The bookstores, if not yet folded, are quickly going the way of record stores. More and more people are persuaded to access their culture through screen portals, ordering up what they need for their Kindle, their iPod, their nightly watching pleasures. And the middle men, the algorithmically nimble purveyors of books and music and film, increasingly access them — us — identifying what they think we want and laying it on our electronic doorstep.
Of course there are advantages to all this. Instant access and narrowcasting to gratify our preferences. But there are enormous losses, too. The main one, front and center, call it another pachyderm, is the public and private disappearance of the physical evidence of our tastes and desires. There will be no records or CDs on the shelves of the future, few if any books. Everything will live in bits, in files — and how can this not modify the general atmosphere? We are removing the physical markers of culture from our collective midst. For a record store was not just a place to get records, as a bookstore was not only for finding the needed read. These were sites where the love of music and literature announced themselves across a spectrum of tastes. And though they were commercial entities, these emporia also symbolized the presence, the value, of their product to the community.
What distresses me about the transfer from thing to cloud — it was Karl Marx who lamented that “all that’s solid melts into air” — is not just the loss of the object, the fetish, the thing, but also the larger thematic implications. Of course we will never dispense with physicality altogether: even the characters in Forster’s extreme parable had bodies that lived in cell-like structures. But the primal materiality that governed the terms of existence is being by degrees, quick degrees, put at a distance. In his book The End of Nature, Bill McKibben argued with some persuasiveness that we have tamed nature and domesticated the idea of it. Nature is now for vacations or high-priced adventurings; or else it is, for the fortunate majority of us, a catastrophe spectacle, something else for the AOL home page slide show: tsunami, tornado, calving iceberg.
We might try on the big picture for a moment, imagining the terms of physical existence as they were a hundred years ago for the average person, and comparing these with the present. I won’t itemize, though I could. The short version is that the world, its elements, its nouns, has receded, as has its intractability, the defining obstacles of time and space. It’s almost as if world and screen were in inverse relation, the former fading as the latter keeps gaining in reach, in definition, in its power to compel our attention.
Everything about the “room” has changed, and this fact is what makes it so hard to speak of the room.
There is also the vital and vexing question: are collaborative, collectivized ways of living entirely at odds with subjectivized ones? Doesn’t it make sense to imagine a world where we use systems and circuits to do what they do best and indulge our clamoring selfhood in our ever-more-abundant spare time? It sounds good: The Jetsons meets Abraham Maslow. But, alas, it also sounds like a version of the old “it’s just a tool” argument. Maybe things could work that way if our electronic living were not the saturation that it is fast becoming. But if we grant McLuhan’s point, that we “take our technology into the deepest recesses of our souls,” and that our “view of reality, our structures of meaning, our sense of identity — all are touched and transformed …” then we cannot find easy comfort in the both/and perspective.
We need to voice and then address the if. If it is so — if via its almost irresistible systems and skillfully marketed product ideologies, combined with shrinking available venues for private initiative and a broad-based acceptance of the idea that there is nothing at all unnatural about human beings living in steady self-reinforcing electronic milieus — if this is what we are steering toward, preening ourselves as we quash dissent with ridicule, taking lightly the human outcomes — what then?
This is obviously a fear phrasing itself as an unanswerable question. It marks the place I had thought to stop, except that I now feel a nagging sense of incompleteness — structurally, if not metaphysically. I did, after all, introduce the image of the diverging paths, one of which I claimed was moving so ominously toward collectivized, dematerialized screen living. What about the other? It’s somehow not enough just to say that the other is the abandoned way, some pre-digital place of living where we might have remained, but didn’t. The deeper issue is more whether there is still a choice, and, if so, what that might be, and how would a person live it through?
This, I would say, is the vexation of life these days, and has been for some years. How to make room for the private self, how to defend its admittedly intangible claims in the face of so much change? I have no one thought in response, but many. But I will stay with my most recent contemplation. This derives in part from Rainer Maria Rilke, from his Duino Elegies, which I picked up again the other day with the feeling that there was something there I wanted. And there was.
I don’t think I could deliver a summary explanation of the Elegies here. Packed with thought and spirit matter, Rilke’s ten long poems take on the biggest human questions, often proceeding via what feel like obsessive digressions about love, death, self-making; if any writer has put the existential self at the core of his work, it is this poet. And if there is a 180-degree reversal of the human picture Bustillos presents, Rilke is its herald.
Rilke’s watchword, the driving concept of the elegies, is transformation. He sees our tenancy on earth as fragile; he registers an anxiety which, though it predates the information age, is uncannily like the anxiety many of us live with every day. But where the impulse of our age is clearly toward instrumental mastery, toward what is, in effect, the invention of a parallel realm in which we all collaborate and, perhaps, move toward some kind of social merging, he offers up the difficult other course. Instead of turning from the demands imposed on individual being — which is to say, at root, solitary being — he urges fronting that world, taking it in, suffering it, and in the process, though with no guarantee of success, transforming it.
This idea, akin to Keats’ notion of “soul-making,” is the gist of his great ninth Elegy. The language is at once mystical and intimate. Posing the question “why / have to be human, and, shunning Destiny, / long for Destiny?” he responds:
…because being here amounts to so much, because all
this Here and Now, so fleeting, seems to require us and strangely
concern us. Us the most fleeting of all. Just once,
everything, only for once. Once and no more. And we, too,
once. And never again. But this
having been once, though only once,
having been once on earth — can it ever be cancelled?
— (tr. J. B. Leishman and Stephen Spender)
The turn here, the vital moment, is in Rilke’s saying that it is the world that needs us. What ever can he mean? It is almost as if existence were in some way a collusion between levels of animate being, our consciousness and the very different sentience of what he calls the creature world, of Nature — as if our philosophical and psychological and spiritual purpose were to bring that world into consciousness, raise it. But not collectively, into a noosphere, and not digitally, into a cloud of data, but subjectively, inwardly, into language. Some lines further, in one of the famous passages from this Elegy, he asks:
Are we, perhaps, here just for saying: House,
Bridge, Fountain, Gate, Jug, Olive tree, Window, —
possibly: Pillar, Tower? … but for saying, remember,
oh, for such saying as never the things themselves
hoped so intensely to be.
Earth, isn’t this what you want: an invisible
re-arising in us? Is it not your dream
to be one day invisible? Earth! Invisible!
We could not be further from Bustillos and her advocacy not just of Wikipedia but of what she sees as our newly ascendant mode of living — and yet, yet, there is something both “visions” invoke, though in very different ways. This is the transformation of the old material given, the world, our natural origin. The digital path would move us away by building a new world, with new human rules, and placing it squarely atop the old. Rilke, speaking out of his age, drawing on his Nietzsche, Rodin, Cézanne, his genius influences, asks that we take the world in, swallow it in our living, and then labor to spin it into the stuff of a higher awareness.
Both scenarios, I know, sound bizarre, radically dissociated from the life of the moment any of us are likely to be in. Asked whether he believed that objects only existed when perceived, Samuel Johnson famously kicked a large stone and said: “I refute it thus.” So we sit with our cup with its damp tea bag, our hatched-over day care schedule, wincing from the paper cut we just administered to ourselves, and we deem our living a clear refutation of both hive-life and transcendent subjectivity. But don’t we also know other moods, other mind-states, occasions when it strikes us that the world is indeed changing, and changing in ways that escape our easy reckoning, but which sometimes waken in us, depending on the day, depending on our nature, either bursts of quiet exaltation or else premonitions of some deeper dread?