In the weeks following the St. Louis Cardinals’ National League Championship Series loss to the San Francisco Giants, my son Elijah and I frequently talked baseball. Our beloved Cardinals had been up three games to one in the seven-game set, and then lost three in a row. Struck us as long odds.
The same scenario had played out in 1996 when the Cards lost the NLCS to the Atlanta Braves. “That was when Gary Gaetti hit a grand slam off Greg Maddux,” I said as I remembered lip-reading the pitcher’s screamed obscenity on national television. In my mind I can still see him storming off to the left of the mound and pacing around for a minute before being removed by Bobby Cox. My mental imagery of Maddux’s frustration is sharp, not only in that situation but in others—red-faced and sweaty, he’d push back his cap, frowning like a boy—gather himself, take the mound, stare in at the catcher. One of the best finesse pitchers of his generation, perhaps of all time, he will likely be inducted into the Hall of Fame in 2014. What had been the odds of a B-slugger like Gaetti hitting a home run against him?
So Elijah and I also talked about probability in general. Baseball, an untimed sport, lends itself to such talk. Baseball moves by events—out by out, inning by inning. Theoretically a baseball game could go on for days, even months. Sitting at the Wooden Nickel Pub in Hillsborough, waiting for our Reubens to arrive, we wondered how long the longest baseball game in history had been. We wondered how high the highest score had been. We wondered these things and, since we both have iPhones, we instinctively began to get them out. I stopped: “Wait. Let’s not.” He said, “Okay,” thought for a second and added, “But what’s the point?” I capitulated. We had the technology, could learn in thirty seconds the answers to both questions, so why not?
Last night, sitting in the same pub with my friend Adam, I noticed Jeff Buckley’s “Hallelujah” had begun to play on the jukebox. I asked Adam if he’d seen World Magazine editor Marvin Olasky’s recent call for new lyrics to the song. “New lyrics?” Adam asked, a bit shocked. “It’s supposed to be an ironic song, isn’t it?” I began to recap Olasky’s editorial; we carped about Christians wanting to Christianize everything and then talked about Leonard Cohen, his faith, and so forth. This is how conversations unfold in the real world. Adam mentioned that Buckley had died relatively young, in a bizarre accident. He pulled out his Android phone (he’s a software developer for Microsoft) and read aloud the account of Buckley’s drowning in “a channel of the Mississippi River”—the words of Wikipedia. Suddenly I could hear, in the voice on the jukebox, the sadness of death to come.
At the next table sat a group of five women evidently having a baby shower or birthday party. A gift bag occupied one of the chairs. Each woman would occasionally look down at her smartphone. I noted, at the end of their gathering, phones lying atop to-go boxes and almost made a Twitter post of it but couldn’t quite frame it right. Also, there was something in me that wanted to observe the phones on the boxes and just let them be there, in their natural habitat. Let their irony, whatever irony they contain, remain in my mind and memory.
But I’m compelled to inscribe—if not to inscribe, to know. The human race must be busier than ever, knowing and retelling, talking and reading, liking and scrolling, deleting and adding—friending, following. Our intelligent, eminently portable devices make such compulsion possible. Inside the busyness, though, lurks an emptiness, at least as I’ve experienced it.
I’m 41 and remember when cable television came out, when VHS was introduced, and when Blockbuster first opened a store in our neighbourhood. I remember the advent of MTV. I remember a darker time before that, when no song had a “video,” every phone had a spiraling cord that led to a wall-mounted or table-top telephone, men wore polyester shirts and loud neckties and sweet cologne. It now seems such a dim and mysterious time. I remember, in the late ’70s, climbing the steps to the balcony of my parents’ house in Des Peres, Missouri, and listening to Jackson Five and Bobbie Gentry records, or Superman 45s.
In 2003, when I became a high school English teacher, I installed an Onkyo turntable in my classroom, and within a week its diamond needle was broken. Students had been making like DJs, scratching records and beatboxing. They approached the turntable, with its clear plastic lid, counterweighted tone-arm, and spinning platter, as a novelty, a relic. But even that was nine years ago, before smartphones were commonplace. Encyclopedia Britannica was just getting its multimedia act together. Wikipedia was only two years old, and students were already plagiarizing from it.
Ah, traditional reference. Never did I imagine, even in the early 1990s when I worked for Macmillan Publishing Co. and my friends were signing up for their first AOL accounts, that all that information and connectivity would one day be available through a pocket device. I mean, all of it, and, for those of us with dexterous, uncalloused fingers, accessible within seconds. Never could I have imagined that those Baker & Taylor and Ingram’s warehouses full of pallets full of cardboard boxes full of books would, in twenty years, be rendered practically unnecessary. But they are now so, if not obsolete. We had half-expected a Jetsons future with flying cars, servo-droids, automatic food. We had expected the physical world to be the primary locus of change. We thought robots were on the horizon, not ubiquitous information and media.
I’m not worried about these changes anymore. I used to be, on and off, for the past fifteen years. Growing up I’d handled so many books—memorized so many poems and acted in so many musicals and plays. I’d accepted the mantle of poet with the assumption that words, good words, were rare and required cultivation, that they smelled like paper. That they could be folded and enclosed in letters. I liked to sit in a carrel deep in the stacks and daydream, look at things. Places mesmerized me, and I participated in them existentially, their aromas, textures, qualities of light. People’s faces and clothing fascinated me—the subtleties of their dialects, hair styles, the social play between them. I was an observer and, to whatever extent possible, a writer of what I observed. There was a time when I liked just being in places. Driving across the country, listening to cassettes or whatever radio we could pick up, it was just us and the scenery. Now that I’m permanently connected, I almost can’t imagine that anymore. I have six or seven old fountain pens whose nibs are dry and scratchy. Every once in a while I resuscitate one, but within a month it’s dry again.
In the summer of 2008, Nicholas Carr (according to Wikipedia, “an American writer who has published books and articles on technology, business, and culture. His book The Shallows: What the Internet Is Doing to Our Brains was a finalist for the 2011 Pulitzer Prize in General Nonfiction”) published a fine essay in The Atlantic titled “Is Google Making Us Stupid?” His lament sounds like mine:
Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going—so far as I can tell—but it’s changing. I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.
Okay. In the sixth paragraph I wrote, “Inside the busyness lurks an emptiness,” then later said, “I’m not worried about these changes anymore,” and then I quoted Carr. I realize that what I mean to say is that I’m trying not to be worried about these changes, that I’m doing my best to, as the kids say, “roll with them”—perhaps even to subsume them under the header, How I Roll. I’m trying to make sense of them in my poetry, adapting to the quick-cutting, hyper-referential, fact-based mode of Internet-sourced mental processes. Though I’m not technically an Alt Lit type of guy, I like to play with the tone of encyclopedic authority, to parody the cocky blog comment, the ill-conceived tweet, suffuse lyrical verse with net lingo. I tinker with lines of poetry as if I’m sending myself text messages, linking to my own mental pages. So, for me, for my art such as it is, the decline of what Carr calls “deep reading” has become an aesthetic asset. I allow my verbal constructions to be a thin ice of oddly specific attentiveness floating on an ocean of inarticulate memory and emotion. In my poetry I let myself see and re-see, say and revise and revise again, as though I’m clicking through my own Wiki version history. It’s, you know, palimpsestic. It’s also Modern, as we see in T.S. Eliot’s “The Love Song of J. Alfred Prufrock”:
Time for you and time for me,
And time yet for a hundred indecisions,
And for a hundred visions and revisions,
Before the taking of a toast and tea.
For Eliot et al., the progress from horse-drawn carriage to steam engine to automobile (“And if it rains, a closed car at four,” he writes in The Waste Land), from tailored clothes to sack suits, from Pony Express to telegram, had been the dawn of a new cut-and-paste world that would eventually yield Imagism and Cubism and Abstract Expressionism, among other cultural and philosophical changes.
So yeah, I’m cool with this, I think. May Walt Disney’s hyperactive Tomorrowland and Jules Verne’s brass-bolted machines and Ray Bradbury’s fear of totalitarian apocalypse settle into a rolling boil of everything accessible but nothing exceptional, and may we merely eat ourselves. May the 52,310-ton Titanic of Modern grandiosity sink not into postmodern cynicism but into a new sense of how introspective we might yet become while seeming alert and almost entirely external. May we fit the first half of 2 Timothy 3:7, “always learning,” while avoiding its second, “but never able to arrive at a knowledge of the truth.” At least may we be always clicking and reading, always knowing, eminently aware if for no other reason than that information is always available to us at ever-increasing speeds via multiple forms of communications technology. And may we meme into eternity.
Sitting at the Wooden Nickel, in my physical body, wearing my particular clothes, I’m presented with a different, or at least complementary, range of possibilities. There’s one tall server who laughs when he mishears orders. There’s one bearded bartender, shorter and a bit heavier set, who moves nimbly through the crowd of patrons and seems to know everyone personally; when he throws a bottle or can at the recycling container behind the bar, from whatever distance, he hits it and privately pumps his fist. This reality was not available via the Internet until now, and even these words do not contain it. They point to it, but inadequately. They’re a sketch.
So I am okay with the Internet and the new immediacy of connection—am, as it were, signing a provisional peace treaty with it—for the same reason I accept language itself. Both present thousands, millions of small signals that swarm about real experience. The swarming is inadequate, and often superficial, but that doesn’t mean it ought to be rejected. Part of accepting new media, for me, has been coming to terms with the encompassing truth that there’s no such thing as “pure” human communication—what Ludwig Wittgenstein termed “ideal language.” Language is competent to its task in a basic sense (“Let’s meet at seven”) but there will always be a fullness that eludes it. T.S. Eliot touches upon this in “Burnt Norton”:
Words strain,
Crack and sometimes break, under the burden,
Under the tension, slip, slide, perish,
Decay with imprecision, will not stay in place,
Will not stay still. Shrieking voices
Scolding, mocking, or merely chattering,
Always assail them.
Christian theology has historically shared this sense that our words can never capture the Word. The word of God Himself, the word that brought all of this into existence, happens on a frequency above ours. The “secret and hidden wisdom of God,” writes Paul in First Corinthians, cannot be seen, heard, or imagined by humans without “God [having] revealed [it] to us through his Spirit.” And yet what does the Spirit give us to reveal this? The Word in words. The Word becomes flesh and meets us where we are—physical creatures in a physical world.
As it turns out, the longest game in MLB history, at least by elapsed time, happened relatively recently, in Chicago, in 1984. White Sox outfielder Harold Baines ended it with a home run in the bottom of the twenty-fifth inning. It was a two-day affair. According to Baseball-Reference.com, it was also the longest single outing by a catcher; the White Sox’s Carlton Fisk played all twenty-five innings. The modern record for most runs scored against an opponent in a single MLB game was set in 2007 when the Texas Rangers crushed the Baltimore Orioles 30-3. Elijah and I did, via iPhone 3G, discover these facts whilst awaiting our Reubens. I recently learned that the Reuben, by far my favourite sandwich, is a twentieth-century invention of uncertain origin. Wikipedia reports several possible “first instances” of the Reuben.
Although I’m more interested in eating a Reuben than in reading about one, I’m okay with learning about its history by sliding my finger around a small glass screen, even in public. I conclude that this practice, when performed before dinner at the Wooden Nickel, is more a matter of etiquette than what Carr calls “remapping neural circuitry.” If Elijah and I agree that we want some answers, why not? It doesn’t particularly interfere with our dinner.