“In a way, nobody sees a flower really, it is so small, we haven’t time—to see takes time, like to have a friend takes time.” (Georgia O’Keeffe)
Benedictine monks never imagined that their new technology, designed to help workers unwind, would eventually wrap workers around the axle. William Farish never imagined that his technological innovation would make his profession meaningless. New technologies are wonderful in what they promise to do, yet we are often “incapable of imagining what they will undo,” media analyst Neil Postman wrote in Technopoly (Random House, 1993).
Twitter has been described as perhaps one of the greatest technological innovations since the telegraph. It will better connect us. Perhaps. But what will it undo?
Benedictine monks invented the mechanical clock in the 12th century to remind workers to take periodic Sabbath breaks. They never imagined someone like Frederick Taylor, known as the Father of Scientific Management, would use clocks to time workers in order to increase productivity.
William Farish (a Cambridge University tutor) never imagined his idea of numerical grading—unheard of before his time—would eventually marginalize mentoring. Before 1792, students were evaluated through dialogue, not digits. This conversation required a tutor. Numerical grading has wiped out mentoring.
Now consider Twitter—a wonderful new technology promising us the world. It can do a lot. What might it undo? How about paying attention?
In Proverbs 1:24, Wisdom calls out on the streets—yet people refuse to listen. The parallel rephrasing in the verse tells us why: “I stretched out my hand and no one paid attention.” Paying attention is an essential part of gaining wisdom. That’s important to recognize, since all communication is, by its very nature, an interruption of attention. That includes TV, Twitter, text messaging, telephones—and simple conversations. Even “In the beginning . . .” was an interruption—a disruption of the prevailing assumptions of its time. It still is. Here’s our challenge: too many interruptions reduce the ability to pay attention.
Interruption wasn’t always a problem. Until recently, communication was conducted through technologies that were polite—that waited until you were ready to pay attention. The notice posted on the town square. The letter in your box.
But beginning in the early 1800s, technologies accelerated interruptions. It was a period of dazzling technological advances, a “communications revolution,” writes historian Daniel Walker Howe in What Hath God Wrought: The Transformation of America, 1815-1848 (Oxford University Press, 2007). Americans were conquering “the first enemy” (to use Fernand Braudel’s phrase): distance. Samuel Morse sent the first telegraph message in 1844. By the 1870s, 650,000 miles of wire and 30,000 miles of submarine cable had been laid. A message could be sent from London to Bombay and back in as little as four minutes, according to Tom Standage in The Victorian Internet: The Remarkable Story of the Telegraph and the Nineteenth Century’s On-Line Pioneers (Walker Publishing Company, 1998). Before long, it was more and more messages, faster and faster. Speed was everything. But it became harder and harder to pay attention.
You can draw a line from telegraph to telephone to TV to text messaging to Twitter. We are continually interrupted today. On average, both children and adults look away from a TV up to 150 times an hour. “Push and pull, back and forth, television is in essence an interruption machine, the most powerful attention slicer yet invented,” writes Maggie Jackson in Distracted: The Erosion of Attention and the Coming Dark Age (Prometheus, 2008).
Well, maybe. Text messaging and Twitter have to be right up there. The results, however, are indisputable. When the TV is on, children ages one to three exhibit the characteristics of attention deficit disorder. When adults continually tweet and text, they exhibit less ability to pay attention to important yet complex truths.
We now live in an age of what some call “inattentional blindness.” Researchers Arien Mack and Irvin Rock studied people who, while simulating the driving experience, talked through a microphone with someone in another room. In the test, most drivers failed to see a girl on the screen who darted out between cars—they exhibited “inattentional blindness.” When we are constantly inattentive, as when we’re continually texting and tweeting, we can miss the big picture. In fact, many “no longer accept the possibility of assembling a complete picture of reality,” writes literary critic Sven Birkerts. Now that’s a problem we cannot ignore.
Inattention is being observed everywhere. In one study (reported by James Gorman in The New York Times), participants watched a one-minute film in which six people, half dressed in black and half in white, passed a basketball around. The viewers were asked to count the number of times one team passed the ball. Almost half of them failed to see a woman in a gorilla suit who calmly walked through the group. Her scene lasted over nine seconds, and she even paused to beat her chest. In the same way, Wisdom stands before us every day and stretches out her hands. But we have to pay attention to acquire it, and that’s not easy: we now live in a world perfectly designed to make us inattentive.
Our present age is a cacophony of confused categorizations. The younger we are, the greater our confusion, according to Mark Bauerlein of Emory University in The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future (Tarcher/Penguin, 2009). Bright, hardworking college students increasingly demonstrate an inability to sort through the onslaught of information so easily accessed by web-based technologies. Bauerlein says only 16 percent of today’s students read the text on a web page line by line, word for word, and can pull together a coherent summary of what the author intended to say. The other 84 percent pick out only individual words and sentences, “processing them out of sequence,” Bauerlein concludes. They lack wisdom because they can no longer pay attention for sustained periods of time.
At the end of the day, this is not a diatribe against technology. I own an iPhone. I text message. I’m not a Luddite. But I’m not texting every waking moment. Yes, there are stories of good connections being made by the availability of communications technologies such as Twitter. But do the exceptions prove the rule?
Paul Goodman, in New Reformation, says technology was once considered a branch of moral philosophy, not of science. In the nineteenth century, science usurped philosophy’s role with the attitude that “if something can be done, it should be done.” Moral philosophy would counter that just because something can be done does not mean we ought to do it. Because the gospel is no longer considered by many to be a coherent definition of reality, we have no moral boundaries left for the wise use of technologies. Instead, people sleep with their phones left on—they don’t want to miss any breaking news. The sad part is, these people often miss the Big News.
The problem, as Postman pointed out, is that “once a technology is admitted, it plays out its hand; it does what it was designed to do.” And undo. Postman predicted a “thought-world that functions not only without a transcendent narrative to provide moral underpinnings but also without strong social institutions to control the flood of information produced by technology.” That’s worth considering as more and more people twitter their lives away—or at least twitter away wisdom.