If we know we’re trapped against our will, we get angry, like a wild raccoon hissing from a trap. It’s for this reason that zookeepers have become adept at designing cages that don’t look like cages at all. These enclosures or habitats keep the animals in, the people out — everyone’s happy. They’ve learned that the most effective cages are the ones you don’t see.
In The Glass Cage, Nicholas Carr worries that we’ve built our own invisible enclosures with the various forms of machine automation that permeate our lives. As machines do more for us and gain more power to shape our lives, we will continue to see deep changes in human work, culture, and life. And without sufficient critical engagement, we face the prospect of decidedly unwelcome consequences. Recall the unwitting star of The Truman Show, who eventually sailed his boat into the wall of the massive set to confirm his growing suspicion that his entire world was the invisible cage of a reality television show. In a similar fashion, Carr’s book is a sailboat bumping into the edges of our own digital confinement.
What makes Carr’s argument so interesting is that it is neither apocalyptic nor alarmist. The glowing red eyes of Cylons don’t figure in the argument; the machine invasion that interests Carr is much more subtle.
We have been building automation into our mechanical systems for a long time. Think, for example, of autopilot for airplanes, spell-check for word processors, and computer-designed buildings. If we take a moment to reflect, we all recognize these as attempts to aid humans in their work. But with use and the passage of time, these processes become less and less visible to us.
They’re still there, but we don’t notice them. Human—not just digital—flourishing requires that we take a moment to critically consider the consequences of turning over more decision making to the algorithms that drive our machines.
Machines Aren’t Tame
I have always been fascinated with machines and showed early signs of an affinity for understanding them. In my preteen years I thought about little other than airplanes and motorcycles. I remember taking the motor out of our Yamaha Trimoto 125, splitting the cases to remove the crankshaft, replacing a bearing, and then managing to put it all back together successfully. I love the elegance, precision, and adventure that machines represent. But as I worked with machines, especially the “automatic horses” that we used to farm 1,200 acres of land, I also learned how strong, dangerous, and disruptive these clever inventions could be.
Now automation is as present in the skies over the prairies as it is on the wheat fields below. Carr’s case study of automation in the aviation industry shows just how deep our dependence runs. In 1947 an American C-54 Skymaster military transport was guided into the air by Captain Thomas J. Wells and then flew across the Atlantic and touched down without Wells’ assistance. Today even small recreational aircraft have far more sophisticated avionics and flight controls than that experimental C-54, and the result is markedly safer air travel and far less dependence on the skill of human aviators.
Between 1962 and 1971 there were 133 deaths per million passenger flights; between 2002 and 2011, just two. This is an astounding improvement in air travel safety.
This works well almost all of the time, but, as Carr points out, it’s not foolproof. The 2009 crashes of a Q400 passenger flight between Newark and Buffalo and of an A330 flight between Rio de Janeiro and Paris were the result of grave errors made by pilots who responded improperly to cues from their autopilot programs. The result in each case was the total loss of the aircraft and everyone on board. In both cases, the automated systems did what they were designed to do: disengage when icing became a problem. But the response of the pilots led to disastrous results. The point for critical reflection is how the role of the human actor changes as a result of automation.
The problem is not that automation equals misery or disaster. It doesn’t. The problem is that we have become less adept at recognizing how machine automation is changing our work, our relationships, and what we accept as a full life. The more our machines have the power to shape us, the less we seem to reflect on their effects. So Carr is calling for more deliberation in our design.
Designing For Agency
From the planning of a building to the arrangement of furniture in a living room, design has a wide range of meanings. At the heart of design is the idea of human agency in action. If design can be defined in this manner, then automation is a means of extending that agency to machines. I need to wake up earlier than I want to, so I impart some of my agency to a little machine that keeps track of time and can automatically make noise just when I need to wake up. I agree to the exchange of agency because the machine is designed to be better than I am at watching the time and being unthinkingly loud and rude in the morning. Machines represent obsession in the extreme. And that often turns out to be useful.
When we give a machine some of our work, we gain speed, efficiency, safety, and consistency. But as Carr provocatively points out, we lose human skills, capacities, and some unseen cultural benefits of work that are integral to our wellbeing. Take spelling, for example. Research by Dennis Galletta, a University of Pittsburgh business professor, shows that using spellcheck erodes your spelling and grammar skills, whether you are a skilled or unskilled speller and grammarian. Galletta’s work has shown that automatic spelling programs cause us to second-guess our own decisions, to make changes that are incorrect, and simply to miss glaring mistakes.
Automatic spelling and grammar checking is a visible automation: you see annoying little red lines when you spell “autmatic” instead of “automatic.” But it’s more troubling when machines make decisions without our knowledge. In many cases we might not even be aware of what we’re losing. Tobacco companies were forced to put warning labels on their cigarette packages because the effects of smoking are invisible for a long time, but when the consequences do become visible they are severe. Public interest dictated that the gap between use and consequence be made more visible, that our decision to smoke today be linked to the outcome of that decision decades later. Does automation require a similar kind of label? Do people care whether they are the agent or the object?
A Hierarchy Of Automation
It’s important to note that automation is not an either/or equation. Rather, it represents a spectrum of possibilities. Below is a scale, adapted from automation expert Raja Parasuraman, showing how automation changes the degree of human and machine decision making.
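Parasuraman and his colleagues Thomas Sheridan and Christopher Wickens describe ten levels; paraphrased loosely, the scale runs like this:

1. (LOW) The computer offers no assistance; the human makes every decision and takes every action.
2. The computer offers a complete set of decision or action alternatives.
3. The computer narrows the alternatives down to a few.
4. The computer suggests a single alternative.
5. The computer executes that suggestion if the human approves.
6. The computer gives the human a limited time to veto before it acts on its own.
7. The computer acts on its own, then necessarily informs the human.
8. The computer acts on its own and informs the human only if asked.
9. The computer acts on its own and informs the human only if it decides to.
10. (HIGH) The computer decides everything and acts autonomously, ignoring the human.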
The LOW side is where human civilizations have primarily operated. As the sophistication of our machines increases, movement up the spectrum seems inevitable. I wonder if we might benefit from applying such a spectrum across the various dimensions of our human experience, identifying what we want and do not want, and what it costs us to move up or down that scale.
And there are real costs. This is the hazard of automation: decisions are made and processes enacted that we may not see, and that carry unintended negative consequences. And there are often factors that automated machines, by design, simply cannot take into consideration. “What is lacking cannot be counted,” as Ecclesiastes reminds us.
Automation is not primarily about function but about ecology. As Neil Postman and Ivan Illich note, adding a new technology to an existing process doesn’t give you the same process with a bit more technology. Instead, it initiates a new process, a new ecology. The “important” film Chicken Run offers an illustration. When Mrs. Tweedy buys a new chicken pie machine she doesn’t have her former egg-laying operation plus a chicken pie machine; she has a whole new operation. Her relationship to the chickens changes: they are no longer sustained as ongoing egg layers, but become the grist for a chicken pie machine that needs a steady supply of new raw material. The slow-of-mind Mr. Tweedy senses this, recalling the generations of egg farmers from which he and his wife came, but he can’t stop it.
Or take library services here in Hamilton. I can take out and return dozens of books over months and years and never need to talk to a librarian. A robot takes my books, scans them, and then moves them downstairs on a conveyor belt. When I check out books, the RFID tag links each book to my account and notes it as checked out. It’s fast and easy. Machines also generate lists of books for me based on algorithms that associate one item with another, devoid of anything but machine context. Like pilots who used to be aviators, librarians who used to be stewards of books are now managers of complex robotic systems. Are we aware of the implications of this?
It’s at this point that the various paragons of anti-technology can be of service to us. Both the Amish and the Luddites are held up as stereotypes of opposition to technology, though the characterization is untrue in both cases. Neither is opposed to machines, technology, or automation per se. The Amish are concerned about how the presence of machines changes who we are and shifts the ecology of human thriving. For the Luddites, the core of the issue was not technology but how that technology further eroded their power and transferred it to the owners of that technology. Likewise with the Amish: it isn’t about technology (they make extensive use of machines of all kinds, from hoes to wagons to thickness planers) but about the potential disruption to their society and culture that every technological addition represents. If an addition isn’t fully understood, considered, and sifted, it is treated with suspicion. As observers like Michel Serres have argued for decades, new technology leads to a different community, not just the same community plus the technology.
The Amish assume that the community is what we need and that technology is optional. We assume that we need more technology and that community is optional. We accept the automation of our lives with little time devoted to considering the effects of that automation or the trade-offs at play. Our institutions, social bonds, habits of mind, relationship to work, craftsmanship, art, religion, and education are all changed through the transfers (small and large) of decision making from humans to machines. Carr notes that we are learning that automation decreases our adaptability and introduces unintended consequences. Clearly this isn’t a question of saying yes or no to the technology represented by machine automation of one kind or another; making creative use of ideas, objects, and materials is integral to being human. The question is how we might see more clearly and understand more fully the points at which handing off decisions to machines undermines rather than promotes our humanity.
What we must cultivate is a habit of calling the automation of various parts of our lives what it is: a glass cage that is built and willingly inhabited by the first generation and then accepted without question in the next. We should ask what we gain and lose with each pane of glass added to our cage. If we engaged in such critical discussions more frequently and deeply, we would be joining with the spirit of the Wright brothers who, Carr points out, believed that the pilot did matter, that the most elegant and complete experience of flying occurred when pilot and aircraft worked in concert. Flow in our work, or in activities like driving or riding a bike, begins when our agency forms a harmonious relationship with a machine. Robust discernment can help us know when that balance is getting out of order.
Tapping On The Glass
Let me suggest a tangible course of action to improve our awareness. A fisherman can learn how to catch more fish by gutting those he has caught. By opening up the fish, he learns what the fish eats, and can adapt his bait and habits accordingly. I recommend a similar approach to our relationship with technology.
Here are three things that you might do to engage in discernment by dissection.
First, buy a tiny computer like an Arduino, which is built on open-source hardware and software ($30). Go online and learn about it, hook it up to your laptop, and load some software on it to make a light blink or a light sensor buzz. When you see the very elemental aspects of this kind of physical computing, you will appreciate the intricacy and see the limitations of computing. You’ll be a small-scale robot farmer of your own. Growing your own carrots in the backyard won’t make you a self-sufficient farmer any more than writing code for an Arduino will liberate you digitally, but in both cases you’ll become attuned to some of the important questions about farming and computation that directly touch your life.
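If you’ve never seen such a program, here is the classic “blink” sketch, the hello-world of physical computing; a near-identical version ships with the Arduino IDE’s built-in examples. It assumes only a standard board, whose onboard LED is addressed by the constant LED_BUILTIN.

```cpp
// Blink: the canonical first Arduino sketch.
// LED_BUILTIN names the small LED soldered onto most Arduino boards.

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);    // claim the LED pin as an output
}

void loop() {                      // runs over and over, forever
  digitalWrite(LED_BUILTIN, HIGH); // LED on
  delay(1000);                     // wait one second
  digitalWrite(LED_BUILTIN, LOW);  // LED off
  delay(1000);                     // wait one second
}
```

A dozen lines turn the board into a machine with exactly one obsession, which is the point: seeing how little intelligence lives at the bottom of the stack makes the automated layers above it easier to question.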
Second, disassemble the devices that form the matrix of digital interaction around you. You can learn about how they are made, how they are recycled (or not), and how the instructions we give them animate their functions. This can be a significant help in asserting our own agency and demystifying their power.
Finally, take up the habit of watching for automation. It’s all around you. Many things have automated aspects: your house’s thermostat, your car, even the boilerplate report cards that teachers use. Observation helps us weigh the merits and malignancies of the machines that live among us. Ask yourself: how much of my function is affected when a machine is impaired? When you lose your phone, is the disruption major? If you have no internet access at work for an hour or two, do you easily shift to other modes of working, or does paralysis set in?
When we surrender without notice the agency that makes us human in exchange for the speed, efficiency, or comfort afforded by machines, we barter away opportunities for soul satisfaction in our work, the chance to develop deep skills, and the profoundly important sociability of our work. It may well be that agency, like skill, declines with disuse. Learning how to detect when automation is costing us more than it is worth must become a deeper habit for us as individuals and as members of the many cultural spaces we share. If indeed our cage is glass, then there is hope that the stone in our hands may yet be put to good use.