
2 - The Faulty GPS

from I - Distraction by Design

Published online by Cambridge University Press: 30 May 2018

James Williams
Affiliation: University of Oxford

In: Stand out of our Light: Freedom and Resistance in the Attention Economy, pp. 7–11
Publisher: Cambridge University Press
Print publication year: 2018
This content is Open Access and distributed under the terms of the Creative Commons Attribution licence CC-BY-NC 4.0 https://creativecommons.org/cclicenses/

Five years ago I was working for Google and advancing a mission that I still admire for its audacity of scope: “to organize the world’s information and make it universally accessible and useful.”1 But one day I had an epiphany: there was more technology in my life than ever before, but it felt harder than ever to do the things I wanted to do.

I felt … distracted. But it was more than just “distraction” – this was some new mode of deep distraction I didn’t have words for. Something was shifting on a level deeper than mere annoyance, and its disruptive effects felt far more perilous than the usual surface-level static we expect from day-to-day life. It felt like something disintegrating, decohering: as though the floor was crumbling under my feet, and my body was just beginning to realize it was falling. I felt the story of my life being compromised in some fuzzy way I couldn’t articulate. The matter of my world seemed to be sublimating into thin air. Does that even make sense? It didn’t at the time.

Whatever it was, this deep distraction seemed to have the exact opposite effect of the one technology is supposed to have on our lives. More and more, I found myself asking the question, “What was all this technology supposed to be doing for me?”

Think for a moment about the goals you have for yourself: your goals for reading this book, for later today, for this week, even for later this year and beyond. If you’re like most people, they’re probably goals like “learn how to play piano,” “spend more time with family,” “plan that trip I’ve been meaning to take,” and so on. These are real goals, human goals. They’re the kinds of goals that, when we’re on our deathbeds, we’ll regret not having achieved. If technology is for anything, it’s for helping us pursue these kinds of goals.

A few years ago I read an article called “Regrets of the Dying.”2 It was about a businesswoman whose disillusionment with the day-to-day slog of her trade had led her to leave it, and to start working in a very different place: in rooms where people were dying. She spent her days attending to their needs and listening to their regrets, and she recorded the most common things they wished they’d done, or hadn’t done, in life: they’d worked too hard, they hadn’t told people how they felt, they hadn’t let themselves be happy, and so on. This, it seems to me, is the proper perspective – the one that’s truly our own, if any really is. It’s the perspective that our screens and machines ought to help us circle back on, again and again: because whatever we might choose to want, nobody chooses to want to regret.

Think back on your goals from a moment ago. Now try to imagine what your technologies’ goals are for you. What do you think they are? I don’t mean the companies’ mission statements and high-flying marketing messages – I mean the goals on the dashboards in their product design meetings, the metrics they’re using to define what success means for your life. How likely do you think it is that they reflect the goals you have for yourself?

Not very likely, sorry to say. Instead of your goals, success from their perspective is usually defined in the form of low-level “engagement” goals, as they’re often called. These include things like maximizing the amount of time you spend with their product, keeping you clicking or tapping or scrolling as much as possible, or showing you as many pages or ads as they can. A peculiar quirk of the technology industry is its ability to drain words of their deeper meanings; “engagement” is one such word. (Incidentally, it’s fitting that this term can also refer to clashes between armies: here, the “engagement” is fundamentally adversarial as well.)

But these “engagement” goals are petty, subhuman goals. No person has these goals for themselves. No one wakes up in the morning and asks, “How much time can I possibly spend using social media today?” (If there is someone like that, I’d love to meet them and understand their mind.)

What this means, though, is that there’s a deep misalignment between the goals we have for ourselves and the goals our technologies have for us. This seems to me to be a really big deal, and one that nobody talks about nearly enough. We trust these technologies to be companion systems for our lives: we trust them to help us do the things we want to do, to become the people we want to be.

In a sense, our information technologies ought to be GPSes for our lives. (Sure, there are times when we don’t know exactly where we want to go in life. But in those cases, technology’s job is to help us figure out what our destination is, and to do so in the way we want to figure it out.) But imagine if your actual GPS was adversarial against you in this way. Imagine that you’ve just purchased a new one, installed it in your car, and on the first use it guides you efficiently to the right place. On the second trip, however, it takes you to an address several streets away from your intended destination. It’s probably just a random glitch, you think, or maybe it needs a map update. So you give it little thought. But on the third trip, you’re shocked when you find yourself miles away from your desired endpoint, which is now on the opposite side of town. These errors continue to mount, and they frustrate you so much that you give up and decide to return home. But then, when you enter your home address, the system gives you a route that would have you drive for hours and end up in a totally different city.

Any reasonable person would consider this GPS faulty and return it to the store, if not chuck it out their car window. Who would continue to put up with a device they knew would take them somewhere other than where they wanted to go? What reasons could anyone possibly have for continuing to tolerate such a thing?

No one would put up with this sort of distraction from a technology that directs them through physical space. Yet we do precisely this, on a daily basis, when it comes to the technologies that direct us through informational space. We have a curiously high tolerance for poor navigability when it comes to the GPSes for our lives – the information and communication systems that now direct so much of our thought and action.

When I looked around the technology industry, I began to see with new eyes the dashboards, the metrics, and the goals that were driving much of its design. These were the destinations we were entering into the GPSes guiding the lives of millions of human beings. I tried imagining my life reflected in the primary color numbers incrementing on screens around me: Number of Views, Time on Site, Number of Clicks, Total Conversions. Suddenly, these goals seemed petty and perverse. They were not my goals – or anyone else’s.

I soon came to understand that the cause in which I’d been conscripted wasn’t the organization of information at all, but of attention. The technology industry wasn’t designing products; it was designing users. These magical, general-purpose systems weren’t neutral “tools”; they were purpose-driven navigation systems guiding the lives of flesh-and-blood humans. They were extensions of our attention. The Canadian media theorist Harold Innis once said that his entire career’s work proceeded from the question, “Why do we attend to the things to which we attend?”3 I realized that I’d been woefully negligent in asking this question about my own attention.

But I also knew this wasn’t just about me – my deep distractions, my frustrated goals. Because when most people in society use your product, you aren’t just designing users; you’re designing society. But if all of society were to become as distracted in this new, deep way as I was starting to feel, what would that mean? What would be the implications for our shared interests, our common purposes, our collective identities, our politics?

In 1985 the educator and media critic Neil Postman wrote Amusing Ourselves to Death, a book that’s become more relevant and prescient with each passing day.4 In its foreword, Postman recalls Aldous Huxley’s observation from Brave New World Revisited that the defenders of freedom in his time had “failed to take into account … man’s almost infinite appetite for distractions.”5 Postman contrasts the indirect, persuasive threats to human freedom that Huxley warns about in Brave New World with the direct, coercive sort of threats on which George Orwell focuses in Nineteen Eighty-Four. Huxley’s foresight, Postman writes, lay in his prediction that freedom’s nastiest adversaries in the years to come would emerge not from the things we fear, but from the things that give us pleasure: it’s not the prospect of a “boot stamping on a human face – forever” that should keep us up at night, but rather the specter of a situation in which “people will come to love their oppression, to adore the technologies that undo their capacities to think.”6 A thumb scrolling through an infinite feed, forever.

I wondered whether, in the design of digital technologies, we’d made the same mistake as Huxley’s contemporaries: I wondered whether we’d failed to take into account our “almost infinite appetite for distractions.” I didn’t know the answer, but I felt the question required urgent, focused attention.
