
8 - The Starlight

from Part II - Clicks against Humanity

Published online by Cambridge University Press: 30 May 2018

James Williams, University of Oxford

Stand out of our Light: Freedom and Resistance in the Attention Economy, pp. 55-67
Publisher: Cambridge University Press
Print publication year: 2018
This content is Open Access and distributed under the terms of the Creative Commons Attribution licence CC BY-NC 4.0 (https://creativecommons.org/licenses/by-nc/4.0/).

[Donald Trump’s candidacy] may not be good for America, but it’s damn good for CBS.

Les Moonves (CBS Chairman/CEO), February 2016

Around the time I started feeling existentially compromised by the deep distractions collecting in my life, I developed a habit that quickly became annoying to everyone around me. It went like this: I’d hear someone use a phrase to describe me that had a certain ring to it, like it would make a good title for something – but its content was both specific and odd enough that if it were used as the title for a biography about my whole life, it would be utterly absurd. Whenever I’d hear a phrase like that, I’d repeat it with the gravitas of a movie-trailer announcer, and then follow it with the phrase: “The James Williams Story.”

Here’s an example. One day, after a long conversation with my wife, she said to me, “You’re, like, my receptacle of secrets.” To which I replied: “Receptacle of Secrets: The James Williams Story.” The joke being, of course, that choosing this one random, specific snapshot of my life to represent the narrative of my entire existence – an existence which has involved many achievements more notable than hearing and keeping the odd spousal secret – would be an absurd and arbitrary thing to do. I eventually came to understand (or perhaps rationalize) this habit as a playful, shorthand way of stabilizing what philosophers would call my “diachronic self,” or the self over time, over the increasingly rocky waves of my “synchronic self,” or the self at a given moment. I might have been overanalyzing it, but I interpreted this emergent habit as a way of pushing back against my immediate environment’s ability to define me. It was a way of saying, “I will not be so easily summarized!” It was a way of trying to hold onto my story by calling attention to what my story definitely was not.

We experience our identities as stories, according to a line of thought known as “narrative identity theory.”1 In his book Neuroethics, Neil Levy writes that both synchronic and diachronic unity are essential for helping us maintain the integrity of these stories: “We want to live a life that expresses our central values, and we want that life to make narrative sense: we want to be able to tell ourselves and others a story, which explains where we come from, how we got to where we are, and where we are going” (p. 201).

When we lose the story of our identities, whether on individual or collective levels, it undermines what we could call the “starlight” of our attention, or our ability to navigate “by the stars” of our higher values or “being goals.” When our “starlight” is obscured, it becomes harder to “be who we want to be.” We feel the self fragmenting and dividing, resulting in an existential sort of distraction. William James wrote that “our self-feeling in this world depends entirely on what we back ourselves to be and do.” When we become aware that our actual habits are in dissonance with our desired values, we often experience that dissonance as a challenge to, if not a loss of, our identities.

This obscured “starlight” was a deeper layer of the distractions I’d been feeling, and I felt that the attention-grabby techniques of technology design were playing a nontrivial role. I began to realize that my technologies were enabling habits in my life that led my actions over time to diverge from the identity and values by which I wanted to live. It wasn’t just that my life’s GPS was guiding me into the occasional wrong turn; rather, it had programmed in a new destination for me, in a far-off place that it did not behoove me to visit. It was a place that valued short-term over long-term rewards, simple over complex pleasures. It felt like I was back in my high-school calculus class, and all these new technologies were souped-up versions of Tetris. It wasn’t just that my tasks and goals were giving way to theirs – my values were as well.

One way I saw the “starlight” getting obscured in myself and others, in both the personal and political domains, was in the proliferation of pettiness. Pettiness means pursuing a low-level goal as though it were a higher, intrinsically valuable one. Low-level goals tend to be short-term goals; where this is so, pettiness may be viewed as a kind of imprudence. In The Theory of Moral Sentiments, Adam Smith calls prudence the virtue that’s “most useful to the individual.” For Smith, prudence involves the union of two things: (1) our capacity for “discerning the remote consequences of all our actions,” and (2) “self-command, by which we are enabled to abstain from present pleasure or to endure present pain, in order to obtain a greater pleasure or to avoid a greater pain in some future time.”

In my own life I saw this pettiness, this imprudence, manifesting in the way the social comparison dynamics of social media platforms had trained me to prioritize mere “likes” or “favorites,” or to get as many “friends” or “connections” as possible, over pursuing other more meaningful relational aims. These dynamics had made me more competitive for other people’s attention and affirmation than I ever remember being: I found myself spending more and more time trying to come up with clever things to say in my social posts, not because I felt they were things worth saying but because I had come to value these attentional signals for their own sake. Social interaction had become a numbers game for me, and I was focused on “winning” – even though I had no idea what winning looked like. I just knew that the more of these rewarding little social validations I got, the more of them I wanted. I was hooked.

The creators of these mechanisms didn’t necessarily intend to make me, or us, into petty people. The creator of the Facebook “like” button, for instance, initially intended for it to send “little bits of positivity” to people.2 If its design had been steered in the right way, perhaps it might have done so. However, soon enough the “like” function began to serve the data-collection and engagement-maximizing interests of advertisers. As a result, the metrics that made up the “score” of my social game – and I, as the player of that game – were directly serving the interests of the attention economy. In the pettiness of my day-to-day number-chasing, I had lost the higher view of who I really was, or why I wanted to communicate with all these people in the first place.

Pettiness is not exactly a rare phenomenon in the political domain. However, during the 2016 US presidential election I encountered a highly moralized variant of pettiness coming from people I would have never expected to see it in. Over the course of just a few months, I witnessed several acquaintances back in Texas – good, loving people, and deeply religious “values voters” – go from vocally rejecting one particular candidate as being morally reprehensible and utterly unacceptable, to ultimately setting aside those foundational moral commitments in the name of securing a short-term political win. By the time a video emerged of the candidate bragging about committing sexual assault, this petty overwriting of moral commitment with political expediency was so total as to render this staggering development barely shrug-worthy. By then, their posts on social media were saying things like, “I care more about what Hillary did than what Trump said!”

In the 2016 presidential election campaign, Donald Trump took the dominance of pettiness over prudence to new heights. Trump is very straightforwardly an embodiment of the dynamics of clickbait: he’s the logical product (though not endpoint) in the political domain of a petty media environment defined by impulsivity and zero-sum competition for our attention. One analyst has estimated that Trump is worth $2 billion to Twitter, which amounts to almost one-fifth of the company’s current value.3 His success metrics – number of rally attendees, number of retweets – are attention economy metrics. Given this, it’s remarkable how consistently societal discussion has completely misread him by casting him in informational, rather than attentional, terms. Like clickbait or so-called “fake news,” the design goal of Trump is not to inform but to induce. Content is incidental to effect.

At its extreme, this pettiness can manifest as narcissism: a preoccupation with being recognized by others, a valuing of attention for its own sake, and a prioritization of fame as a core value. A meta-analysis of fifty-seven studies found that social media use in particular is linked with increased narcissism.4 Another study found that young people are now getting more plastic surgery due to pressure from social media.5 And a study of children’s television shows in recent years found that, rather than pro-social community values, the main value now held up by those shows as most worth pursuing is fame.6 In his historical study of fame, The Frenzy of Renown, Leo Braudy writes that when we call someone “famous,” what we’re fundamentally saying is, “pay attention to this.” So it’s entirely to be expected that in an age of information abundance and attention scarcity we would see an increased reliance on fame as a heuristic for determining what and who matters (i.e. merits our attention), as well as an increased desire to achieve fame in one’s own lifetime (as opposed to a legacy across generations).7

Sometimes the desire for fame can have life-and-death consequences. Countless YouTube personalities walk on the edges of skyscrapers, chug whole bottles of liquor, and perform other dangerous stunts, all for the fame – and the advertising revenue – it might bring them. The results are sometimes tragic. In June 2017 a man concocted an attention-getting YouTube stunt in which he instructed his girlfriend, who was then pregnant with their second child, to shoot a handgun from point-blank range at a thick book he was holding in front of his chest. The bullet ripped through the book and struck and killed him. As the New York Times reported:

It was a preventable death, the sheriff said, apparently fostered by a culture in which money and some degree of stardom can be obtained by those who attract a loyal internet following with their antics.

In the couple’s last video, posted on Monday, Ms. Perez and her boyfriend considered what it would be like to be one of those stars – “when we have 300,000 subscribers.”

“The bigger we get, I’ll be throwing parties,” Mr. Ruiz said. “Why not?”8

Similarly, on the video-game live-streaming site Twitch, a 35-year-old man stayed awake to continue his streaming marathon for so long that he died.9 And in December 2017, Wu Yongning, a Chinese man known as a “rooftopper” – someone who dangles from skyscrapers without safety equipment in order to post and monetize the video online – fell to his death. As one user on the Chinese microblogging service Weibo reflected about the role, and responsibility, of the man’s approving audience members:

Watching him and praising him was akin to … buying a knife for someone who wanted to stab himself, or encouraging someone who wants to jump off a building. … Don’t click “like,” don’t click “follow.” This is the least we can do to try to save someone’s life.10

There’s nothing wrong with wanting attention from other people. Indeed, it’s only human. Receiving the attention of others is a necessary, and often quite meaningful, part of human life. In fact, Adam Smith argues in The Theory of Moral Sentiments that it’s the main reason we pursue wealth in the first place: “To be attended to, to be taken notice of with sympathy, complacency, and approbation,” he writes, “are all the advantages which we can propose to derive from it.” It’s this approval, this regard from others, he says, that leads people to pursue wealth – and when they attain it and then “expend it,” it’s that expenditure – what we might call the exchange of monetary wealth for attentional, or reputational, wealth – that Smith describes as being “led by an invisible hand.”11 So, on a certain reading, one could argue that all economies are ultimately economies of attention. However, this doesn’t mean that all attention is worth receiving, or that all ways of pursuing it are praiseworthy.

We can also see the obscuring of our starlight in the erosion of our sense of the nature and importance of our higher values. In Mike Judge’s film Idiocracy, a man awakes from cryogenic slumber in a distant future where everyone has become markedly stupider. At one point in the story he visits a shambolic Costco warehouse store, where a glazed-eyed front-door greeter welcomes him by mechanically droning, “Welcome to Costco. I love you.” This is an extreme example of the dilution of a higher value – in this case, love. In the design of digital technologies, persuasive goals often go by names that sound lofty and virtuous but have been similarly diluted: “relevance,” “engagement,” “smart,” and so on. Designing users’ lives toward diluted values leads to the dilution of their own values at both individual and collective levels.

Consider that across many liberal democracies the percentage of people who say it’s “essential” to live in a democracy has in recent years been in freefall. The “starlight” of democratic values seems to be dimming across diverse cultures, languages, and economic situations. However, one of the few factors these countries do have in common is their dominant form of media, which just happens to be the largest, most standardized, and most centralized form of attentional control in human history, and which also happens to distract from our “starlight” by design.

Similarly, in the last two decades the percentage of Americans who approve of military rule (saying it would be either “good” or “very good”) has doubled, according to the World Values Survey, to one in six people.12 The authors of a noted study on this topic point out that this percentage “has risen in most mature democracies, including Germany, Sweden, and the United Kingdom.” Crucially, they also note that this trend can’t be attributed to economic hardship. “Strikingly,” the authors write, “such undemocratic sentiments have risen especially quickly among the wealthy,” and even more so among the young and wealthy. Today, this approval of military rule “is held by 35 percent of rich young Americans.”13

On the part of political representatives, this value dilution manifests as the prioritization of metrics that look very much like attention economy metrics, as well as the placing of party over country. As Rousseau wrote in Political Economy, when a sense of duty is no longer present among political leaders, they simply focus on “fascinating the gaze of those whom they need” in order to stay in power.

Our information and communication technologies serve as mirrors for our identities, and these mirrors can show us either dignified or undignified reflections of ourselves. When we see a life in the mirror that appears to be diverging from the “stars” of freedom and self-authorship by which we want to live, our reaction not only involves the shock of indignity, but also quite often a defensive posture of “reactance.” Reactance refers to the idea “that individuals have certain freedoms with regard to their behavior. If these behavioral freedoms are reduced or threatened with reduction, the individual will be motivationally aroused to regain them.”14 In other words, when we feel our freedom being restricted, we tend to want to fight to get it back.

To take one example of an undignified reflection that prompts this sort of reactance, consider the “emotional contagion” experiment that Facebook and researchers at Cornell University published in 2014. The experiment used the Facebook News Feed to identify evidence of social contagion effects (i.e. the transference of emotional valence). Over a one-week period, the experiment reduced the number of either positive or negative posts that a sample of around 700,000 Facebook users saw in their News Feeds. The researchers found that when users saw fewer negative posts, their own posts contained a lower percentage of negative words; the same held for positive posts and positive words. While the effect sizes were very small, the results showed a clear persuasive effect on the emotional content of users’ posts.15
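To make the measurement concrete: the study’s outcome variable was simply the share of each user’s own posted words that appeared on standard positive and negative word lists (the published study used the LIWC lexicon). A minimal Python sketch of that calculation follows; the tiny word lists are hypothetical stand-ins for illustration, not the actual lexicon.

# Sketch of the contagion study's outcome measure: the percentage of a
# user's posted words counted as "positive" or "negative" by a fixed
# lexicon. These word lists are illustrative stand-ins only.
POSITIVE = {"happy", "great", "love", "wonderful"}
NEGATIVE = {"sad", "awful", "hate", "terrible"}

def emotion_word_percentages(posts):
    """Return (% positive words, % negative words) across a user's posts."""
    words = [w.strip(".,!?").lower() for post in posts for w in post.split()]
    if not words:
        return (0.0, 0.0)
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return (100 * pos / len(words), 100 * neg / len(words))

print(emotion_word_percentages(["What a wonderful day", "I hate traffic"]))

Comparing these percentages between users whose feeds had negative posts withheld and a control group is, in essence, the experiment’s analysis.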

In response, some raised questions about research ethics processes – but many objections were also about the mere fact that Facebook had manipulated its users at all. Clay Johnson, a co-founder of the political marketing firm Blue State Digital, wrote, “the Facebook ‘transmission of anger’ experiment is terrifying.”16 The Atlantic described the study as “Facebook’s Secret Mood Manipulation Experiment.”17 A member of the UK parliament called for an “investigation into how Facebook and other social networks manipulated emotional and psychological responses of users by editing information supplied to them.”18 And privacy activist Lauren Weinstein wrote on Twitter, “I wonder if Facebook KILLED anyone with their emotion manipulation stunt. At their scale and with depressed people out there, it’s possible.”19

We are manipulated by the design of our media all the time. This seems to me simply another way of describing what media is and does. Much, if not most, of the advertising research that occurs behind the closed doors of companies could be described as “secret mood manipulation experiments.” And the investigation the UK parliamentarian called for would effectively mean investigating the design of all digital media that shape our attention in any way whatsoever.

What was unfortunately missed in the outrage cascades about this experiment was the fact that Facebook was finally measuring whether a given design had a positive or negative effect on people’s emotions – something that they don’t appear to have been doing before this time. This is precisely the sort of knowledge that allows the public to say, “We know you can measure this now – so start using it for our benefit!” But that potential response was, as it is so often, ultimately scuppered by the dynamics of the attention economy itself.

If a person were to interpret Facebook’s alteration of their news feed as unacceptable manipulation, and object to the image – the “undignified reflection” – of themselves as someone who is not fully in control of their decisions about what they write in their own posts, then they would see their use of Facebook as incompatible with, and unsupportive of, the ultimate “being goal” they have for themselves. The sense of a precipitous sliding backward from that ultimate goal would, as discussed above, have the effect of undermining that person’s sense of self-integrity, and would thus reduce their sense of dignity.

Finally, when we start to lose the story of our shared identity, it has major implications for politics. We find it harder to keep in view the commonalities we have with others in our own society. We struggle to imagine them inhabiting the same space or demos as us, especially when we’re increasingly physically isolated from them. Division itself is not bad, of course: isolation is necessary for the development of individual views and opinions. Diversity requires division, of a sort. But the sort of division that removes the space in which the common interest and general will may be found is the sort that is extremely problematic.

This erosion of shared identity is often mischaracterized as political “polarization.” However, “polarization” suggests a rational disunity, mere disagreement about political positions or assumptions. In essence, a disunity of ideas. What we have before us, on the other hand, seems a profoundly irrational disunity – a disunity of identity – and indeed a “deep-self discordance” among the body politic. This can lead to collective akrasia, or weakness of will. As the philosopher Charles Taylor writes, “the danger is not actual despotic control but fragmentation – that is, a people increasingly less capable of forming a common purpose and carrying it out.”20 William James, in The Principles of Psychology, writes, “There is no more miserable human being than one in whom nothing is habitual but indecision.”21 Perhaps we could say the same of societies as well.

Rousseau argued that a collective decision can depart from the general will if people are “misled by particular interests … by the influence and persuasiveness of a few clever men.”22 This can, of course, happen via mere functional distraction, or inhibition of the “spotlight,” but Rousseau notes that this control more often happens by subdividing society into groups, which leads them to “abandon” their “membership” of the wider group. At extremes, groups may diverge so much from one another that their insularity becomes self-reinforcing. And when this division of identity becomes moralized in such a way that it leads to a deeper sort of tribalistic delegitimizing, it veers toward a certain kind of populism, which I will discuss in the next chapter.

Here at the level of the “starlight,” however, this division has primarily prompted lamentations about the problems of internet “echo chambers,”23 or self-reinforcing “bubbles of homophily.”24 Yet the echoic metaphor seems to me to miss something essential: while echoes do bounce back, the sound ultimately dissipates. A better metaphor might be amplifier feedback, that is, holding a live microphone up to a speaker to create an instant shrieking loop that will destroy your eardrums if you let it. When the content of that shrieking loop consists of our own identities, whether individually or as groups, the distorted reflection we see in the “mirror” of technology takes on the character of a funhouse mirror, giving us only an absurd parody of ourselves.

Considering the ways my “starlight” was being obscured helped me broaden the scope of “distraction” to include not just frustrations of doing, but also frustrations of being over time. This sort of distraction makes us start to lose the story, at both individual and collective levels. When that happens, we start to grasp for things that feel real, true, or authentic in order to get the story back. We try to reorient our living toward the values and higher goals we want to pursue.

But here, at least, we still know when we’re not living by our chosen stars – we can still in principle detect the errors and correct them. It seemed like there was one deeper level of “distraction” to contend with: the sort of distraction that would threaten our ability to know and define what our goals and values are in the first place.
