There are awards for the year’s best films but not for its best TikTok videos. That’s too bad, since 2024 yielded several tiny masterpieces. From @yojairyjaimee, a flawless, minute-long re-creation of some bizarre 2009 stage patter by Kanye West (who now goes by Ye). From @accountwashackedwith50m, twelve seconds of chocolate-covered strawberries, filmed from the vantage of a saxophonist in an R. & B. band. From @notkenna, seven seconds of a dog made to look, with preposterously low-budget effects, as if it were flying on a broomstick. Such Internet gems are what the poet Patricia Lockwood has called “the sapphires of the instant”; each catches the light in a strange, hypnotic way.
Just don’t stare too long. If every video is a starburst of expression, an extended TikTok session is fireworks in your face for hours. That can’t be healthy, can it? In 2010, the technology writer Nicholas Carr presciently raised this concern in “The Shallows: What the Internet Is Doing to Our Brains,” a Pulitzer Prize finalist. “What the Net seems to be doing,” Carr wrote, “is chipping away my capacity for concentration and contemplation.” He recounted his increased difficulty reading longer works. He wrote of a highly accomplished philosophy student—indeed, a Rhodes Scholar—who didn’t read books at all but gleaned what he could from Google. That student, Carr ominously asserted, “seems more the rule than the exception.”
Carr set off an avalanche. Much-read works about our ruined attention include Nir Eyal’s “Indistractable,” Johann Hari’s “Stolen Focus,” Cal Newport’s “Deep Work,” and Jenny Odell’s “How to Do Nothing.” Carr himself has a new book, “Superbloom,” about not only distraction but all the psychological harms of the Internet. We’ve suffered a “fragmentation of consciousness,” Carr writes, our world having been “rendered incomprehensible by information.”
Read one of these books and you’re unnerved. But read two more and the skeptical imp within you awakens. Haven’t critics freaked out about the brain-scrambling power of everything from pianofortes to brightly colored posters? Isn’t there, in fact, a long section in Plato’s “Phaedrus” in which Socrates argues that writing will wreck people’s memories?
I’m particularly fond of a hand-wringing essay by Nathaniel Hawthorne, from 1843. Hawthorne warns of the arrival of a technology so powerful that those born after it will lose the capacity for mature conversation. They will seek separate corners rather than common spaces, he prophesies. Their discussions will devolve into acrid debates, and “all mortal intercourse” will be “chilled with a fatal frost.” Hawthorne’s worry? The replacement of the open fireplace by the iron stove.
It’s true that we’ve raised alarms over things that in retrospect seem mild, the Carr-hort responds, but how much solace should we take in that? Today’s digital forms are obviously more addictive than their predecessors. You can even read previous grumbling as a measure of how bad things have become. Perhaps critics were correct to see danger in, say, television. If it now appears benign, that just shows how much worse current media is.
It’s been fifteen years since Carr’s “The Shallows.” Now we have what is perhaps the most sophisticated contribution to the genre, “The Sirens’ Call,” by Chris Hayes, an MSNBC anchor. Hayes acknowledges the long history of such panics. Some seem laughable in hindsight, he concedes, like one in the nineteen-fifties about comic books. Yet others seem prophetic, like the early warnings about smoking. “Is the development of a global, ubiquitous, chronically connected social media world more like comic books or cigarettes?” Hayes asks.
Great question. If we take the skeptics seriously, how much of the catastrophist’s argument stands? Enough, Hayes feels, that we should be gravely concerned. “We have a country full of megaphones, a crushing wall of sound, the swirling lights of a 24/7 casino blinking at us, all part of a system minutely engineered to take our attention away from us for profit,” he writes. Thinking clearly and conversing reasonably under these conditions is “like trying to meditate in a strip club.” The case he makes is thoughtful, informed, and disquieting. But is it convincing?
History is littered with lamentations about distraction. Swirling lights and strippers are not a new problem. What’s important to note about bygone debates on the subject, though, is that they truly were debates. Not everyone felt the sky was falling, and the dissenters raised pertinent questions. Is it, in fact, good to pay attention? Whose purposes does it serve?
Such questions came up in the eighteenth century with the rise of a disruptive new commodity: the novel. Although today’s critics rue our inability to get through long novels, such books were once widely regarded as the intellectual equivalent of junk food. “They fix attention so deeply, and afford so lively a pleasure, that the mind, once accustomed to them, cannot submit to the painful task of serious study,” the Anglican priest Vicesimus Knox complained. Thomas Jefferson warned that once readers fell under the spell of novels—“this mass of trash”—they would lose patience for “wholsome reading.” They’d suffer from “bloated imagination, sickly judgement, and disgust toward all the real business of life.”
Popular writers took a different view, as the English professor Natalie M. Phillips explains in her book “Distraction.” They wondered if unstraying attention was healthy. Maybe the mind required a little leaping around to do its work. “The Rambler” (1750-52) and “The Idler” (1758-60), two essay series by Samuel Johnson, exulted in such mental wandering. Johnson was constantly picking up books and just as constantly putting them down. When a friend asked whether Johnson had actually finished a book he claimed to have “looked into,” he replied, “No, Sir, do you read books through?”
As the mascot of multifocality, Phillips presents Tristram Shandy, the hero of Laurence Sterne’s “The Life and Opinions of Tristram Shandy, Gentleman,” published between 1759 and 1767. The novel starts with Tristram’s conception. His mother’s sudden interjection—“Pray, my dear, have you not forgot to wind up the clock?”—at the moment of his father’s sexual climax leaves Tristram congenitally scatterbrained. Even his name is the product of broken attention. It was supposed to be Trismegistus, but the maid tasked with telling the curate got distracted and forgot all but the first syllable. Tristram relates this tale of woe in a tangle of digressions, punctuated with breathless dashes.
In nine distracted volumes, Tristram never manages to narrate his life. Yet readers found his rollicking thoughts captivating. Perhaps they also found them liberating, Phillips suggests, given the tendency of traditional authorities to demand unwavering focus. “What is requisite for joining in prayer in a right manner?” a widely used Anglican catechism asked. “Close attention without wandering.”
Samuel Johnson’s dictionary noted that “to attend” had multiple meanings. The first, to focus on, was related to the second—to wait on, as a servant. A recent history of attention in the nineteenth-century United States, Caleb Smith’s “Thoreau’s Axe,” draws out this point clearly. Across centuries, thinkers have sought to fend off distraction. But the loudest calls to attention have been directed toward subordinates, schoolchildren, and women. “Atten-TION!” military commanders shout at their men to get them to stand straight. The arts of attention are a form of self-discipline, but they’re also ways to discipline others.
By the nineteenth century, some had grown wary of the intense forms of concentration that industrial life demanded. The psychiatrist Jean-Étienne Dominique Esquirol introduced a new diagnosis, “monomania,” which was doled out as faddishly as A.D.H.D. is today. Esquirol felt it to be the characteristic disorder of modernity. Herman Melville made it central to “Moby-Dick,” in which Captain Ahab’s fixation on a white whale brings ruin. Hypnosis, an intense form of focus, became an object of widespread concern.
It was Paul Lafargue, Karl Marx’s Cuban-born son-in-law, who rolled this trepidation about attention into a political program. (His essays have been reissued recently by New York Review Books.) Focussing on one’s work and suppressing one’s natural instincts, Lafargue argued, in the eighteen-eighties, was no virtue. It was, rather, to “play the part of the machine” on behalf of one’s own oppressors. Revolutionary consciousness meant asserting “the right to be lazy,” Lafargue insisted. Workers of the world, relax.
One daydreams of a Lafarguean resistance, in which the youth are recruited with samizdat copies of “Tristram Shandy.” But would they read it? I assign my college students about half of what I was assigned as an undergraduate twenty-odd years ago, and many professors have felt the need for similar scaling back. “I have been teaching in small liberal arts colleges for over 15 years now, and in the past five years, it’s as though someone flipped a switch,” the theologian Adam Kotsko writes. “Students are intimidated by anything over 10 pages and seem to walk away from readings of as little as 20 pages with no real understanding.”
Whatever thoughts past writers have had about the virtues of attention, pessimists would argue that the problem is different now. It’s as if we’re not reading books so much as the books are reading us. TikTok is particularly adept at this; you just scroll and the app learns—from your behavior, plus perhaps other information harvested from your phone—about what will keep you hooked. “I wake up in cold sweats every so often thinking, What did we bring to the world?” Tony Fadell, a co-developer of the iPhone, has said.
As a baseline, Chris Hayes points to Abraham Lincoln’s debates with Stephen A. Douglas, in the eighteen-fifties: three-hour exchanges of orations about a momentous topic, slavery. He marvels at how complex and layered the speeches were, stuffed with “parenthetical and nested clauses, with ideas that are previewed at the beginning of a sentence, left for a bit, and then returned to later.” He imagines what “sheer stamina of focus” Lincoln and Douglas’s audiences must have possessed.
Those audiences were large. Would voters flock to something similar today? Not likely, Hayes says. Information now comes in “ever-shorter little bites,” and “focus is harder and harder to sustain.” Hayes has seen this firsthand. His illuminating backstage account of cable news describes thoughtful journalists debasing themselves in their scramble to retain straying viewers. Garish graphics, loud voices, quick topic changes, and titillating stories—it’s like jangling keys to lure a dog. The more viewers get their news from apps, the harder television producers have to shake those keys.
This situation is, in some sense, our fault, as the whole system runs on our own choices. But those choices don’t always feel free. Hayes distinguishes between voluntary and compelled attention. Some things we focus on by choice; others, because of our psychological hardwiring, we find hard to ignore. Digital tools let online platforms harness the latter, addressing our involuntary impulses rather than our higher-order desires. The algorithms deliver what we want but not, as the late philosopher Harry Frankfurt put it, “what we want to want.”
Getting what we want, not what we want to want: it could be the slogan of our times. Hayes notes that it’s not only corporations that home in on our baser instincts. Since social-media users also have access to immediate feedback, they learn what draws eyeballs, too. Years ago, Donald Trump, Elon Musk, and Kanye West had hardly anything in common. Now their pursuit of publicity has morphed them into versions of the same persona—the attention troll. And, despite ourselves, we can’t look away.
The painful twist is that climate change, the thing we really ought to focus on, “evades our attentional faculties,” Hayes writes. “It’s always been a problem,” the writer and activist Bill McKibben told him, “that the most dangerous thing on the planet is invisible, odorless, tasteless, and doesn’t actually do anything to you directly.” Global warming is the opposite of Kanye West: we want to pay attention but we don’t.
The trouble is “attention capitalism,” Hayes argues, and it has the same dehumanizing effect on consumers’ psyches as industrial capitalism has on workers’ bodies. Successful attention capitalists don’t hold our attention with compelling material but, instead, snatch it over and over with slot-machine gimmicks. They treat us as eyeballs rather than individuals, “cracking into our minds” and leaving us twitching. “Our dominion over our own minds has been punctured,” Hayes writes. “The scale of transformation we’re experiencing is far more vast and more intimate than even the most panicked critics have understood.”
What’s awkward about this whole debate is that, though we speak freely of “attention spans,” they are not the sort of thing that psychologists can measure, independent of context, across time. And studies of the ostensible harm that carrying smartphones does to cognitive abilities have been contradictory and inconclusive. A.D.H.D. diagnoses abound, but is that because the condition is growing more prevalent or the diagnosis is? U.S. labor productivity and the percentage of the population with four years or more of college have risen throughout the Internet era.
The apparent decline of reading is also not so straightforward. Print book sales are holding steady, and audiobook sales are rising. The National Center for Education Statistics has tracked a recent drop in U.S. children’s reading abilities, yet that mostly coincides with the pandemic, and scores are still as good as or better than when the center started measuring, in 1971. If reading assignments at top colleges are shorter, it might be because today’s hypercompetitive students are busier, rather than because they’re less capable (and how many were actually doing all the reading in the old days?). What about Nicholas Carr’s insistence in 2010 that a Rhodes Scholar who didn’t read books heralded a post-literate future? “Of course I read books!” that Rhodes Scholar protested to another writer. Today, he holds a Ph.D. from Oxford and has written two books of his own.
After decades of the Internet, the mediascape has still not dissolved into a froth of three-second clips of orgasms, kittens, and trampoline accidents, interspersed with sports-betting ads. As the legal scholar Tim Wu argues in “The Attention Merchants,” the road to distraction is not one-way. Yes, businesses seize our attention using the shiniest lures available, but people become inured and learn to ignore them. Or they recoil, which might explain why meditation, bird-watching, and vinyl records are in vogue. Technology firms, in fact, often attract users by promising to reduce distractions, not only the daily hassles—paying bills, arranging travel—but the online onslaught, too. Google’s text ads and mail filters offered respite from the early Internet’s spam and pop-ups. Apple became one of the world’s largest companies by selling simplicity.
Besides, distraction is relative: to be distracted from one thing is to attend to another. And any argument that people are becoming distracted must deal with the plain fact that many spend hours staring intently at their screens. What is doomscrolling if not avid reading? If people are failing to focus in some places, they’re clearly succeeding in others.
One place they’re succeeding is cinema, which is in a baroque phase. A leading Golden Globe winner this year, “The Brutalist,” exceeds three and a half hours. The average length of a Top Ten grossing film grew by more than twenty minutes between 1993 and 2023. Hollywood’s reliance on sequels and recycled intellectual property—we’re a hair’s breadth from a crossover in which Thor fights the Little Mermaid—may have been terrible for cinema. It has, however, made for complicated movies tightly packed with backstory and fan service.
The same goes for narrative television. It was once entertainment for the inattentive, with simple plots, broad jokes, and a tropical bird interrupting to shout about Froot Loops. Yet that changed with cable, DVDs, and streaming shows (the first hit streaming series, Netflix’s “House of Cards,” débuted in 2013). As writers stopped worrying about viewers losing the thread, their shows started resembling ultra-long films. Viewers responded by binge-watching, taking in hours of material in what Vince Gilligan, who created “Breaking Bad,” has called “a giant inhalation.”
Or consider video games, which have grown mercilessly long. Years ago, in these pages, Alex Ross described Richard Wagner’s “Ring of the Nibelung,” a cycle of four operas spanning about fifteen hours, as, “arguably, the most ambitious work of art ever attempted” and “unlikely to have future rivals.” In 2023, Larian Studios swept the video-game awards with Baldur’s Gate 3, a noticeably Wagnerian affair with rival gods, magic rings, enchanted swords, and dragons. Two hundred and forty-eight actors and some four hundred developers worked on it. Playing through Baldur’s Gate 3, an unhurried, turn-based game with complex rules, can easily take seventy-five hours, or five “Ring” cycles (and more than twice that if you’re a completist). All the same, it has sold some fifteen million copies.
Even the supposedly attention-pulverizing TikTok deserves another look. Hayes, who works in TV, treats TikTok wholly as something to watch—an algorithmically individualized idiot box. But TikTok is participatory: more than half its U.S. adult users have posted videos. Where the platform excels is not in slick content but in amateur enthusiasm, which often takes the form of trends with endless variations. To join in, TikTokers spend hours preparing elaborate dance moves, costume changes, makeup looks, lip synchs, trick shots, pranks, and trompe-l’oeil camera maneuvers.
What’s going on? The media theorist Neil Verma, in “Narrative Podcasting in an Age of Obsession,” describes the era of TikTok’s rise as beset by “obsession culture.” Online media, by broadening the scope of possible interests, have given rise to an unabashedly nerdy intellectual style. Verma focusses on the breakout podcast “Serial,” whose first season, in 2014, followed the host for hours as she pored over the details of a fifteen-year-old murder case. But deep dives into niche topics have become the norm. The wildly popular podcaster Joe Rogan runs marathon interviews, some exceeding four hours, on ancient civilizations, cosmology, and mixed martial arts. A four-hour video of the YouTuber Jenny Nicholson dissecting the design flaws of a defunct Disney World hotel has eleven million views (deservedly: it’s terrific). Hayes himself confesses to spending hours “utterly transfixed” by watching old carpets being shampooed.
Are we, in staring at carpets, ignoring weighty political matters? Hayes makes much of the Lincoln-Douglas debates, but the pair spoke without microphones to boisterous crowds numbering in the thousands, so it’s highly unlikely that their audiences followed every word. (The events included flowing alcohol.) It’s also hard to admire the moral seriousness of a debate about slavery, held on the eve of the Civil War, in which neither side proposed abolishing it. If the history of totalitarianism teaches anything, it’s that long-winded orations do not always signify political health.
Anyway, political verbosity, as measured by State of the Union addresses, has risen during the twenty-first century. Donald Trump once spoke to CPAC for more than two hours. Famously, his digressive speeches require deep immersion in right-wing lore to comprehend. “I’ll talk about, like, nine different things, and they all come back brilliantly together,” Trump has boasted. The linguist John McWhorter has said, of Trump’s convoluted style, that “you have to almost parse it as if it was something in the Talmud.”
We blame the Internet for polarizing politics and shredding attention spans, but those tendencies actually pull in opposite directions. What’s true of culture is true of politics, too: as people diverge from the mainstream, they become obsessional and prone to scrambling down rabbit holes. Following QAnon takes the sort of born-again devotion that one expects of a K-pop fan. Democratic Socialists, vaccine skeptics, anti-Zionists, manosphere alphas—these are not people known for casual political engagement. Some may be misinformed, but they’re not uninformed: “Do your own research” is the mantra of the political periphery. Fragmentation, it turns out, yields subcultural depths. Silos are not shallows.
Hayes worries that the Internet’s political enthusiasms distract from global warming. And yet, conspicuously, it is young people, the most online of us all, who are leading the charge against climate change. The Gen Z activist Greta Thunberg is so good at publicizing the issue that media scholars write of a “Greta effect.” She’s been raising hell online since age fifteen.
If people aren’t losing focus or growing complacent, what’s the panic about? Complaints about distraction are most audible from members of the knowledge class—journalists, artists, novelists, professors. Such people must summon creativity in long, unsupervised stretches, and so they are particularly vulnerable to online interruptions. Instagram vexes them in a way that it might not vex home health aides, retail salespeople, or fast-food employees, to name the three most common types of U.S. workers.
A larger part of the knowledge-class problem is that cultural creators, especially those in legacy media, fear that smartphones will lure their audiences away. In this, they don’t seem vastly different from eighteenth-century priests decrying novels for turning women away from prayerful obedience. Is the ostensible crisis of attention, at bottom, a crisis of authority? Is “People aren’t paying attention” just a dressed-up version of “People aren’t paying attention to me”?
The suspicion that all this is élite anxiety in the face of a democratizing mediascape deepens when you consider what the attentionistas want people to focus on. Generally, it’s fine art, old books, or untrammelled nature—as if they were running a Connecticut boarding school. Above all, they demand patience, the inclination to stick with things that aren’t immediately compelling or comprehensible. Patience is indeed a virtue, but a whiff of narcissism arises when commentators extoll it in others, like a husband praising an adoring wife. It places the responsibility for communication on listeners, giving speakers license to be overlong, unclear, or self-indulgent. When someone calls for audiences to be more patient, I instinctively think, Alternatively, you could be less boring.
In a sense, what attention alarmists seek is protection from a competition that they’re losing. Fair enough; the market doesn’t always deliver great results, and Hayes is right to deplore the commodification of intellectual life. But one can wonder whether ideas are less warped by the market when they are posted online to a free platform than when they are rolled into books, given bar codes, and sold in stores. It’s worth remembering that those long nineteenth-century novels we’re losing the patience to read were long for a reason: profit-seeking publishers made authors drag out their stories across multiple volumes. Market forces have been stretching, squashing, spinning, and suppressing ideas for centuries. Realistically, the choice isn’t commodified versus free but which commodity form suits best.
For Hayes, what makes the apps awful is that they operate without consent. They seize attention using tricks, leaving us helpless and stupefied. Yet even this argument, his most powerful, warrants caution. Our media have always done a weird dance with our desires. Although Hayes argues for the profound novelty of our predicament, the title of his book, “The Sirens’ Call,” alludes to a Homeric tale from antiquity, of songs too alluring to resist. This isn’t always unwelcome. Consider our highest words of praise for books—captivating, commanding, riveting, absorbing, enthralling. It’s a fantasy of surrendered agency. (“A page-turner”: the pages turn themselves.) Oddly, the thing we deplore in others, submission, is what we most want for ourselves.
The nightmare the alarmists conjure is of a TikTok-addled screen-ager. This isn’t a full picture of the present, though, and it might not reveal much about the future, either. Ours is an era of obsession as much as distraction, of long forms as much as short ones, of zeal as much as indifference. To ascribe our woes to a society-wide attention-deficit disorder is to make the wrong diagnosis.
Which is unfortunate, because our relationships to our smartphones are far from healthy. The mediascape is becoming a stormy sea of anxiety, envy, delusion, and rage. Our attention is being redirected in surprising and often worrying ways. The overheating of discourse, the rise of conspiratorial thinking, the hollowing out of shared truths: all these trends are real and deserve careful thought. The panic over lost attention is, however, a distraction. ♦