HAL CROWTHER
Darkness at Dawn: The Campus in Crisis
A recent study of 20,000 young adults, average age twenty, reported that 52 percent were experiencing hopelessness, 70 percent felt “very sad and lonely,” and 37 percent were so depressed they could barely function. Sixty percent felt overwhelming anxiety, and 38 percent felt overwhelming anger. Eleven percent had considered suicide and 1.3 percent—some 260 individuals—had actually attempted it. From this profile of a tortured Lost Generation, we imagine a sample from third world countries like El Salvador, where young people driven from their villages by violence and starvation walk hundreds of miles toward “freedom” and end up in American concentration camps. Or young Afghans, whose families cooperated with the invading Americans, now awaiting Taliban retribution when the last Yankee regiment is gone. Syrians, maybe, refugees with their futures destroyed by that dreadful war at the nexus of all the world’s anxiety.
If those were Americans in this survey, surely they were members of disadvantaged minorities, trapped in the violence and drug abuse of abandoned urban ghettoes. Or soon-to-be dispersed kids from fading small towns, high school graduates and dropouts who know that the only jobs they’ll ever find are hundreds of miles away in those frightening cities.
Those would be educated guesses. Guess again. The 20,000 near-adults in this study, conducted in 2017 by the American College Health Association, made up the undergraduate student body of the University of North Carolina at Chapel Hill. UNC-CH—alma mater of my daughter and my son-in-law, onetime employer of my wife, the pride of North Carolina’s prideful university system, and one of the most venerable and prestigious public universities in the United States. This storied campus, this oasis of learning guarded for a century by the recently toppled statue of a Confederate soldier called Silent Sam—this epicenter of depression and misery?
These students, these tortured outpatients, are the cream of the crop. They include the most gifted, many of the most affluent, and the most privileged in every respect. Twenty years ago their bright futures, and their faith in them, would have been taken for granted. If more than half of these elite undergraduates feel hopeless and nearly three-quarters are sad and lonely, we can only conclude—tentatively, since this is virgin territory for psychiatrists—that their less fortunate peers feel even worse. A whole generation consumed by anguish? Something new, something devastating and previously unexamined, appears to be going terribly wrong.
Here’s where I skate out onto thin ice. It doesn’t take a reader much research—a quick Wikipedia check—to confirm that I’m a septuagenarian. I graduated from college during the Johnson administration. One pompous comparison between my generation and this one and we have an easily dismissed “Why, back in my day . . .” exercise like the ones I made fun of when I was at the other end of the age gap. It’s too easy to dismiss these troubled undergraduates as pampered crybabies. College students today have plenty to worry about. It’s July as I write this, and climate change is rendering the South, and lots of other places, virtually uninhabitable. Geopolitics is combustible. War, even nuclear war, is in the air, and the current president of the United States is a senile racist sex criminal with an ego the size of a bus and a brain the size of a walnut. White supremacy is coming back into style, the population bomb is ticking, defenders of the environment—our fragile planet—are overwhelmed by corporate predators and their captive politicians.
It’s no picnic looking for a future, one often burdened by scandalous student debt, a specter few of us encountered fifty years ago. The financial inflation in higher education is the result of a vicious conspiracy between the lenders and the schools, and one of the best reasons to distrust or even despise the establishment. But the ’60s, my friend, were no carefree stroll down Fraternity Row. Give us credit. There was a universal draft for a pointless bloody war, there were friends and relatives coming home in body bags. A president was assassinated while I was in college. Just who had him killed, and why, we’ll probably never know. (If you still believe in the insane single shooter with no affiliations, you’re at least incurious if not outright gullible.) In 1968 that president’s brother was murdered, too, and so was Martin Luther King Jr. There were Kent State and Selma, Alabama. Vietnam and civil rights polarized Americans almost as radically as they’re polarized today, over our repulsive president and his racially feral Republican Party.
We had our issues. But do I in any way suspect—I was never considered a Happy Camper, myself—that deep down half of us felt hopeless and two out of five were too distressed to function? Come on. We were twenty, for Christ’s sake. There were so many places we hadn’t seen, books we hadn’t read, flavors we hadn’t tasted, women we hadn’t met. (I speak purely from a male point of view on that, but didn’t women have similar high hopes?) There was all of life ahead, however it might turn out, and much of it looked damned inviting at the outset. And yes, we were the privileged ones too.
I had one college friend who was determined to commit suicide, and he succeeded on his third try. He even preached hopelessness and suicide, over beer in fraternity basements. I liked Bill and considered him brilliant, but his fatal argument was unconvincing. I thought he was mentally ill; I think most of his friends agreed with me. (Another classmate attempted suicide, but he chose to hang himself from a pipe that bent a little under his weight. He was 6’4”, luckily, and when his toes touched the rug he lacked the suicidal will to lift them and strangle. After serving some hospital time, this classmate became a respected businessman and citizen. But he was known to us by the nickname “Too Tall.”)
On campus today, would suicidal Bill be a leader and a prophet, instead of a tragic outlier? Would his determination to destroy himself win him disciples, instead of friends like us who shook our heads when we talked about him, and schemed to distract him from his deadly purpose? Something profound has changed when so many young people, given exclusive opportunities to prosper in one of the world’s wealthiest countries, see only gloom and sorrow on the road ahead.
A lot of things have changed in fifty years, including the reputation of America, once a role model among nations, now more feared and psychoanalyzed than admired. Perceptive people around the globe see us set on a collision course with some of our worst instincts. Who, in the summer of 2019, could argue with that? But the national trajectory is not one of the prime concerns of depressed sophomores. Every survey indicates that their anxiety is infinitely more personal than global. So the question becomes, “What is the most dramatic, what is the defining difference between growing up in the ’60s and growing up in the twenty-first century?”
It’s no mystery. These students are the first Americans born and entirely marinated in the age of digital communication, the age of the Internet and the devices that enable it—most notoriously the cell phone—and of the metastasizing phenomenon of social media. Sven Birkerts, in his prophetic book The Gutenberg Elegies, described their life, now the life of the majority of Americans, as “hive life.” Its consequence, an interconnectedness so complete that privacy becomes a quaint memory, seemed so toxic to Birkerts that his final advice to his readers was “Reject it.” This was in 1994, at the dawn of the technological revolution and the promiscuous sharing that transformed every society and left Birkerts sounding like a voice from the Middle Ages.
To those of us who remember the way it was before, the difference feels vast, intergalactic. A cousin of mine in her teens looked at me suspiciously, as if I might be lying to her, when I admitted that I never made phone calls when I was her age. Cells hadn’t been invented; parents discouraged us from using their home phones to call friends. After a long day of school and sports practices most of us were relieved to be alone or with our families in the evening. Weekends we met in the park, on the ball field or the basketball courts. Hardly ever by prearrangement, either.
Whatever we were deprived of, we didn’t know it or miss it. Who knew that most human beings, and especially younger ones, had been yearning for the technology to unburden themselves of their ill-informed opinions and intimate secrets—all for rapt audiences of voyeurs and exhibitionists like themselves? If life before “hive life” sounds profoundly weird to young people today, take it from me that “trolls,” “Twitter mobs,” “cyberbullying,” and armies of virtual “friends” and “followers” would have sounded like freaky science fiction to the kids we were back then.
They still seem very strange, to me. It took me longer to grasp the addictive power of these gadgets and networks because I seem mercifully immune to their charms. David Brooks of The New York Times describes the social mediasphere as “a competitive, volatile attention economy”—the coin of this realm is attention. Attention, the raw material of celebrity, is not for everyone. It may be a seductive goal when you’re twenty, but only a hollow freak like Donald Trump still feeds on it at fifty. I was probably lucky, when I was very young, to work a job that involved a lot of professional contact with celebrities. The smart ones never denied the economic or sexual advantages of celebrity, but they were nearly all sick to death of the attention.
“Attention and affection have gone from being private bonds to being publicly traded goods,” writes Brooks, expanding his argument. “People ensconced in social media are more likely to be on perpetual alert: How are my ratings at this moment? If you orient your life around attention, you will always feel slighted. You will always feel emotionally unsafe.”
The hyper-earnest but eloquent Brooks is onto something here, something that may help us understand an epidemic of unhappy undergraduates. Has hive life become a life of perpetual performance, with every effort ranked and reviewed? Wouldn’t this obliterate a fragile ego, and chip away relentlessly at a strong one? But Brooks and I are only journalists, laypersons with no special knowledge of the brain’s pathways or their influence on human behavior. A more authoritative observer was the late Oliver Sacks, neurologist and bestselling author, who sounded an alarm in a New Yorker essay written shortly before he died.
“A majority of the population is now glued almost without pause to phones or other devices—jabbering, texting, playing games, turning more and more to virtual reality of every sort,” Sacks lamented. “Everything is public now, potentially; one’s thoughts, one’s photos, one’s movements, one’s purchases. There is no privacy and apparently little desire for it in a world devoted to non-stop use of social media. . . . Those trapped in this virtual world are never alone, never able to concentrate and appreciate in their own way, silently.”
In “our bewitched, besotted society,” Sacks continued, “younger people who have grown up in our social media era have no personal memory of how things were before, and no immunity to the seductions of digital life. What we are seeing—and bringing on ourselves—resembles a neurological catastrophe on a gigantic scale.”
“A neurological catastrophe.” Strong words, which carry greater weight because they were his critical parting shot, possibly the last words Sacks wrote. He was horrified by the spectacle of adults who seemed mesmerized by cell phones, but his is not essentially a Luddite argument—though I think Sacks, Brooks, and I, along with the prophet Birkerts, would all be proud to accept the “Luddite” label that has become such a pejorative in digital society. I don’t deny that I once dismissed Silicon Valley’s contribution to modern America as “a thousand solutions for which there were no problems.” But any fair critic will admit that digital technology offered positive changes in our basic communications. I embraced email because, like H.L. Mencken back when the telephone was an electronic miracle, I always hated the phone. It enabled people to seize my attention when they were not welcome to it, and it was a time-consumer no polite person could control.
Sacks’s core argument is biological, and behavioral. It’s about the addiction, the powerful grip on the user that no engineer in microchip land could have anticipated, though no doubt it delighted his employers. Remember when the BlackBerry was the trendy smartphone, and enslaved users dubbed it the “Crackberry”? Nearly every addictive agent offers real benefits to the addict. I understand that tobacco products delivered meaningful relief to generations of nervous and neurotic individuals. At the same time, unfortunately, their cigarettes were killing them in at least fifty different ways.
This is the way the digital revolution will be judged by history, and the judgment may be coming sooner than we think. Social media—in essence, communicating more unnecessary things to more unnecessary people, more efficiently—will most likely be the first target of the counterrevolution. It is out of control. It’s not just the extreme pathology the “attention economy” has begun to provoke: the college “boys” who gang-raped an unconscious coed and “posted” the video on their favorite platform, or the lunatic in Utica, New York, who slashed his seventeen-year-old date’s throat, nearly decapitating her, and then posted snapshots of her corpse on Instagram and Snapchat. The criminals’ impulse to share these atrocities is almost as frightening as the atrocities themselves. But the main threat to our collective sanity and survival, already provoking anxious reactions from parents all across the country, is the pandemic of screen addiction among children many years younger than college students.
Screen-time consultants, parenting coaches offering what they describe as “digital detox,” are prospering, charging up to $250 an hour for their services. Some of their recommendations are as simplistic as getting your kids a dog or a cat, or buying them a ball and teaching them to throw it or kick it. In Austin, Texas, and Concord, Massachusetts, parents’ groups have been organized around pledges to deny their children smartphones until they’re in the eighth grade at least. “Movement” is the key, according to one consultant—these innocent victims need to be reminded that they have bodies as well as fingers and thumbs. (Epidemic obesity seems to be another side effect of the digital revolution.) YouTube, Instagram, and the video game called Fortnite have been targeted as especially addictive distractions.
A kind of panic has set in, at least among relatively aware parents who aren’t helpless digital addicts themselves. Many of them realize that they’re the first parents in history who have to fight for their children’s souls, or at least their attention, against this whole seductive wired world, this buzzing superhive of amoral activity with an agenda that’s essentially commercial. These parents may be the main source of hope, and look what’s stacked against them. Trillions have been earned by the people who conceived the digital smorgasbord, and they’ll spend many billions to maintain and expand it. Politicians, many of them purchased by media moguls out of petty cash, will not be a useful counterforce. A powerful, responsible news source like The New York Times may publish David Brooks’s neo-Luddite musings on the evils of the attention economy, but just over the page there are articles on “Snapchat celebrities,” “TikTok stars,” and the potency of social media “influencers” written without any irony that I can detect.
There’s no proven connection between screen-addicted sixth graders and suicidal college sophomores, no available studies that trace individuals’ lives from their first cell phones and video games to their first attempt to overdose. But the data is there: widespread misery at an age when most of us old-timers—the similarly lucky ones—felt undeservedly blessed. There’s a reason to be found somewhere, and you may be holding it in your hand.
No one is less qualified than I am to examine the digital experience from the inside, from the personal insights of a cyber-traveler. Few are less qualified to tell undergraduates to let go and cheer up. My sophomore roommate nicknamed me “MOCS,” or “Emmo” for short, for my frequent moaning renditions of my favorite folk song, “Man of Constant Sorrow.” Beckett and Dostoevsky were the first writers I couldn’t stop reading, Schopenhauer and Nietzsche were my philosophers of choice. Back when I had academic ambitions, I committed myself to the majestic melancholy of Thomas Hardy. The irony isn’t lost on me.
The world I grew up in is dead and gone, and it won’t be resurrected. The trick is not to turn back the clock, but to look ahead with our eyes wide open. Hive life may be here, but does it deserve to be here to stay? If it seemed to be making young people happier and more productive, I’d say “God bless you, then,” and retreat to the museum where I belong. Instead we read that nearly 40 percent are too soul-sick to function, and that triage appointments for our universities’ psychiatric services have more than doubled since 2013. Please consider the possibility that we may be on the wrong track, and proceeding at a reckless speed.