A few articles and postings I’ve noticed lately take it as a given that life in the highly industrialized countries transformed in some deep but tangible way circa 2010—within a year or two, at most, on either side. Quite a few disrupted norms and forced readjustments in ordinary life either began then or followed in that shift’s wake. Three developments, in particular, define the period. One was the global financial crisis of 2008. Another, the increasing variety and ubiquity of mobile devices. And finally there was the arrival of social media as a factor in public life, soon to exude the subtle authority of an 800-pound gorilla.
Cause and effect among these factors interlocked in ways that make sense with hindsight. For example, it was clear by 2010 that ebooks were being taken up by non-technophile readers. This came after years of dire musings within the publishing industry, which had endured much “consolidation,” as the euphemism puts it, stemming from the recession. Was the change in reading patterns a cause or an effect of growing reliance on mobile screens? Both, probably. Likewise with the mutual exchange of influence between mobile devices and social media.
And so it became possible, and ever more routine, to produce, share and consume content of almost any sort (instantaneously, or just about) with no restraint and seldom much accountability. The potential for unfettered creativity proved enormous, as did the potential for incessant self-aggrandizement and gutless malevolence. Strangely, this no longer seems strange.
“I did not decide in 2009 to prioritize screen time over live relationships,” writes Gaia Bernstein in Unwired: Gaining Control Over Addictive Technologies (Cambridge University Press). The indicated year, which falls within the epochal-shift pocket, was when the author and her friends, family and colleagues started relying on smartphones and social media to stay in touch. (The author is a professor of law at Seton Hall University.)
“I did it gradually, and at least initially, through a series of specific decisions,” she explains. “But over time, I ended up spending an alarming part of my waking hours online. Technology makes us especially vulnerable to finding ourselves in unanticipated places. Once we get used to technology it often becomes invisible … This is particularly true for digital technologies, where much more is hidden than is seen.”
The hidden element referred to here is not a device’s hardware but, rather, the behavioral engineering incorporated into social apps, in particular. They are designed to absorb as much of a user’s time, attention and personal information as possible by delivering an addictive little surge of neurochemical gratification when the user checks the app and finds notifications. The impulse to reach for the device is cultivated through such standard features as “pull to refresh.” Pulverizing the individual’s attention span to sell off the fragments is the core of the business model. This is not speculation. Whistle-blowers from the tech industry have documented as much in recent years.
Bernstein cites a national survey from 2019 showing that children between 8 and 12 years old “spent, on average, five hours on screens per day, while teens spent on average seven and a half hours” (not counting time spent on schoolwork). That lines up with a 2018 study’s finding that 45 percent of teenagers said they were online “almost constantly.” The impact of the pandemic on screen time was unsurprising: researchers determined that “the percentage of kids of all ages spending more than four hours daily nearly doubled.”
The cumulative impact of heavy screen usage includes “significant increases” in “anxiety, depression, self-harm, and suicide,” particularly among girls. In a study at the University of Pennsylvania, one group of students “limit[ed] Facebook, Instagram, and Snapchat use to ten minutes, per platform, per day,” while another consumed social media in their normal manner. After three weeks, those with a restricted intake “showed significant reduction in loneliness and depression … as compared to the control group.”
Bernstein notes that an internal review by Facebook “showed that ‘problematic use’ affects 12.5 percent of Facebook users.” Curious what qualified as “problematic use,” I found a report from 2021 explaining that it covered “compulsive use … that impacts their sleep, work, parenting or relationships.” While Facebook implemented some of the recommendations made by its team focused on “user well-being,” perhaps the most decisive action it took was shutting that team down.
In 2017, Bernstein started lecturing to groups of concerned parents on the benefits of digital connection and the risks of its overuse, advising them on ways to limit kids’ time online. Efforts to do so rarely had the desired effect, or not for long. Parental-control passwords are, it seems, made to be broken. In discussion periods, much frustration came to the surface—as well as a lot of self-blame, as if inculcating sound digital hygiene were a parental responsibility that people felt they were failing to perform.
Some of the self-blame probably also derived from parents’ struggles to get a handle on their own time online. The author is candid about her own susceptibility to the lure of social media, and makes a few references to the struggle for balance in her own life.
But Unwired is neither a screen-junkie confessional nor a recovery handbook. For Bernstein, framing the issue as ultimately a matter of self-control is itself part of the problem. So is the fatalistic strain of technological determinism that treats the impact of a given invention as more or less inevitable.
What we have with social media, she argues, is akin to the effects of smoking or of trans fat in food. These are now understood to be matters of public health, but for decades the respective industries had a vested interest in subsidizing bogus controversy, in the case of tobacco, or ignoring the issue for as long as possible, as food manufacturers did with evidence that trans fats increased the risk of heart attack, stroke, and type 2 diabetes.
An oft-repeated sentence from Upton Sinclair seems germane: “It is difficult to get a man to understand something when his salary depends on his not understanding it.” And even if he does understand it, the salary will remain a priority. Documents showed that Big Tobacco not only long knew its product damaged users’ health but was, as media outlets reported in the 1990s and as the industry has since admitted, adjusting nicotine levels in cigarettes to make them more addictive. (Getting new smokers hooked as fast as possible made sense, given that longtime users tended to die off at disproportionately high rates.)
Bernstein points to the troves of information made public by Silicon Valley whistle-blowers over the past few years to argue that the time has come for legislation or litigation, or both, to mitigate social media’s damage to public well-being. The message of Unwired is, in short, that we need fewer digital detox workshops and—à la tobacco—a lot more class action lawsuits. There’s more to her argument about strategy and tactics, of course, but that much would fit on a bumper sticker, which is a relevant consideration.
“With all we now know,” we read in the book’s opening pages, “it seems increasingly unlikely that we would have opted for all of this, had we known this information [about social media toxicity] around 2009, when we had the opportunity to choose.” Probably not, but the thought experiment is hard to conduct, in part because it is difficult to imagine who, or what institution, could have framed the question or enforced the decision.
The same consideration applies to making social media socially accountable. Bernstein is shrewd about the political maneuvers and public relations options available to industries challenged for doing harm to the general welfare. At the same time, she shows that imposing some control or countermeasures—no-smoking areas, for example, or food packaging that gives nutritional information—has been possible in the past, and might be in the future.
It’s worth a try, or a whole series of tries. But that will mean somehow defending public health or the common good when large swathes of the population doubt either one exists.