The 2.0 Decade?

April 2, 2008

Last week Intellectual Affairs discussed the effects of an irresistible force on an immovable object. The force in question is our habit of referring to each recent decade as if it had a distinct quality or even personality: the '50s as an era of straitlaced conformity, for example, or the '70s as (in Tom Wolfe’s phrase) “the Me Decade.” This tendency has dubious effects. It flattens the complexity of historical reality into clichés. It even manifests a sort of condescension toward yesteryear. But decade-speak is a basic element in ordinary conversation -- and the habit is so well-established as to seem, again, irresistible.

If so, the past several years have been a period that won’t budge, if only because we lack a convenient and conventional way to refer to it. Expressions such as “the Aughties” or “the Noughties” are silly and unappealing. I ended the last column by asking what readers were calling this decade. Many of the responses involved expressions of disgust with the era itself, but a couple of people did propose terms for what to call it. One suggestion was, simply, “the Two Thousands,” which seems practical enough. Another was “the 2.0 Decade” -- an expression both witty and apropos, though unlikely to catch on.

Perhaps some expression may yet emerge within everyday speech -- but we’re winding down “the Zeroes” (my own last-ditch suggestion) without a way of referring to the decade itself. For now, it is an empty space in public conversation, a blind spot in popular memory. That may not be an accident. The past several years have not exactly been lacking in novelty or distinctiveness. But the tempo and the texture of the changes have made it difficult to take everything in -- to generalize about experience, let alone sum it up in a few compact stereotypes.

Rather than give an extensive chronology or checklist of defining moments from the present decade -- whatever we end up calling it -- I’d like to use the rest of this column to note a few aspects of what it has felt like to live through this period. The effort will reflect a certain America-centrism. But then, decade-speak is usually embedded in, and a reflection upon, a particular national culture. (German or Italian musings on “the Seventies,” for example, place more emphasis on the figure of the urban guerrilla than the disco dancer.)

These notes are meant neither as memoir nor as political editorial, though I doubt a completely dispassionate view of the period is quite possible. “Your mileage may vary,” as a typical expression of the decade goes. Or “went,” rather. For let’s imagine that the era is over, and that the time has come to describe what things felt like, back then....

The decade as unit of meaning does not normally correspond to the exact chronological span marked by a change of digits. We sometimes think of the Eighties as starting with the election of Margaret Thatcher in 1979, for example, or Ronald Reagan’s inauguration in 1981. Conversely, that period can be said to end with the tearing down of the Berlin Wall in 1989, or as late as the final gasp of the Soviet state in 1991. The contours, like the meanings, tend to be established ex post facto; and they are seldom beyond argument.

In this case, there was a strong tendency to think of the decade as beginning on the morning of September 11, 2001 -- which meant that it started amid terror, disbelief, and profound uncertainty about what would happen next. Within a few weeks of the attacks, there would be a series of deaths from anthrax-laced envelopes sent through the U.S. mail. (Among the puzzles of the entire period is just why and how the public managed to lose interest in the latter attacks, even though no official finding was ever made about the source of the anthrax.)

Over the next two years or so, there would be a constantly fluctuating level of official “terror alerts.” Free-floating anxiety about the possibility of some new terrorist assault would become a more or less normal part of everyday life. Even after early claims by the administration of a connection between Saddam Hussein and the 9/11 terrorists were disproven, a large part of the public continued to believe that one must have existed. Elected officials and the mass media tended not to challenge the administration until several months into the Iraq War. The range of tolerated dissent shrank considerably for at least a few years.

Simultaneously, however, an expanding and wildly heterogeneous new zone of communication and exchange was emerging online -- and establishing itself so firmly that it would soon be difficult to recall what previous regimes of mass-media production and consumption had been like. The relationship between the transmitters of news, information, and analysis (on the one hand) and the audience for them (on the other) tended to be ever less one-way.

It proved much easier to wax utopian or dystopian over the effects of this change than to keep up with its pace, or the range of its consequences.

At the same time, screens and recording devices were -- ever more literally -- everywhere. Devices permitting almost continuous contact with the new media kept getting smaller, cheaper, and more powerful. They permeated very nearly the entire domain of public and private space alike. Blaise Pascal’s definition of the universe began to seem like an apt description of the cosmos being created by the new media: “an infinite sphere, the center of which is everywhere, the circumference nowhere.”

Quoting this passage, Jorge Luis Borges once noted that Pascal’s manuscript shows he did not originally describe the sphere as infinite. He wrote “frightful” instead, then scratched it out. Looking back on that unnamed (and seemingly unnameable) decade now, it seems like the right word. Whatever meaning it may yet prove to have had, it was, much of the time, frightful.
