Five years ago this month, a consortium of major telecommunications carriers (AT&T, T-Mobile and Verizon) announced that it was developing a new application that would enable customers to pay for goods and services using their smartphones. This “mobile wallet,” as such technology is now commonly called, would transmit credit card and debit account information to merchants wirelessly.
Other enterprises, including banks and American Express, soon joined the partnership. The application seemed well positioned to enter the market for hyperconvenient consumerism, even to dominate it. But things did not work out that way. A demo during the plenary session of a major conference on new payment technologies in 2013 went badly. Consumers complained that the app’s “setup and payment processes were cumbersome and frustrating,” in the words of Chris Welch in The Verge. But those were minor scratches compared to the self-inflicted fatality of the app’s name: Isis. It gets worse. A gift card with the words “serve ISIS” was circulating even after the product’s name was changed to Softcard in 2014.
“Probably few consumers even knew of its existence until the media bump it received from its rebranding,” Bill Maurer notes in How Would You Like to Pay? How Technology Is Changing the Future of Money (Duke University Press). That bump was clearly not enough: Softcard shut down early this spring. At the same time, the range of mobile wallets on sale has only been increasing. The information technology research firm Gartner put the worldwide value of mobile payments for 2012 at $163 billion and anticipates it will reach $720 billion by 2017.
“This is definitely an ecosystem in flux,” one business and technology columnist wrote last month, “partly because there are so many players offering so many different solutions -- and so many questions about compatibility and security.”
“Ecosystem” seems an interesting choice of words, in this context. Maurer, an anthropologist who is also dean of the school of social sciences at the University of California at Irvine, also uses it -- but in a much thicker sense than as a synonym, more or less, for “market.” The smartphone wallet represents only one means of mobile payment, limited mainly to the world’s more prosperous sectors. It’s in the poorer countries of the global South that mobile payment (using phones with text-messaging capabilities and maybe a little built-in flashlight) looms as a much larger part of everyday life: an economic and social link between urban and rural areas.
In Kenya, the M-Pesa mobile payment service launched in 2007, and within three years, more people were using it than had bank accounts. Over half of the country’s households had adopted it by 2011, and Maurer writes that M-Pesa “processed in that year more transactions within Kenya than Western Union had done globally.”
The contrasting fortunes of Isis/Softcard and M-Pesa (where M stands for “mobile” and “pesa” is the Swahili word for money) are striking; how well each met the demands of the people using them obviously differed significantly. But Maurer’s interest runs deeper than the great disparities between the respective societies.
We’re prone to think of money as a medium of exchange originally created to get around the vagaries of barter (e.g., it’s hard to make change for a goat) and also as a tool notoriously indifferent to how it’s used. With $10,000, you can furnish your apartment or hire a contract killer. Money itself, so understood, is both fungible and morally inert. And from that perspective, the recent technological innovations in how money can be transferred from one person or place to another are significant chiefly for whatever changes are made in speed, ease or degree of anonymity of the exchange.
Maurer’s subtitle seems to promise speculation on how money will change, but his stress on the idea of payment (or, better, payment systems) has a decidedly retrospective component. In the abstract, the value of $10,000 in cash is the same as that of $10,000 in diamonds, bitcoins or traveler's checks. Each can be used as a form of money, for exchange.
But in practice, different kinds of social infrastructure are involved in making the transaction feasible -- or even possible -- with considerable implications about the relationships among the people involved. I have not made the experiment, but I doubt you can buy furniture using diamonds, and paying a hit man with a money order seems like a bad idea.
At some point, bitcoins might have the nearly universal acceptance that cash now does; both are fungible and, in principle, anonymous. But those qualities do not inhere in the paper or the digital currency itself: each is part of a payment system, without which it would be worthless. The same is true of credit cards, of course, or smartphones-turned-wallets.
Money of whatever sort is an “index,” the author says, of “relationships of obligation, rank, clientage, social belonging or state sanction.” Furthermore, old payment systems don’t necessarily die off; more than one can be operating in a given society at the same time. Maurer describes the interesting and intricate ways long-distance charge cards have become integrated into African economies where cash and barter also have a place. Aware of the fantastically destructive effects of the last financial crisis, he is clearly concerned that the advantages of being integrated into the global economy could be wiped out in the long term, through no fault of the continent's mobile users themselves.
In the end, How Would You Like to Pay? is of interest less for what it says about the future (the author makes no predictions -- which, given the Isis debacle, seems prudent) than for how it encourages the reader to pay attention to nuances of the present. It’s a primer of the anthropological imagination -- and a reminder that money is too important a matter to leave to the economists.
Many academics are signing a petition and sharing their concerns about the future of Ashgate Publishing, which was purchased by Informa (the parent company of Taylor & Francis). Both Ashgate and Taylor & Francis publish scholarly books and are seen as important venues for professors' work. The petition appeals to Informa to stop a planned closure of an Ashgate office in the United States and a rumored closing of an office in Britain.
"Independent academic presses like Ashgate have offered a safe haven for scholars working in certain subfields as university presses closed entire publishing specializations and fired editorial staff in response to campus austerity measures," says the petition. Representatives of Informa and Taylor & Francis did not respond to email messages seeking a response.
Between the Boston Marathon bombing and more spree killings by heavily armed men with grievances than anyone can count, we’ve all had plenty of recent experience with 24-hour coverage of horrific events such as the Paris massacre last Friday. There seem to be two major ways to manage attention. They don’t exhaust the possibilities but definitely mark the limits of a not especially broad spectrum.
One option is to keep track of breaking developments more or less in real time -- in extreme cases, checking for updates every few minutes. You expect the worst but try to get a jump on it, somehow, by absorbing each new crumb of pertinent information as it becomes available. The opposite extreme is to make like an ostrich and find some sand. Or at least to wait for fact, rumor and guesswork to be sorted out. Only at that point does catching up make sense; until then, there’s more noise than signal.
By temperament I lean toward the first pattern: obsessive scanning. But not after Friday. Maybe the thought of the worst being yet to come was too much to handle. In any event, I opted for burying my head in a couple of recent books, starting with Michael Griffin’s Islamic State: Rewriting History, published this month by Pluto Press and distributed by the University of Chicago Press. The other, The Rise of Islamic State: Isis and the New Sunni Revolution by Patrick Cockburn, was released by Verso earlier this year. Neither can be recommended to anyone whose nerves are easily jarred. But they give a much thicker account of the group that inspired the attacks than nonspecialists can piece together from news reports over the past couple of years.
The authors seem to have turned in the manuscripts to their respective publishers around this time in 2014. At that point, the potential for the new “caliphate” to inspire terrorism beyond the Middle East was a less pressing issue than its unprecedented arrival as a force in the region.
Cockburn’s book, which incorporates his reporting on the Islamic State in Iraq and Syria for the London Review of Books, vividly conveys the speed and range of the group’s expansion and consolidation: ISIS, “as though intoxicated by its own triumphs,” proved capable of “defeating its enemies with ease even when they were more numerous and better equipped.” From one of a number of “nonstate actors” in the region, ISIS transformed into something that, beyond simply proclaiming itself as the Islamic State, effectively dissolved the border between Iraq and Syria and imposed its own religious and military authority over “an area larger than Great Britain and inhabited by some six million people -- a population larger than that of Denmark, Finland or Ireland.”
At the same time, ISIS remains, if not invulnerable to air strikes, then certainly prepared for them. It “operates as a guerrilla army,” Cockburn says, “without easily visible movements of personnel or equipment that can be targeted. Its leadership is well practiced at keeping out of sight.”
But it’s Griffin’s book that actually tells in detail the story of where ISIS came from and how it transformed over time. The author is a political analyst and commentator for BBC World and Al Jazeera. He draws almost entirely on English-language publications, in contrast to Cockburn, who quotes an array of friends, interview subjects and bits of popular culture from around the Middle East. But Griffin integrates his sources to good effect. He traces the growth of ISIS out of what had been Abu Musab al-Zarqawi’s organization Al Qaeda in Iraq -- a group that had managed to alienate both Osama bin Laden and Iraqi insurgents fighting the U.S. occupation. The death of Zarqawi in 2006 seems to have created less of a power vacuum than an opening for more capable strategists to assert themselves.
They learned to adapt to and exploit specific local and tribal concerns while building up both an effective economic infrastructure and formidable propaganda skills, taking advantage of the new-media skills of European-born jihadis who joined them. The ISIS cadre were also exceptionally lucky -- astonishingly, uncannily so -- about getting hold of new weaponry and tools. The psychological impact of the fall of Mosul in June 2014 was magnified by the sight of ISIS fighters “speeding towards outflanked enemies in hundreds of looted Humvees, bristling with assault rifles and rocket launchers.”
Plus the jihadis had “a fleet of white Toyota Tacoma pickups, double cabbed with mounted machine guns.” The vehicles, custom-made for U.S. Special Forces, were only available from a Toyota assembly plant in Texas. “How they managed to reach the frontiers of the caliphate,” Griffin says, “is anyone’s guess.”
Reading these books quickly was difficult, and the marginal notes and highlights I made along the way are evidence of how much more time it would take to grapple with them -- especially with regard to the authors’ differing understandings of the Arab Spring, and of the Syrian uprising in particular.
What they concur on, and no surprise, is the emerging status quo, with the Islamic State obliging the United States and Iran to act as allies for the foreseeable future, despite their saber-rattling impulses toward one another. The situation was paradoxical and hard to imagine as stable even before the terrorist attacks of last week. Is there even a word for how things stand now?
Someone once characterized the intellectual as a person “living by, for and off of ideas.” Another remark in the same terse vein calls the intellectual someone who habitually reads with pen in hand.
Neither definition would pass muster with historians or sociologists, but they are ideal for ordinary usage. Each takes the distinguishing feature of the intellectual to be certain ingrained ways of directing the attention -- without making any claim about his or her personal qualities or social status. This has the advantage of keeping snarling to a minimum. (No other noun provokes so much insecurity and hostility that people often feel compelled to put “pseudo-” or “self-described” in front of it.)
The trouble with such placid or neutral definitions, however, is that “les intellectuels” were born in a scene of great hostility and named amid a prolonged exchange of insults. Christophe Charle’s Birth of the Intellectuals 1880-1900 (Polity) uses the French word when referring to the academics and authors who rallied to the defense of Alfred Dreyfus in the late 1890s and the English word “when it is to be understood in the broader sociological sense.” (Charle is a professor of contemporary history at the Université Paris 1 Panthéon-Sorbonne.) The distinction is useful, but the thrust of the book’s argument is that the persisting ambiguities and disputes over the concept -- even within sociology -- were already taking shape before the intellectuels rallied behind Émile Zola’s manifesto J’accuse in 1898.
That the author is a historian would not necessarily be a reader’s first guess. Charle draws so heavily on the ideas and perspectives of Pierre Bourdieu that I assumed he was a sociologist using educational and literary developments in France in the late 19th century for a case study of the field of cultural production. Anyone not already acquainted with the Dreyfus controversy, at least in broad outline, is bound to give up quickly: this is historical writing in which the narrative element approaches the vanishing point.
Instead of events, Charle reconstructs the categories and niches of brainwork during the late 19th century as each established its role in French society. At the same time, each was also marking off its own respective criteria for recognition and advancement. Republican principles, established by a century of revolutions, did not preclude the emergence of an elite -- but it had to be based on merit rather than bloodline, a “nonexclusive aristocracy” cultivated by the educational system.
Louis Pasteur served as an exemplary case: a man of modest origins, he had contributed to the well-being of humanity (the rabies and anthrax vaccines) and advanced knowledge (the germ theory) while bolstering the French economy with his method for keeping wine and milk from going sour. He was the patient, methodical laboratory researcher as national hero: his expertise possessed a recognizable social value.
Supplementing the tremendous prestige of the sciences was the model of specialized, highly professionalized scholarship in all disciplines practiced in German universities. So teaching and research could be understood and valued -- by those who practiced them and by laypeople alike -- as matters of public importance. That was true even when scientists and scholars prided themselves on remaining so concentrated on their areas of specialization that they ignored everything else.
The situation among novelists, poets, essayists and other writers was altogether murkier -- in part because of the rapid and chaotic growth of the publishing industry, especially given its susceptibility to economic pressures. The number of aspirants always exceeds the number of positions offering a writer access to readers or money (much less, as in the best case, both). And the range of available outlets for publication tends to interact with writers’ own interests, styles and degrees of mutual hostility in fairly volatile ways. Whether new literary movements create new literary journals or vice versa can only be determined on a case-by-case basis; even then, it will be partly guesswork.
The term “intellectuel” seems to have been coined in France in the early 1890s, in the small but serious journals of debate written and edited by, well, intellectuals -- that is, writers, academic and otherwise, who expressed political and cultural opinions largely critical of the established order. The word is often said to have entered English as a neologism in the wake of the Dreyfus case. (Charle seems to make the same assumption, although Stefan Collini quotes Byron using it as early as 1813.)
In any event, the intellectuels who intervened to defend Dreyfus -- accusing the French military of anti-Semitism and of covering up evidence that would exonerate him -- were drawn from the ranks of the professoriate as well as writers, both creative and journalistic, established and otherwise. Charle goes over their social backgrounds, career trajectories and political affiliations exhaustively. In analyzing the statements they wrote and published, he pays close attention to how famous names and distinguished institutional affiliations were sometimes featured prominently to signal their authority and the seriousness of the cause. At other times, the indicators of prominence were downplayed: the list of signatures might have an illustrious professor alongside an obscure poet or an ordinary citizen.
The debate over Dreyfus quickly spun off what sounds like an equally nasty one over whether the intellectuels were heirs of Voltaire’s role as the voice of reason and justice against oppression, or just people interfering in military matters for which their education and verbal skills gave them no claim to competence. Because of its challenge to authority and the involvement of many figures with known anarchist or socialist tendencies, the pro-Dreyfus cause was largely understood as a movement of the left -- which inspired the anti-Dreyfusards to come up with accusations that they were radical elitist hypocrites. (A bit rich, given that those denouncing Dreyfus as a traitor included people who wanted to restore the monarchy.)
As for Dreyfus, he was exonerated a few years later. The notion that intellectuals can, and should, play some role as critics of the society they live in was established. Debate over how well they perform that function never ends, nor should it. And the snarling, of course, continues unabated.
The impending collapse of civilization should, as Samuel Johnson said about being hanged in a fortnight, wonderfully concentrate the mind. For most of the interview subjects whose responses Matthew Schneider-Mayerson analyzes in Peak Oil: Apocalyptic Environmentalism and Libertarian Political Culture (University of Chicago Press), that collapse is inevitable, if not already underway.
“Peakists” skew, on the average, pretty far to the left of the stereotypical American survivalist in ideology, but there is a meeting of minds on strategy. Peak-oil activism -- as the author, an assistant professor of social sciences at Yale-NUS College in Singapore, presents it -- consists mainly of: (a) stockpiling necessities, (b) consuming less and (c) blogging while you still can. This sounds awfully unambitious, even by the standards of a politics of diminished expectations.
Schneider-Mayerson’s questionnaire drew responses from about 1,750 committed adherents of the peak-oil scenario in 2011. That year now looks like the end of peak oil’s era of maximum public exposure. My own unscientific survey of otherwise well-informed people suggests that the whole concept is less than universally familiar, so first a word of explanation.
The claim that oil production has peaked, or will soon, is grounded in a hard ecological and economic reality: as the pool of oil in a well shrinks, it takes more effort and expense to pump out what remains. The return on investment will eventually hit zero. An enormous amount of petroleum remains underground, but the energy consumed in extracting each barrel will exceed the energy produced by burning it. And once we reach that point on a worldwide scale -- as must happen, sooner or later, when the last untapped deposit has been located and exploited -- the effect can only be catastrophic.
Over the past 150 years or so, petroleum has been both abundant and relatively easy (hence profitable) to extract. Huge, complex and interlocking institutions and technologies became possible thanks to eons’ worth of solar energy condensed in liquid form by the decay and burial of vegetation over untold millions of years. The next 150 years do not look quite so promising.
Nor do the next 15, really, if some of the peak-oil extrapolations are valid -- in which case the Mad Max films may count as utopian, since Mel Gibson at least had some functioning oil rigs to protect. (More than three-quarters of Schneider-Mayerson’s respondents indicate that they had seen the films, and it’s a fair guess more than once.) Quite a few counterscenarios come to mind, including the development of other energy sources or of more efficient ways to extract, and use, the black gold itself.
But peakists can always point to the undeniable reality that advanced industrial societies are dependent on a fuel that must run out. And facing that inevitability “was often revelatory,” the author says, noting that “the gulf between their conception of the future before and their conception of the future after their awakening is so stark that this moment often cleft their lives in two.”
Those who filled out Schneider-Mayerson’s questionnaire in 2011 tended to be middle-aged, middle-class white American men with higher educations (more than 43 percent had postgraduate degrees). They characterized their views as “liberal” or “very liberal” (about half) and reported their religious preference as “none” (also about half). They constituted “a vibrant social formation that existed from roughly 2005 to 2011,” when the largest peak-oil news sites and blogs were drawing hundreds of thousands of readers per month. At least one novel set in the postpeak future, James Howard Kunstler’s World Made by Hand (2008), was widely reviewed, with the author laying out the premises in an interview on The Colbert Report.
The flourishing of this subculture coincided with the doubling of the price of gasoline in the United States throughout this century's first decade. And the demographics of Schneider-Mayerson’s interview population suggest that anger at the George W. Bush administration -- in particular its foreign policy -- may also have spurred interest in scenarios of life after petroleum. The movement seems to have reached its own peak in the wake of the 2008 credit crunch. The new decade brought aggressive campaigns to promote and exploit alternatives to drilling (coal, natural gas, tar sands). And not having enough hydrocarbons to burn is not exactly a pressing issue as the reality of anthropogenic climate change sinks in.
But a follow-up questionnaire, in 2013, found that only 10 percent of those whom the author surveyed in 2011 “had significantly questioned their dedication to peakism, and the vast majority stood firm in their convictions and life course.” Peakism has been called a sort of Left Behind for liberals, and apocalyptic sects are known, after all, for proving remarkably resilient.
The language of religious conversion is hard to avoid. The crisis underscored by peakism is in large measure an existential crisis, even a crisis of faith. Believers experience a moment of truth -- of grasping that the values and ways of life they have taken for granted are embedded in, and reliant on, a society that depends on a substance that cannot be replaced. The literal meaning of the word apocalypse is “uncovering,” and what the peak-oil scenario uncovers is something like an abyss.
Schneider-Mayerson notes that around two-thirds of respondents indicate that they’ve found it difficult to talk about their beliefs with others, who often take it as an attack on their own lifestyles or an obnoxious display of pessimism. And the subculture seems both inward turning and remarkably asocial. More than 60 percent of respondents indicated that they had never attended in-person meetings with others who shared their concerns. Almost a quarter said they visited peak-oil websites more than once per day. The very word “movement” seems out of place. Movement is exactly what peakist ideology did not encourage, even at its height -- unless you count buying a more fuel-efficient car, which is really stretching it.
Schneider-Mayerson interprets the tendency towards insularity and inactivism as a sign of the peak-oil subculture having accepted more of the dominant mentality of the past few decades than one might expect: in particular, a deep distrust of collective action, and of the state as capable of doing anything without screwing it up, combined with fatalism and an abiding sense of powerlessness. And feeling powerless, one places no demands on those who do have power (the first step toward gaining any). The belief in an inevitable collapse and disintegration of society is stupefying, if not self-fulfilling. “There is no alternative,” it says. “Let us tend our gardens.” But that's no strategy, just a symptom of decline.
A faculty member at California State University at Fullerton is fighting back after he was reprimanded for assigning affordable textbooks in a math course, The Orange County Register reported. Alain Bourget, assistant professor of mathematics, reportedly picked two textbooks -- one priced at $76, the other free -- in an introductory linear algebra and differential equations course over the $180 textbook co-written by the chair and vice chair of the math department. The decision "violated policy and went against orders from the provost and former dean of the math and sciences college," according to the newspaper. Bourget, who did not respond to a request for comment, has filed a grievance and will attend a hearing on Friday.
Every so often in a Victorian novel or the biography of someone of that era, you will come across a mention of “Lyell on geology” that implies something momentous and perhaps a bit mind-boggling for the person grappling with it. Or the reference might be to someone so old-fashioned as to have been unaffected by the challenge. It conjures an odd image of ladies and gentlemen in their drawing rooms, wearing heavily starched clothing and excited, or distressed, by something involving rocks.
At issue was the three-volume Principles of Geology by Charles Lyell -- an international best-seller published in the early 1830s and still much discussed upon the author’s death in 1875. While hardly the first natural philosopher to challenge the literal truth of the Book of Genesis, Lyell made the most far-reaching and cogent argument that the earth’s features (mountains, gorges, the course of rivers, etc.) could be explained by slow changes over extremely long periods of time. Among Lyell’s readers, no surprise, was the young Charles Darwin, who studied the Principles while voyaging on the Beagle.
One way to put it is that Lyell sank Noah’s Ark. But the damage to orthodox religious belief was only part of the Principles’ impact. There was also the strain of imagining the scale of time implied by “Lyell on geology” -- a phrase we should probably read as implying more than the replacement of “catastrophism” by “uniformitarianism” (with the latter term becoming the accepted explanation for environmental change). For it was also the moment when human history shrank to an almost inconceivably tiny aspect of natural history, like a speck of dirt atop a mountain.
Reading Paul B. Wignall’s The Worst of Times: How Life on Earth Survived Eighty Million Years of Extinction, from Princeton University Press, can induce something of that perturbed feeling. It did in me, anyway, as I tried every so often to picture a timeline of the catastrophic events that Wignall and his colleagues have reconstructed. (The author is a professor of paleoenvironments at the University of Leeds.)
The geologically unsophisticated layperson will probably anticipate new ideas or evidence about what killed the dinosaurs. But that’s an index of how limited an impact Lyell has had. We still imagine change on too constricted a scale. The rise and fall of Jurassic wildlife are, for Wignall, something like last week’s news might seem to an ancient historian: interesting enough, sure, but the author would really prefer to stay focused on the past and not get sidetracked chattering about recent trends.
The catastrophic events covered in The Worst of Times affected life on Pangaea, the vast landmass that took shape 300 million years ago and disintegrated into pieces that drifted across the globe to form the continents we have now.
Two mass extinctions -- defined as “geologically brief intervals when numerous species go extinct in a broad range of habitats, from the ocean floor to forests, and all latitudes, from the Equator to the poles” -- had already taken place in very distant pre-Pangaean times, but the formation of the supercontinent seems to have accelerated the pace of disaster: in the period between 260 and 180 million years ago, two of Earth’s five known mass extinction events took place, along with four other extinction episodes of smaller scale or impact.
That leaves one mass extinction unaccounted for: the crisis following the impact of a giant meteor hitting what is now the Yucatan Peninsula, 65 million years ago, ending the reign of the dinosaurs, among other species. That was a good 100 million years after Pangaea’s fragmentation got well underway, and the continents that existed during the fifth mass extinction event are recognizable in their current form, if not location, from one of the maps on the U.S. Geological Survey's website.
The very idea of Pangaea has always fascinated me (insert nerd emoji here) yet the evidence suggests it was a difficult place for evolution to happen. In fact, that is an understatement: Wignall’s reconstruction of the deep history suggests that Pangaea was not just the scene of disasters but also a major factor in their scale and frequency.
The issue was volcanoes, and not just the piddling sort of modern times that could wipe out a city or two. The biggest volcano of the last thousand years produced about 30 cubic kilometers of magma, while a given Pangaean volcano (one of an untold number) threw out millions of cubic kilometers. The eruptions filled the atmosphere with carbon dioxide while also setting off chain reactions that created hot, de-oxygenated, acidic oceans, killing off much marine life.
The fracturing of Pangaea did not mean a complete end to monster volcanoes and their sundry terrible side effects (including periods of climate change, up and down in temperature). But Wignall suggests that the supercontinent’s eventual dispersal into smaller landmasses created better conditions for evolution -- and even for simple survival.
“A huge continent has vast areas in the interior that are too far away from the sea to receive much rain,” he writes. “In contrast, smaller, more fragmented continents receive precipitation over a greater area …. Continental runoff also supplies nutrients to the oceans, which stimulate plankton growth that removes more carbon dioxide, which gets buried as organic carbon in marine sedimentary rocks.”
Which, in turn, makes for relatively more moderate short-term changes in climate. One benefit of studying volcanic activity of the Pangaean era is that it created “effects that may be akin to modern anthropogenic activity, such as the emission of huge amounts of carbon dioxide into the atmosphere.” The author avoids speculating on the more dire potential implications, but indicates that having dispersed continents now is at least some advantage.
As a side note, let me mention my surprise at seeing a geologist use the word “catastrophism” in a neutral way. Evidently uniformitarianism is no longer quite the bedrock principle that it seemed in the wake of “Lyell on geology” -- a development that believers in Noah’s Ark seem to find encouraging. But catastrophism as Wignall and his colleagues understand it means recognizing that the unimaginably vast stretches of time in which the earth changes slowly have been punctuated, on occasion, by cataclysmic events lasting a few hundred thousand years. On the grand scale, that counts as sudden change. But 40 days and 40 nights it isn’t.