Zygmunt Bauman makes a passing reference to his “uncannily long life” in On Education: Conversations with Riccardo Mazzeo (Polity). I was surprised to learn that he is, in fact, not quite five months shy of his 87th birthday. With all due respect to someone old enough to have experienced the Hitler-Stalin pact as a personal problem (his Polish-Jewish family had to emigrate to the Soviet Union), current demographic trends are making longevity seem astounding only when your age runs to three digits.
Such judgments are always relative, of course. What the man is, without a doubt, is freakishly prolific. By the time the University of Leeds made him professor emeritus of sociology in 1990, Bauman had published some 25 books. At least two of them, Legislators and Interpreters: On Modernity, Post-Modernity, Intellectuals (1987) and Modernity and the Holocaust (1989), qualify as masterpieces. (Both were published by Cornell University Press.) Since retiring Bauman has published another 40 books, more or less. At this point, the author himself has probably lost count.
Bauman’s last few books have been assemblages of commentary on a variety of topics. On Education certainly belongs to that cycle. It consists of 20 exchanges with Riccardo Mazzeo, an editor at the Italian publishing house Edizioni Erickson, conducted by e-mail from June to September 2011. Mazzeo poses a question or makes an observation, sometimes on education and sometimes not. Bauman replies at length, and usually at a tangent. Its title notwithstanding, the book is neither focused on education nor, really, all that conversational. What it resembles more than anything else is the set of essays and notes gathered last year under the Magritte-ish title This is Not a Diary (Polity, 2011).
For the past dozen years, Bauman has been writing about what he calls the “liquid modernity” of contemporary industrialized and digitalized societies: the structure-in-flux emerging from a confluence of technological innovation, consumerism, and constantly changing demands on the adaptability of the labor force. His thinking about the unstoppable cultural torrent of liquid modernity resembles a combination of Daniel Bell’s sociological work on The Coming Post-Industrial Society (1973) and The Cultural Contradictions of Capitalism (1976) with Jean-Francois Lyotard’s reflections on The Postmodern Condition (1979), as updated via Thomas Friedman’s globalization punditry in The Lexus and the Olive Tree (1999).
Not that Bauman is a mash-up theorist. I cite these authors only by way of triangulation, rather than as influences. His work is grounded, instead, in the founding concern of sociology in the 19th century: the effort to understand the world taking shape under the impact of industrialization. The pace of change was much quicker than was imaginable in pre-industrial times, and the span of transformations was much wider. The classic period of sociology (the days of Marx, Durkheim, and Weber) analyzed how modernity differed from and disrupted -- but sometimes also absorbed and refashioned -- the institutions and traditions established by earlier ways of life.
Along the way, it came to seem as if the shifts and upheavals of industrial society could be understood and even (this was the imagination of the technocrat and the ideologue kicking in) brought under control. And if you couldn’t engineer change, at least it was reasonable to assume you could plan for it. If the population of a city is likely to grow at a certain rate, for example, it would be feasible to project whether more schools need to be built over the next decade -- and if so, where. The area’s chief industry might boom or bust, in which case your population projections would have led you astray. Even so, the ethos of “solid modernity” was confident enough to regard contingency as a risk, but one with a margin of error you could try to anticipate.
Liquid modernity is more volatile than its predecessor, characterized by changes that don’t so much interact as cascade. Planning for the city’s educational needs would be less confident, the risks more complex and cumulative. Suppose, in the best of all worlds, economic good times come to the city, bringing an influx of population with it. But the very currents that brought them in might well send them back out again, and so the new residents may not feel enough connection to the place to regard property taxes as anything but an infringement of their human rights. Projecting public expenditures, if not impossible, then becomes something of a shot in the dark.
The curriculum was once stable enough for school systems to use the same textbooks for years on end. But no longer. Now it’s necessary to invest in educational hardware and software, in full knowledge of obsolescence as a problem. That could prove a more contentious issue than overcrowding. And so on.
Bauman doesn't use the school-board analogy I've made here, but it seems as good a way as any to show the implications of his thinking about education. The lesson of “the liquid modern world,” he writes, “is that nothing in that world is bound to last, let alone forever. Objects recommended today as useful and indispensable tend to ‘become history’ well before they have had time to settle down and turn into a need or habit…. Everything is born with the brand of imminent death and emerges from the production line with a ‘use-by date’ printed or presumed. The construction of new buildings does not start unless their duration is fixed or it is made easy to terminate them on demand…. A spectre hovers over the denizens of the liquid modern world and all their labours and creations: the spectre of superfluity.”
The liquification, if that’s how to put it, affects not just infrastructure but the very goals of education. Bauman writes that “the unbounded expansion of every and any form of higher education” in recent decades was driven by the value of certification in pursuing “plum jobs, prosperity, and glory,” with the “volume of rewards steadily rising to match the steadily expanding ranks of degree holders.”
A chance at upward mobility will not be a motivation again anytime soon. Mazzeo refers to the growing ranks of what are called, in Britain, NEETs: young people “not in education, employment, or training.” At the same time, liquid modernity eats away at the long-established precondition of education itself: the expectation that, by acquiring certain fixed skills and established forms of knowledge, the student is receiving something of durable value. But durability is not a value in liquid modernity.
Almost everything Bauman says about education will be only too familiar to the sort of reader likely to pick it up in the first place. But his knack for placing things in context and accounting for that uneasy feeling you get from this or that current development makes it stimulating.
Bauman is prone to leaping from trend to totality in a single bound, and he doesn’t always quite make it. “Few if any commitments last long enough to reach the point of no return,” he writes, “and it is only by accident that decisions, all deemed to be binding only ‘for the time being,’ stay in force.” This is an example of Bauman the sage turning into Bauman the scold, of overgeneralization raised to the power of crankiness.
True, the fluids of digital hyper-ephemerality were saturating human relationships even before Mark Zuckerberg came on the scene. But the word “friend” does still have meaning, in some offline contexts anyway. To read that you are able to keep commitments or to follow through on decisions “only by accident” is considerably more insulting than there is any reason to suppose Bauman intends.
After a decade of reading papers and attending panels on the Crisis in Scholarly Publishing (it feels established and official enough now to deserve capital letters) I’m dubious about the prospect of ever writing another column on the topic. It starts to feel like Chevy Chase interrupting with a bulletin that Generalissimo Francisco Franco is, in fact, still dead.
Scholarly publishing isn’t dead, of course -- although at this stage, as with the Generalissimo, a major reversal of fortunes would appear unlikely. Ian Maclean’s Scholarship, Commerce, Religion: The Learned Book in the Age of Confessions, 1560-1630 (Harvard University Press) evokes a publishing world so different from the 21st century’s that visiting it seems like a vacation from today’s too-familiar circumstances.
Maclean, a professor of Renaissance studies at the University of Oxford, identifies the period covered by his study (which started out as a series of lectures at Oxford) as the late Renaissance. Maybe so. Clearly publishers were catering to a much-expanded audience that had acquired a taste for humane letters. A stable of freelance philologists cranked out new editions of ancient works, as well as translations. The public able to parse a page of Attic text was much smaller than the one reading Latin, but there was still a demand for books in Greek -- if only as a kind of erudite furniture, or for use as an implied credential. You imagine someone going to a doctor or lawyer for the first time and spying the volume of Aristotle open on his desk, then thinking, “Wow, this guy must be good.”
But the big money, it sounds like, was in controversy – in pamphlets and collections of documents from the combat between Roman Catholics and Protestants, and between Protestants and one another. Theological argument in the era of Luther, Calvin, and Erasmus sought to bring the reader to the one true faith, but by now the line between polemic and character assassination had been blurred beyond recognition. If, say, a Calvinist scholar went over to the Vatican’s side, it was fair game for ex-colleagues to embarrass the apostate by publishing a volume of letters he’d written mocking Catholicism.
And -- more to the point – such a book would sell. The fracturing of the public along religious lines divided the publishing world into distinct “confessionalized” sectors -- each demanding its own editions of scripture, of course, but also of patristic writings, and of historically significant documents backing up its claims to be the one true faith. Technology rendered the mass-production of books possible, while theology made it urgent.
Learned books in this period “fell into two broad classes,” Maclean explains: “textbooks for schools and universities, on the one hand, and more specialized humanist editions, historical, legal, theological, and medical works on the other.” The publishers themselves didn’t fall into corresponding categories; most did some of each.
Nor was the connection between scholarly publishing and academe all that close – least of all geographically. “While it was recognized that printing shops were a sign of the health of a country’s scholarship as much as the institutions of higher learning,” says Maclean, “this does not seem to have weighed much in their location.” Most presses “were based in cities without universities, and a surprising number of university cities were without printers who could compose in ancient languages.” Proximity to a university counted far less than the availability of raw material and skilled labor, not to mention access to trade routes and a strong patron.
Place of publication was also metadata: it signaled which religious confession a book reflected, depending on which faith the authorities there favored. But the city indicated on a title page might or might not tell you where it was actually printed. Someone in Geneva publishing an anti-Calvinist pamphlet would have good reason to claim it came from Venice.
Besides textbooks, there was another sort of publishing aimed at the student market: editions of notes taken during the lectures for certain courses. Maclean says that a reputable publisher would clear this with the professor. That suggests, by implication, that shadier operations didn’t. (As far as I know, this practice was still going strong through at least the 19th century. We have information about some of Kant’s lectures thanks to publishers serving the needs of undergraduates who couldn’t make it to class.)
The scholarly publisher of the early 16th century was likely to be something of a Renaissance humanist himself, playing a role of servant to “the new learning.” Drawing on publishers’ catalogues, reports of the Frankfurt book fair (where the number of titles more than doubled between 1593 and 1613) and the records of titles found in scholars’ libraries following their deaths, Maclean recreates something of the prevailing routines and difficulties of scholarly publishing in this era.
The correspondence between publishers and authors (and the grumbling of each to third parties about delayed manuscripts or shoddy workmanship) are a reminder of the micropolitics of intellectual reputation in the days when getting work into print was considerably more difficult than it would soon become. But scholarly publishing was not at all a matter of academic credentialing. “No one in the late Renaissance obtained professional validation in a university through publication with distinguished publishers or in reputed publications as is done now,” writes Maclean. “The pressure scholars felt to achieve publication, if it did not arise from their desire to promote themselves and their subject, was rhetorically attributed to their patron, whose prestige they enhanced….”
The nobility of scholarship, then, depended on the scholarship of the nobility. But over time the publishing field was overtaken by “a new breed of entrepreneurs who were not so much involved in the production of knowledge as its marketing.” Every book is, after all, something of a gamble: the investment in publishing it involves risk, and the skills required to identify a valuable work of scholarship are distinct from those of keeping the enterprise solvent. More and more publishers entered the field, publishing more and more material; and for a long time things continued more or less profitably, in spite of the wars and plagues and whatnot.
By the 1590s, a satirist was complaining about the flood of shoddy material: Publishers were more interested in best-sellers than in serious scholarship. Volumes went to market as the revised, expanded, corrected edition of some work, even though the only thing new about it was the title page. Hacks were turning out commentaries on commentaries, and worse, people were buying them, just to add them to their collections.
Things were not, in short, like the good old days. On the other hand, neither were they as stable as they appeared for quite a while. The capacity for mass-producing books developed more rapidly than the market of readers could absorb them (or at least buy them). The bubble started to deflate in various fields in the early 17th century. In 1610 you’d be turning out treatises as fast as they could be typeset, which only meant that by 1620 you had a warehouse full of stuff in neo-Latin that nobody wanted to read.
Not to say that the Crisis in Scholarly Publishing has been going on for 400 years. Things bounced back at some point. Maclean does not say when, or how. But whatever happened after 1630 had to be a mutation, rather than just a market correction: a huge restructuring of institutions and of fields of knowledge, to say nothing of the changes in what and how people read, and why. The expansion of readership preferring work in the vernacular was undoubtedly a factor, but was it sufficient?
Perhaps Maclean will pursue the matter in another book. On the strength of Scholarship, Commerce, Religion, I certainly hope so.
My intention here is to say something about the microwave burrito, considered in its socio-cultural aspect. All in due course. But first, a quick detour into Germany in the 1840s, when Ludwig Feuerbach was undoubtedly the hot philosopher of the hour. A disciple declared that intellectual life had to pass through the “fiery brook” of Feuerbach’s thinking -- a pun on his name, which is German for “fiery brook.”
I’d guess the play on words also involved a mildly sacrilegious joke about baptism. In his best-known work, The Essence of Christianity, Feuerbach made the provocative and career-destroying argument that God was, in effect, humanity writ large. Religion was an alienated expression of our intellectual and emotional capacities, as stunted and deformed by existing social arrangements. We project our highest powers and aspirations into a higher being, then rationalize anything oppressive as the work of divine will. This was more subtle than just an argument about God's existence. It was, in effect, a statement that humanity didn't exist yet -- not fully, at least.
Feuerbach had been a student of Hegel, whose seemingly closed and orderly philosophical system proved such a comfort to the Prussian bureaucracy. You could make a pretty good career out of demonstrating just how tidy that system was, and how it meant that every institution that existed had its purpose. Feuerbach worked out his own ideas by pushing Hegel’s in a more radical direction, thereby effectively thinking himself out of a job. The academic blacklisting, combined with a certain literary flair, made Feuerbach the hero of young intellectuals in Germany, and The Essence of Christianity hit British bookstores in 1855 in a translation by one Marian Evans, later and much better known for the novels she published as George Eliot.
Even so, chances are that Feuerbach would be completely forgotten if not for a few pages jotted down in his notebook by Karl Marx under the title “Theses on Feuerbach.” (Marx had also made that "fiery brook" quip.) The eleventh and final thesis – “The philosophers have hitherto only interpreted the world; the point, however, is to change it” – has been used by generations of young activists to try to get their professors to do more than pontificate about social problems. Feuerbach himself became active for a while, when the wave of revolutions sweeping through Europe in 1848 hit Germany. But he was a reclusive man by temperament and after a while largely withdrew from public life, let alone politics.
The microwave burrito, as you may have surmised, is only incidentally linked to German philosophy. Still, we're getting there. For it was in 1850 that Feuerbach’s former student Jacob Moleschott published a book called The Theory of Nutrition, which he sent to the philosopher in hopes of drumming up some publicity. (It’s always touching, the way friends will volunteer to let you review their books.)
The essay that Feuerbach wrote on Moleschott was strange, but it contained a turn of phrase that outlived both of them -- one that is, in fact, known to everyone. Here is the crucial passage:
"Foodstuffs become blood; blood becomes heart and brain, the stuff of thought and attitudes. Human fare is the basis of human education and attitudes. If you want to improve the people give it, instead of homilies against sin, better food. Man is what he eats."
You are what you eat. The motto that inspired a thousand infomercials was coined in an essay completely forgotten by everyone except Feuerbach scholars, who are not exactly thick on the ground. They have interpreted it in a surprising range of ways. It can be taken as a serious continuation of Feuerbach’s earlier work. Or as a satire, possibly inspired by reading the comedies of Aristophanes. Or as evidence that the poor man was losing it – philosophically, at least, if not mentally. He was depressed about the failure of the revolution, that much is clear. He suggested that it was a consequence of diet: Germans ate too much cabbage and potatoes, which provided insufficient protein for the brain. Progress demanded that they consume more beans.
Forget economic determinism; this is nutritional determinism. “Sustenance,” he writes in another passage, “is the beginning of consciousness. The first condition for bringing something to your head and your heart is bringing something to your stomach.” A fair point, no matter how seriously or jokingly Feuerbach meant it (a bit of both, I imagine). The recent abundance of interdisciplinary scholarship on food suggests that he was something of a prophet. If his work from the 1840s transformed theology into philosophical anthropology, the late phase of Feuerbach’s work might serve to ground the humanities in food studies.
He makes no appearance in Harvey Levenstein’s Fear of Food: A History of Why We Worry about What We Eat (University of Chicago Press), where the endnotes tend to cite things like “Colonic Irrigation and the Theory of Autointoxication” from the Journal of Clinical Gastroenterology. But it raises Feuerbachian questions, even so. If we are what we eat, then what does it mean when we become afraid of something we might have eaten happily the day before?
Levenstein, a professor emeritus at McMaster University in Ontario, writes in straightforward narrative prose about the waves of anxiety about food that have swept across the United States from the 1890s until the present day, from the menace posed by fresh fruit and vegetables (since flies landed on them in open-air markets) to lipophobia (with any consumption of high-fat foods regarded as a form of suicidal behavior). It's a well-researched but also very diverting book, with a large cast of public benefactors and corrupt operators. Not that you can always tell them apart.
In passing, the author mentions a chemically disinfected pulpy meat byproduct called “pink slime,” often incorporated into hamburger, among other comestibles. When Fear of Food arrived in galleys a few months ago, you didn't hear much about pink slime. In March, a scientist who once worked at the Department of Agriculture told an interviewer on network television that up to 70 percent of the ground beef in American supermarkets contained pink slime. Social media did the rest, forcing at least one company to shut down three plants and another to file for bankruptcy protection. A press release from the American Meat Institute has just announced “the addition of a new summit on Lean Finely Textured Beef” at a major food-marketing trade show next month. ("Lean Finely Textured Beef" is the term AMI would prefer everyone use instead of "pink slime." I will venture to guess that is not going to happen.)
The publication of Fear of Food had nothing whatever to do with the public gorge becoming suddenly buoyant. But neither is it purely a matter of synchronicity. “The agricultural revolution allowed humans to grow foods that they knew were safe,” writes Levenstein, “but the market economy that accompanied it brought new worries: unscrupulous middlemen could increase their profits by adulterating food with dangerous substances. The new ways of producing, preserving, and transporting foods that arose in the nineteenth century heightened these fears by widening the gap between those who produced foods and those who consumed them.”
And that gap widened still more in the 20th century as doctors, scientists, corporations, regulatory agencies, and advertisers intervened, along with the occasional huckster or food-fadist. Public alarm over contamination or adulteration can be well-founded. (The recall of 143 million pounds of beef from a particularly vile feedlot a few years ago is a case in point.) But the waves of concern also manifest what, in an earlier book, Levenstein called “the paradox of plenty”: Americans are “a people surrounded by abundance who are unable to enjoy it.” That is something of an overstatement, though the portrait is recognizable. We must like to worry because we’re so good at it.
The incredible range of foodstuffs, and the conflicting health claims about what to eat and what to avoid, create “a kind of gastro-anomie,” writes Levenstein, “a condition in which people have no sense of dietary norms or rules.” That diagnosis seems to fit. The United States is a country where you can buy both soda with no calories and pizza with a crust stuffed with extra cheese. What’s more, you can buy them at the same place, at the same time. That's about as anomic as it gets. A public scare or dietary fad at least imposes a kind of temporary norm, thereby keeping the chaos at bay. And so, Levenstein implies, we’ll keep having them.
Which brings us, at long last, to the microwave burrito. If someone had to come up with a foodstuff to epitomize “gastro-anomie,” I'm pretty sure this one would do the trick. Certainly Levenstein’s point about the distance and disconnection between producer and consumer would apply. It seems entirely possible that the burrito remains untouched by human hands throughout the long journey from its creation to your grocer's freezer.
The packaging insists that it is healthy – the one I am looking at does, anyway. For one thing, it has little or no cholesterol. Feuerbach recommended eating beans, as you may recall, so that's covered. He also quoted Moleschott as saying that there was no human thought without phosphorus. The nutritional information does not say just how much of the minimum daily requirement of phosphorus is met. But it has lots of protein, despite being meatless. The primary selling point, of course, is that it's convenient. I often have one for lunch or dinner while writing this column, for precisely that reason. You throw it on a plate, nuke it, pay almost no attention while eating, then forget it.
“As is the food, so is the being," interrupts Feuerbach at this point. "As is the being, so is the food. Everyone eats only what is in accord with his individuality or nature, his age, his sex, his social position and profession, his worth. Every class is what it eats according to its essential uniqueness and vice versa.”
I find the remark troubling. Who wants to think of a burrito in the microwave as the deepest foundation, and fullest expression, of his innermost being? Plus, it tastes better salted. "As of this writing," Levenstein notes, "we are told that salt, historically regarded as absolutely essential to human existence, is swinging the grim reaper's scythe." A convenient meal is hurtling me towards nothingness! Then again, it contains no Lean Finely Textured Beef, which is a comfort.