My intention here is to say something about the microwave burrito, considered in its socio-cultural aspect. All in due course. But first, a quick detour into Germany in the 1840s, when Ludwig Feuerbach was undoubtedly the hot philosopher of the hour. A disciple declared that intellectual life had to pass through the “fiery brook” of Feuerbach’s thinking -- a pun on his name, which literally means “fiery brook” in German.
I’d guess the play on words also involved a mildly sacrilegious joke about baptism. In his best-known work, The Essence of Christianity, Feuerbach made the provocative and career-destroying argument that God was, in effect, humanity writ large. Religion was an alienated expression of our intellectual and emotional capacities, as stunted and deformed by existing social arrangements. We project our highest powers and aspirations into a higher being, then rationalize anything oppressive as the work of divine will. This was more subtle than just an argument about God's existence. It was, in effect, a statement that humanity didn't exist yet -- not fully, at least.
Feuerbach had been a student of Hegel, whose seemingly closed and orderly philosophical system proved such a comfort to the Prussian bureaucracy. You could make a pretty good career out of demonstrating just how tidy that system was, and how it meant that every institution that existed had its purpose. Feuerbach worked out his own ideas by pushing Hegel’s in a more radical direction, thereby effectively thinking himself out of a job. The academic blacklisting, combined with a certain literary flair, made Feuerbach the hero of young intellectuals in Germany, and The Essence of Christianity hit British bookstores in 1854 in a translation by one Marian Evans, later and much better known for the novels she published as George Eliot.
Even so, chances are that Feuerbach would be completely forgotten if not for a few pages jotted down in his notebook by Karl Marx under the title “Theses on Feuerbach.” (Marx had also made that "fiery brook" quip.) The eleventh and final thesis – “The philosophers have hitherto only interpreted the world; the point, however, is to change it” – has been used by generations of young activists to try to get their professors to do more than pontificate about social problems. Feuerbach himself became active for a while, when the wave of revolutions sweeping through Europe in 1848 hit Germany. But he was a reclusive man by temperament, and after a while he largely withdrew from politics, and indeed from public life altogether.
The microwave burrito, as you may have surmised, is only incidentally linked to German philosophy. Still, we're getting there. For it was in 1850 that Feuerbach’s former student Jacob Moleschott published a book called The Theory of Nutrition, which he sent to the philosopher in hopes of drumming up some publicity. (It’s always touching, the way friends will volunteer to let you review their books.)
The essay that Feuerbach wrote on Moleschott was strange, but it contained a turn of phrase that outlived both of them -- one that is, in fact, known to everyone. Here is the crucial passage:
"Foodstuffs become blood; blood becomes heart and brain, the stuff of thought and attitudes. Human fare is the basis of human education and attitudes. If you want to improve the people give it, instead of homilies against sin, better food. Man is what he eats."
You are what you eat. The motto that inspired a thousand infomercials was coined in an essay completely forgotten by everyone except Feuerbach scholars, who are not exactly thick on the ground. They have interpreted it in a surprising range of ways. It can be taken as a serious continuation of Feuerbach’s earlier work. Or as a satire, possibly inspired by reading the comedies of Aristophanes. Or as evidence that the poor man was losing it – philosophically, at least, if not mentally. He was depressed about the failure of the revolution, that much is clear. He suggested that it was a consequence of diet: Germans ate too much cabbage and potatoes, which provided insufficient protein for the brain. Progress demanded that they consume more beans.
Forget economic determinism; this is nutritional determinism. “Sustenance,” he writes in another passage, “is the beginning of consciousness. The first condition for bringing something to your head and your heart is bringing something to your stomach.” A fair point, no matter how seriously or jokingly Feuerbach meant it (a bit of both, I imagine). The recent abundance of interdisciplinary scholarship on food suggests that he was something of a prophet. If his work from the 1840s transformed theology into philosophical anthropology, the late phase of Feuerbach’s work might serve to ground the humanities in food studies.
He makes no appearance in Harvey Levenstein’s Fear of Food: A History of Why We Worry about What We Eat (University of Chicago Press), where the endnotes tend to cite things like “Colonic Irrigation and the Theory of Autointoxication” from the Journal of Clinical Gastroenterology. But it raises Feuerbachian questions, even so. If we are what we eat, then what does it mean when we become afraid of something we might have eaten happily the day before?
Levenstein, a professor emeritus at McMaster University in Ontario, writes in straightforward narrative prose about the waves of anxiety about food that have swept across the United States from the 1890s until the present day, from the menace posed by fresh fruit and vegetables (since flies landed on them in open-air markets) to lipophobia (with any consumption of high-fat foods regarded as a form of suicidal behavior). It's a well-researched but also very diverting book, with a large cast of public benefactors and corrupt operators. Not that you can always tell them apart.
In passing, the author mentions a chemically disinfected pulpy meat byproduct called “pink slime,” often incorporated into hamburger, among other comestibles. When Fear of Food arrived in galleys a few months ago, you didn't hear much about pink slime. In March, a scientist who once worked at the Department of Agriculture told an interviewer on network television that up to 70 percent of the ground beef in American supermarkets contained pink slime. Social media did the rest, forcing at least one company to shut down three plants and another to file for bankruptcy protection. A press release from the American Meat Institute has just announced “the addition of a new summit on Lean Finely Textured Beef” at a major food-marketing trade show next month. ("Lean Finely Textured Beef" is the term AMI would prefer everyone use instead of "pink slime." I will venture to guess that is not going to happen.)
The publication of Fear of Food had nothing whatever to do with the public gorge becoming suddenly buoyant. But neither is it purely a matter of synchronicity. “The agricultural revolution allowed humans to grow foods that they knew were safe,” writes Levenstein, “but the market economy that accompanied it brought new worries: unscrupulous middlemen could increase their profits by adulterating food with dangerous substances. The new ways of producing, preserving, and transporting foods that arose in the nineteenth century heightened these fears by widening the gap between those who produced foods and those who consumed them.”
And that gap widened still more in the 20th century as doctors, scientists, corporations, regulatory agencies, and advertisers intervened, along with the occasional huckster or food faddist. Public alarm over contamination or adulteration can be well-founded. (The recall of 143 million pounds of beef from a particularly vile feedlot a few years ago is a case in point.) But the waves of concern also manifest what, in an earlier book, Levenstein called “the paradox of plenty”: Americans are “a people surrounded by abundance who are unable to enjoy it.” That is something of an overstatement, though the portrait is recognizable. We must like to worry because we’re so good at it.
The incredible range of foodstuffs, and the conflicting health claims about what to eat and what to avoid, create “a kind of gastro-anomie,” writes Levenstein, “a condition in which people have no sense of dietary norms or rules.” That diagnosis seems to fit. The United States is a country where you can buy both soda with no calories and pizza with a crust stuffed with extra cheese. What’s more, you can buy them at the same place, at the same time. That's about as anomic as it gets. A public scare or dietary fad at least imposes a kind of temporary norm, thereby keeping the chaos at bay. And so, Levenstein implies, we’ll keep having them.
Which brings us, at long last, to the microwave burrito. If someone had to come up with a foodstuff to epitomize “gastro-anomie,” I'm pretty sure this one would do the trick. Certainly Levenstein’s point about the distance and disconnection between producer and consumer would apply. It seems entirely possible that the burrito remains untouched by human hands throughout the long journey from its creation to your grocer's freezer.
The packaging insists that it is healthy – the one I am looking at does, anyway. For one thing, it has little or no cholesterol. Feuerbach recommended eating beans, as you may recall, so that's covered. He also quoted Moleschott as saying that there was no human thought without phosphorus. The nutritional information does not say just how much of the minimum daily requirement of phosphorus is met. But it has lots of protein, despite being meatless. The primary selling point, of course, is that it's convenient. I often have one for lunch or dinner while writing this column, for precisely that reason. You throw it on a plate, nuke it, pay almost no attention while eating, then forget it.
“As is the food, so is the being," interrupts Feuerbach at this point. "As is the being, so is the food. Everyone eats only what is in accord with his individuality or nature, his age, his sex, his social position and profession, his worth. Every class is what it eats according to its essential uniqueness and vice versa.”
I find the remark troubling. Who wants to think of a burrito in the microwave as the deepest foundation, and fullest expression, of his innermost being? Plus, it tastes better salted. "As of this writing," Levenstein notes, "we are told that salt, historically regarded as absolutely essential to human existence, is swinging the grim reaper's scythe." A convenient meal is hurtling me towards nothingness! Then again, it contains no Lean Finely Textured Beef, which is a comfort.
An old rule of etiquette -- still endorsed by Miss Manners, at last report -- says not to talk about politics or religion while in mixed company, or among strangers. Civility demands keeping the passions in check, and nothing inflames them like those two topics. By extension, one should also avoid discussing Thomas Kinkade, who died over the weekend. His paintings of lighthouses, cozy cottages, and nostalgia-tinged city streets inspire adoration or disgust, but very little in between.
Kinkade was the single best-known artist working in the United States over the past two decades, and almost certainly the best-paid. At the peak of his career in the late 1990s and early ‘00s, he was earning more than $7 million per year. Besides paintings and prints, the Kinkade brand (he used the term himself) includes towels, mugs, clocks, calendars, and La-Z-Boy recliners. His claim that one American home in 20 contains some Kinkadean product or other seems inflated, though not altogether impossible.
Even stating these seemingly inoffensive facts will offend some readers -- either for calling Kinkade an artist (which makes people in the art world unhappy) or for failing to say that he dedicated his life to the Lord, not the dollar. I am in no position to judge that claim, but clearly it will be necessary to watch my step from this point on. Expressing a personal opinion of Kinkade in this column is of little interest to me (suffice it to say I’m more of a Gerhard Richter man), but the intensity of response to his work certainly is.
In a culture supersaturated with imagery, we tune much of it out just to get by. Kinkade’s images are exceptional. They elicit not just a verbal but a somatic response: a heartwarming feeling or visceral loathing. Why? How?
There’s no accounting for taste, as another old saw runs. But for a number of contributors to Thomas Kinkade: The Artist in the Mall -- edited by Alexis L. Boylan and published last year by Duke University Press -- accounting for the late artist’s appeal is not difficult at all. The Painter of Light (he trademarked the phrase) was, in the title of Micki McElya’s essay, “Painter of the Right.” The world Kinkade portrays is, if not prelapsarian, at least pre-1960s: “unmarked by the civil rights movement, feminism, gay liberation, or the Vietnam War,” writes McElya, “suggesting instead the mythical, simpler youths and ‘Good War’ of the ‘Greatest Generation.’ ”
Seth Ferman makes an overlapping argument in “God Is in the Retails: Thomas Kinkade and Market Piety.” The paintings and the incredible array of products reproducing them express the desire for a world untouched by corrosive modernity -- but that’s just the half of it. They also serve a kind of sacramental purpose: communion via commodity.
“Kinkade fuses elements of Christian orthodoxy and capitalist ideology into a single faith,” Ferman writes, “what I call market piety, a veritable theology that believes free-market consumerism to be numinous…. Through Kinkade the consumption of art becomes a religiously meaningful way to transcend the difficulties of modern life (which ironically includes consumerism), making his hybrid market piety into an inconspicuous yet pervasive cultural identity for many of his collectors.”
His bucolic landscapes, then, are so many battlefields: the sites of culture-war skirmishing between “red” and “blue” sensibilities, fought out in an especially fierce way. A painting called "Hometown Memories I: Walking to Church on a Rainy Sunday Evening" taunts the presumed cultural elite with its very title, and to reliable effect. In her essay “Purchasing Paradise: Nostalgic Longing and the Painter of Light,” Andrea Wolk Rager writes that "Hometown Memories" “does not make demands of the viewer,” as serious art presumably does. “Instead, it lures you, almost imperceptibly, into a world where memory, placid and pleasant, has been supplied for you. The warm glow, the feeling of comfortably enclosing space, and the sense of welcoming solace complete the process of soporific pacification.”
That description stops just short of using the word “pablum,” which reflects Wolk Rager’s emphasis on the psychoanalytic understanding of nostalgia as a desire to return to the security and bliss of infantile fusion with the mother. The spaces depicted in Kinkade’s work “are often wet and warm, slick with spring rain and soft with diffused light. The images are dominated by curving lines and framing devices that seem to close in around a protected center. One is given the sense of being cushioned and cradled and lulled.”
A womb with a view, then. By this point, any Kinkade enthusiasts still reading will probably consider the book to be an assault, and not just on the painter but on themselves. Interpretation can be an aggressive act. But not all of the essays are interrogations, and I want to recommend one in particular as a counterstatement.
In “Thomas Kinkade’s Heaven on Earth,” the performance artist Jeffrey Vallance writes about curating “the first-ever contemporary art world exhibition of the works of Thomas Kinkade” in 2004, conducted simultaneously at the gallery of California State University at Fullerton and the Grand Central Arts Center at Santa Ana, nearby. If looking at Kinkade’s paintings through Freudo-Marxian goggles seems perverse to his admirers, showing them in a museum setting horrified the art world.
“Some people will never forgive me,” Vallance writes. “They fear his existence. He threatens everything they stand for, and he makes them nauseous.” There were pickets and black armbands. Someone threatened to slash the paintings. It cannot have helped that the exhibit included one artifact each from the extensive line of tie-in products, including the official Kinkade Visa card, “displayed in a vitrine resting on a velvet pillow.”
Sometimes art is provocative, and sometimes a provocation is an art. “Many erroneously thought that I would do the show in an ironic way,” the curator writes. “For me, irony is far too simplistic and expected. To do the show seriously was the challenge. As I often say, ‘The only irony is there is no irony.’ ”
Kinkade aficionados loved the exhibit, while the art critics were overwhelmed. “Many reviewers of the show followed a similar pattern,” Vallance recalls. “Most writers pretty much admitted that they loathed Kinkade and came expecting to hate the show – like gawkers at a train wreck. But then something happened. When they came to see the actual show, the kitsch was laid on so thick that something snapped in their brains. They experienced transcendence and ended up liking the show.”
And like it or not, any painter who can compel other artists to wear black armbands in protest of his work has already called dibs on posterity.
Everyone gets rejected. And it never stops being painful, no matter how successful you are or how long you have been in the business. Some of this is inevitable; not everyone is above average. But some of it isn't. I thought I would offer some dos and don’ts for reviewers out there, to improve the process and spare some hurt feelings when possible. Some are drawn from personal experience; others, more vicariously. I have done some of the "don’ts" myself, but I feel bad about it. Learn from my mistakes.
(Author's note: I'd like people to focus on the ideas in this piece, not the strong language, so I've substituted a new version with all the same points, but a few different words.)
First, and I can’t stress this enough, READ THE PAPER. It is considered impolite by authors to reject a paper by falsely accusing it of doing THE EXACT OPPOSITE of what it does. Granted, some people have less of a way with words than others and are not exactly clear in their argumentation. But if you are illiterate, you owe it to the author to tell the editors when they solicit your review. It is O.K. – there are very successful remedial programs they can recommend. Don’t be ashamed.
Second, and related to the first, remember the stakes for the author. Let us consider a hypothetical scenario. By a safe estimate, an article in a really top journal will probably merit a 2-3 percent raise for the author. Say that is somewhere around $2,000. Given that salaries (except in the University of California System) tend to either stay the same or increase, for an author who has, say, 20 years left in his/her career, getting that article accepted is worth about $40,000. And that is conservative. So you owe it more than a quick scan while you are on the can. It might not be good, but make sure. Do your job or don’t accept the assignment in the first place. (Sorry, I don’t usually like scatological humor, but I think this is literally the case sometimes.)
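To make the back-of-the-envelope arithmetic explicit, here is a minimal sketch; the $2,000 raise and the 20-year horizon are the hypothetical figures above, not data:

```python
# Rough lifetime value of one accepted article, per the hypothetical above.
annual_raise = 2_000   # assumed dollar value of a 2-3 percent raise
years_left = 20        # assumed remaining years in the author's career

# Flat case: the raise simply persists every remaining year.
print(annual_raise * years_left)  # 40000 -- the "conservative" figure

# If later percentage raises compound on the higher base (say 2% a year),
# the total is larger still:
total = sum(annual_raise * 1.02**year for year in range(years_left))
print(round(total))  # about 48595
```

Either way the math runs, the point stands: a careless rejection costs the author real money.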
Third, the author gets to choose what he/she writes about. Not you. He/she is a big boy/girl. Do not reject papers because they should have been on a different topic, in your estimation. Find fault with the paper actually under review to justify your rejection.
Fourth, don’t be petty and whiny. Articles should be rejected based on faulty theory or fatally flawed empirics, not a collection of little cuts. Bitchy grounds include, but are not limited to: not citing you; using methods you do not understand but do not bother to learn; and lack of generalizability when theory and empirics are otherwise sound. The bitchiness of reviews should be inversely related to the audacity and originality of the manuscript. People trying to do big, new things should be given more leeway to make their case than those reinventing the wheel.
Fifth, don’t be a jerk. Keep your sarcasm to yourself. Someone worked very hard on this paper, even if he/she might not be very bright. Writing “What a surprise!”, facetiously, is not a cool move. Rejections are painful enough. You don’t have to pour salt on the wound. Show some respect.
Sixth, remember that to say anything remotely interesting in 12,000 words is ALMOST IMPOSSIBLE. Therefore the reviewer needs to be sympathetic to the possibility that the author could fix certain problems if given more space to do so. Not including a counterargument from your 1986 journal article might not be a fatal oversight; it might have just been an economical decision. If there are other things you would need to see in order to accept an otherwise interesting paper, the proper decision is an R&R, not a reject. Save such complaints for your reviews of full-length book manuscripts, where they are more justifiable.
Seventh, you are not a film critic. Rejections must be accompanied by something with more intellectual merit than "the paper did not grab me" or "I do not consider this to be of sufficient importance to merit publication in a journal of this quality." This must be JUSTIFIED. You should explain your judgment, even if it is something to the effect of, "Micronesia is an extremely small place and its military reforms are not of much consequence to the fate of world politics." Even if it is that obvious, and it never is, you owe an explanation.
Brian C. Rathbun is associate professor in the School of International Relations at the University of Southern California. This essay is adapted from a blog post by Rathbun at The Duck of Minerva.
Recently I received an e-mail that prompted me to think once again about commensuration -- the social process of providing meaning to measurement. The study of commensuration involves analyzing the form and circulation of information, and how counting changes the way people attend to it, as discussed in articles by Wendy Espeland and Mitchell L. Stevens, and by Espeland and Michael Sauder.
The e-mail came from the editor of a special issue of an American journal in my field, concerned my contribution to the issue, and contained a recommendation based on the current metrics governing the worth of ideas: "There is one thing I want to encourage you to consider doing, namely have a look at a couple of preliminary and relevant articles from other contributors to the special issue. If you acknowledge each other’s work it will clearly add to the feeling of having a special issue that is relatively well-integrated, plus add to the impact factor of each other’s work." He had dared to request out loud that we game the system, a practice generally discussed in whispers.
The editor is a particularly ambitious young man, who is bright, works hard, and wants to scale the rungs to the top of academe. There are lots of young academics who fit that description, but other non-tenured full-time faculty to whom I mentioned the e-mail were appalled. "You’re kidding," one said, as a look of disgust took over his face. A young woman to whom I forwarded the quote replied promptly: "That impact factor comment in the letter is a little depressing -- are we academics really that pathetic?"
Perhaps because I am a sociologist, that e-mail got me to thinking about the measurement of value in academe. (I had contemplated the politics of self-promotion previously, when another untenured researcher had asked me to "like" his work on Facebook.) Certainly the practice of measuring human value is not a new thing. Economists have long conflated wages with the measurement of human value, as writers from Karl Marx and Adam Smith to today’s neoliberals have clearly shown. (Smith was for conflation; Marx was against; the neoliberals don’t even know that such conflation can be challenged.)
When I was a kid in the 1950s, someone had calculated the worth of the chemicals in the human body -- $1.78. I remember being surprised that a body was worth so little instead of being shocked that someone had even performed the calculation. Today I’m not taken aback to learn on a website that someone has calculated “the lucrative uses for the roughly 130 pieces of body tissue that are extracted, sterilized, cut up, and put on the market” -- $80,000. As I age, I am becoming harder to shock. After all, there is a cadaver industry. At least three television dramas -- “Law & Order: Special Victims Unit,” “The Closer,” and “The Mentalist” -- have reminded me that evildoers will plot to obtain body parts and will kill to make their way up the list of people awaiting transplants.
I don’t think I am naïve. I have heard discussions of impact factors before, mostly when people evaluate their colleagues’ scholarly contributions to decide whether they deserve promotion or tenure. Usually, the term refers to a metric that supposedly summarizes the worth of a journal, calculated from the number of citations per article that it has received, either in other journals (as given by the ISI Web of Knowledge or Scopus) or in books and journals (Google Scholar). There is some variation in how a journal scores, depending on which company is reporting. Scopus emphasizes science journals; ISI includes humanities and social science journals; Google Scholar adds books. The meaning of the metric is simple: the higher the score, the better the journal. It follows that the higher the impact factors of the journals in which a candidate publishes, the worthier the candidate. Thus, a candidate for tenure whose publications are all in high-impact journals is worthier than a candidate who publishes in lesser journals, all else being equal (though of course, it never is). I once heard a biologist praise a candidate for tenure because he had published in a journal with an impact score of 4.5, which is quite good in most branches of biology and off the charts in the social sciences.
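The citations-per-article idea reduces to very simple arithmetic. Here is a minimal sketch under one common convention (the standard two-year version counts citations in a given year to items published in the previous two years, divided by the citable items from those years); the numbers are hypothetical:

```python
def impact_factor(citations_to_recent_items: int, citable_items: int) -> float:
    """Citations received this year by articles from the previous two years,
    divided by the number of citable articles published in those years."""
    return citations_to_recent_items / citable_items

# A journal whose last two years' 100 articles drew 450 citations this year
# scores the 4.5 that the biologist found so impressive.
print(impact_factor(450, 100))  # 4.5
```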
Impact scores affect subfields. Just as the top molecular biology journals have higher scores than the top environmental biology journals, so too, within any one discipline, some specialties score higher than others. The more people work in a subfield or a specialty within that subfield, the higher the potential impact factor. Last year, when Gender & Society, the journal of Sociologists for Women in Society, had the fourth-highest score of the 132 journals in sociology, the organization’s listserv celebrated. Several people looked forward to telling the news to colleagues who had pooh-poohed the study of gender, so that those colleagues could eat their words.
Impact scores also affect whole universities. Several years ago, top administrators at the University of Chile advised some professors to help improve the institution’s international ranking by publishing in “ISI journals.” (This is also an instruction to publish in English, since the Web of Knowledge is more likely to include English-language journals in its calculations than journals in other languages.) Already one of the top ten institutions of higher education in Latin America, this public university is locked in competition with the private Pontifical Catholic University of Chile.
And, of course, impact scores affect the journals being rated. Supposedly, given the choice of two journals that might accept her work, the canny professor will submit to the journal with the higher impact score. The more articles submitted, the more rejected, the better the articles published – or so the theory goes. Editors keep track of their journal’s score and publishers list the scores on their websites. Last year, like other members of one editorial board, I received a joyous e-mail announcing that journal’s impact factor and celebrating its relative achievement. By its fourth year, it had risen to the middle of the pack in its subfield.
To me, an agreement to cite one another’s work accepts the proposition that citations indicate the quality of an individual’s research. That theory receives concrete validation every time the members of a promotion and tenure committee check how many citations a candidate has received. I’ve seen cases where committee members were so wedded to the measure that they could not hear that a candidate had received few citations because he was in an emerging field, and could not accept that members of such fields don’t score well on impact measures. When enough people can attend a convention to discuss a supposedly nascent idea, Marshall McLuhan once said, that idea is no longer innovative. McLuhan might well have been discussing the circulation and impact factor of journals.
I find it worrisome that all of these uses of impact factors may shape a field. I've heard tell that after preparing a self-evaluation for a quinquennial review of his department, one social science chair urged his colleagues to publish articles rather than books. Articles garner citations more quickly. If everyone published articles, he thought, the department would collect citations more quickly and so would zoom up the national rankings of the quality of departments in its field. The chair forgot to mention that in his discipline, journals tended to publish one kind of research and books another. Perhaps he didn’t realize that he was essentially telling his colleagues what sort of scholarship they ought to do.
Unhappily, as I think about all of this measurement, I am forced to examine my own practices. It's just too easy to audit oneself and to confuse the resulting number with some form of self-worth. When Google Scholar announced that intellectuals could have access to their citation counts, as well as their scores on the h and i10 indices, I first Googled the indices. (I found that an h-index is "the largest number h such that h publications have at least h citations," and an i10-index is "the number of publications with at least 10 citations.") Google Scholar was also good enough to tell me the scores of a newly promoted professor and of a potential Nobel laureate. Then I looked myself up. After several weeks I realized that by auditing my citation count and indices much as I might check my weight, I had commodified myself -- my worth to both my department and my university -- every bit as much as the cadaver industry has calculated the worth of my body parts.
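For concreteness, those two definitions can be rendered as a minimal sketch in code; the citation counts below are made up:

```python
def h_index(citations: list[int]) -> int:
    # The largest h such that h publications have at least h citations.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

def i10_index(citations: list[int]) -> int:
    # The number of publications with at least 10 citations.
    return sum(1 for count in citations if count >= 10)

papers = [48, 33, 30, 12, 9, 7, 4, 1, 0]  # hypothetical citation counts
print(h_index(papers))    # 6: six papers have at least 6 citations each
print(i10_index(papers))  # 4: four papers have 10 or more citations
```

Two tidy little numbers, in other words, standing in for a career -- which is precisely what makes checking them so seductive.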
I like to tell myself that checking my citation count is only a symbolic exercise in commensuration. After all, no one knows the exact financial worth of each citation of each scholar working at each research university, let alone for scholars in my discipline and subfields. In contrast, the cadaver industry is dealing in concrete dollars and cents. I find the discrepancy between these calculations comforting. I advise myself: since it is only symbolic, my self-audit does not yet qualify as commodification. As Marx might have put it, I have not yet paid so much attention to my product (published research) that I have confused the value of the product with the dignity of the maker. I care about that; I’m discussing my dignity.
But then I think again. My self-audit of my own citation count expresses obeisance to the accountability regime that increasingly governs higher education. (An accountability regime is a politics of surveillance, control and market management that disguises itself as value-neutral and scientific administration.) Sure, the young scholar who had sent me that e-mail advocating mutual citation felt he was advancing his career and protecting himself from failure. But I, too, have been speeding the transformation of higher education from an institution that stresses ideas to one that emphasizes measurement and marketability. I am ashamed to say that in this job market, I would feel hard-pressed to tell the young man to ignore his citations and just do his work.
Gaye Tuchman, professor emerita of sociology at the University of Connecticut, is author of Wannabe U: Inside the Corporate University and Making News: A Study in the Construction of Reality.