Essay on how academics can use small chunks of time


Nate Kreuter seeks more productivity by making better use of those groups of minutes between major tasks.


Essay on the way higher education reformers misunderstand the role of professors

For a rising generation of administrators in higher education, the heart of education is innovative technology -- and faculty get in the way.

In a recent speech, the new president of Carnegie Mellon University, Subra Suresh, intimated his administrative philosophy, remarking that “the French politician Georges Clemenceau once said that ‘War is too important to be left to the generals.’ Some would argue learning is too important to be left to professors and teachers.”

The speech opened the inaugural meeting of the Global Learning Council (GLC), held at Carnegie Mellon in September. The GLC brings together a group of high-level university administrators, government officials, and corporate executives who aspire to be an at-large advisory group, akin to the National Research Council, for higher education.

Suresh could have used the help of an English professor to unpack the analogy. In it, presidents and provosts would be the generals, not faculty, who are the soldiers in the trenches, so the fitting parallel would actually be “education is too important to be left to administrators.”

On that count, I agree.

Suresh’s phrasing was not a slip but a frank statement — for him, faculty have little place in decision-making. And I think that it captures the leaning of many current initiatives touting innovation and technology.

The classic definition of the university is that it represents the corporate body of the faculty. Like the protagonist of Flannery O’Connor’s Wise Blood, who wants to establish the Church of Christ without Christ, the New Leaders of higher education want to establish education without educators. Or more precisely, they want to call the shots and have faculty do what they're told, like proper employees. To wit, there were few regular faculty members in attendance at the conference (even if some of the administrators had started as or occasionally did guest spots as professors, it’s probably been a while since they devoted much of their work time to that realm), and there was certainly no social or cultural critic of higher education scheduled to speak. Rather than engaging much criticism or debate — which, after all, is a mission of the university, testing ideas — the event had the character more of an infomercial.

The focus of the conference was to install technology in higher education as fast as possible, and the speakers included high-level figures from Google, Kaplan, edX, and various other companies with a financial interest in the changeover.

The only speaker who raised doubts about technology was a military person, Frank C. DiGiovanni, director of force readiness and training in the U.S. Office of the Undersecretary of Defense. In his talk he said that he found that, to be effective, education needs to “stimulate the five senses,” which does not happen with devices. In fact, he noted that there was a “loss of humanity” with them. He added in subsequent discussion: “I worry about technology taking over. The center of gravity is the human mind.”

It seemed a little ironic to me that the only person reminding us of a humanistic perspective was the military man, though it was clear that DiGiovanni had a good deal of experience with how people actually learned and that he cared about it.

The innovation mantra has been most prominently expressed by the business guru Clayton Christensen, who coined the phrase “disruptive innovation.” It has been the credo especially of tech companies, which come out with ever-new products each year. The theory is that businesses like the American steel industry failed because they were set in their ways, continuing to do what had been successful before. The lesson is that companies, even successful ones, should disrupt what they’re doing. Hence, while Apple was making very good laptops, it went to the iPhone. Then to the iPad. Then to the Apple Watch.

Christensen has extended his theory to academe, in articles and his 2011 book, The Innovative University: Changing the DNA of Higher Education from the Inside Out (co-written with Henry Eyring). He basically sees higher education as set in its ways (hence the DNA metaphor) and ripe for a takeover by technology, and he holds up universities such as BYU-Idaho and the for-profit DeVry University as models for the future. He admits that Harvard University is still top of the line, but not everyone can go to Harvard, so, in cheery rhetoric (some of which is taken from the promotional literature of the colleges themselves), he sees these other schools doing what Walmart did to retail.

Christensen’s theory of innovation has been rebutted by Jill Lepore in a recent piece in The New Yorker, “The Disruption Machine.” She points out that most companies succeed because of sustaining innovation, not disruptive innovation. Apple, after all, still makes laptops, and US Steel is still the largest steel company in the US. In addition, she goes on to demonstrate that a good deal of Christensen’s evidence is thin, not to mention that many of his examples of success have gone belly-up.

Besides holes in the general theory, it’s also questionable whether the kind of innovation that applies to technological or commodity products is readily translatable to education. Cognitivists have shown that education largely works affectively, through empathy, which requires live people in front of you. One learns by imaginatively inhabiting another’s point of view.

Moreover, most institutions of higher education have a different role than businesses — more like churches, which in fact is the analogy that helped establish their independent legal status in the 1819 Dartmouth decision of the U.S. Supreme Court. Something other than consuming goes on at universities, which gets lost in the commercial model of higher ed.

Think of it this way: while I like to shop at Macy’s and hope it stays in business, I would not donate any money to it, whereas I have donated to universities and churches. Of course universities should use best business practices, but if they act primarily as a business, with a saleable product and students positioned as customers, then they abnegate this other role. This is an inherent contradiction that vexes the push to commercialize higher education.

This is not to say that there is no use for technology. The Open Learning Initiative, a project studying statistics pedagogy at Carnegie Mellon, shows that some online segments work better than large lecture sessions. But, if you read their reports, it’s clear that the experiment essentially offers a flipped classroom, and in fact students probably gain more faculty contact than in the lecture model. It’s more like a return to a tutorial model. Who knew students do better with professors?

What the rush for innovation is really about, as Christopher Newfield, a leading critic of higher education, has pointed out, is not a better theory of change but a theory of governance. As Newfield puts it, “it isn’t about what people actually do to innovate better, faster, and cheaper, but about what executives must do to control innovative institutions.” It’s all about top-down plans of action, with the executive issuing a plan to disrupt what you’re doing and subordinates carrying it out. Hence Suresh’s brushing aside of those pesky faculty, who traditionally decide how education should be conducted. That might be O.K. for a corporation, but it violates any standard idea of shared governance and academic freedom, which holds that faculty decide the direction of education.

It’s also about politics. The vision of higher education that the New Leaders of higher education would like to install is not a traditional horizontal institution, in which faculty are generally of equal power. (For instance, I’m a professor at Carnegie Mellon like Suresh, so technically I have the same faculty rights and determine the content of my courses and research, not him — and fortunately I have tenure, so he can’t fire me for writing this, which he could if it were a regular corporation.) Rather, it has become an oligarchical institution, reliant on business deals and donations. Business corporations, after all, are not democracies but oligarchies, with decisions running from the owners and executives downhill.

The oligarchical leaning of the New Leadership became clear to me in a talk by Luis von Ahn, a young computer scientist at Carnegie Mellon and MacArthur Award winner. Von Ahn was animated and funny, bringing fresh energy to the proceedings. He evidently had made a killing in developing CAPTCHAs, those difficult-to-decipher wavy letters used online to verify that you’re a human and not a bot (in his PowerPoint he showed a picture of a man lying in a bed of money, which drew a lot of laughs).

Since then, he has developed and is CEO of Duolingo, a nonprofit designed to bring language training to people for free (or, more precisely, in exchange for their labor). It’s all online, and it’s self-funding: Duolingo sells crowdsourced translations produced by students to CNN and other businesses in need of them, and the money keeps the company going.

Von Ahn laid out several tenets of education, the first of which was that “the best education money can buy should be free.” I was with him on that, but I was not so sure about the rest.

One was that the best education should “be in your pocket, not in some building.” Again, if education relies on social contact and empathy, then we need a place for it other than the shallow contact of a screen. Think of it from the bottom up: children learn from the synesthesia of sociality, and those who are regularly read to by parents learn to read the earliest. What would a child be like if you locked him or her in a room with a device?

Moreover, while a program like Duolingo might be good for picking up a reading knowledge of a foreign language, I wonder about its transposition to speaking. While von Ahn attests to good testing results online, languages, after all, are not formulae but social. Anyone who has learned a foreign language knows that it’s a much different experience when you’re there, in front of live people.

Still, Duolingo seems like a good thing and an exemplary use of online education. However, von Ahn had another tenet: that learning should be through a corporation, not through a government. He said that you cannot trust governments (most “suck” and “other people’s funding usually comes with other people’s ideas and influences”), a view he drew from personal experience as an immigrant from Guatemala. That might be understandable in his individual case, but it is deeply troubling to anyone who has a Jeffersonian sense of higher education and believes that it should be a public right and should cultivate citizens.

It boggles the mind to think that corporations would be better. What are the guarantees that they would be more free from “other people’s ideas and influences,” particularly of just a few people?

Perhaps if von Ahn is running them. (And still, he sold his previous project to Google, and one might question Google’s proprietorial policies, which we have little recourse to alter.) Governments presumably are based on the will of the people, whereas corporations are based on the will of their owners, boards, and executives, oriented toward gaining the most advantage for themselves. A poor government might fail to represent the will of its people, but the problem then is the lack of democracy. By definition, corporations represent a small, self-interested group.

While von Ahn seems like an admirable person and has put some of his money into good causes, his statement was the credo of plutocracy: the rich and powerful should rule, and their good effects might trickle down. But I don’t trust corporations as much as he does, particularly since they have brought us our current world of severe inequality.

American higher education was conceived as a remedy to inequality in the period after World War II, with policy documents like the 1947 Truman Commission Report setting out a plan to fight inequality “in so fundamental a right as education,” spurring state and federal funding to expand high-quality public colleges and universities and allow a greater number of citizens to attend them for minimal tuition.

The new technology reinstalls inequality, with the wealthy (and a few high-scoring poor) receiving bespoke higher education at elite schools, but most of the rest getting theirs on a screen — with great graphics! like a game!

Jeffrey J. Williams is professor of English and of literary and cultural studies at Carnegie Mellon University. His most recent book is How to Be an Intellectual: Essays on Criticism, Culture, and the University (Fordham University Press).


Essay argues that young academics should write book reviews

Casey Brienza says that promoting scholarship and the common good of academe is a value that deserves support -- and that this work can also help individual careers.


Essay on the problems with academic specialization

In what sense does branching out from your original field invite punishment? Does the academy really want intellectual curiosity?

I am a historian, and I have published in Asian, Pacific, urban and American history. I don’t really consider myself an Asianist of the hardcore variety (my Mandarin is rusty and my Malay limited), and for all that World History is touted, hiring in that area is often more the old-style “Empire” (“Britain and the World,” “France and the World,” “Iberian Empires” or sometimes “America in the World,” which as far as I can tell is the new way of saying diplomatic history).

But unfortunately the academic world still has a need to pigeonhole us. A department will be hiring someone to teach, for instance, colonial North America or modern Germany. So obviously they want someone with training in that area. (Never mind the fact that many of us, once in a job, will end up teaching things that are a long way from our specialization.)

Back in 2007, while I was still at the University of Cambridge, Simon Schama published a book about the transatlantic slave trade. At a conference, one of the speakers held up the book, slapped it, and said, “How could he write this? He’s an expert on 17th-century Holland!” I thought my Ph.D. was a license to go anywhere in history. Hearing that comment, I wondered whether I had made a huge mistake.

My Ph.D. topic was something I stumbled into, more a compromise based on source availability and timing than a deliberate choice. I am proud of the project (and the book it became), but it’s not an area I wish to pursue further. So I work on different things. Fortunately, I’m in a department now where they don’t seem to mind what I research, as long as I’m publishing. But to grant agencies, I think I look a bit flaky.

And certainly to people like that conference speaker, I present an odd figure. I assumed that my training in history (in Susan Stryker’s words, a “black belt in looking shit up”) meant I could turn those skills to any period of history (language issues notwithstanding). I never realized I would be shackled to my Ph.D. topic for the rest of my life (perhaps because the historians I most admired, like Schama, are those who had displayed broad intellectual curiosity and turned their focus on widely divergent regions and periods).

In terms of history outside of the academy, the general public wants broad declarative histories. Books on the theme of “The X that changed the world” are common (even histories of apparently small things have to be on the grand stage). Meanwhile in academe our focus remains narrow. There was once a time when academic historians wrote broad narratives for dissertations. Then we turned to ever smaller elements of history, to be examined to a microscopic level. David Armitage and Jo Guldi have suggested we may be returning to the longue duree in academic works, but it may be slow in coming.

I still believe that the training of a doctoral program should allow us to use those skills anywhere, allowing for the time required to get up to speed on the scholarship in a new field. After all, if I could do that in three years as a fresh graduate student, I should be able to do it again now (and probably quicker since I’ve done it before). It disturbs me that there are people who believe our ability to learn and grow as scholars should end the second we are handed our Ph.D.s (with our future publications just being further iterations of the same subject as our dissertation).

With the growing need for Ph.D.s to consider careers outside the academy, a broader perspective is useful -- nonprofits, think tanks and museums want broad skills and flexibility, not narrow interests. This means also having open-minded professors -- open to careers outside academe, and open to different fields.


Katrina Gulliver is a lecturer in history at the University of New South Wales. You can find her most of the time on Twitter @katrinagulliver.


Essay on "The Americans," and Margaret Peacock, "Innocent Weapons: The Soviet and American Politics of Childhood in the Cold War"

It was too prolonged for there to be any specific date, or dates, to mark it. But perhaps this is as good a time as any to mark the 25th anniversary of a process that started with the fall of the Berlin Wall in early November 1989 and reached a kind of peak with the events in Romania late that December.

The scale and pace of change were hard to process then, and difficult to remember now. Ceausescu had barely recovered from the shock of being heckled before he and his wife faced a firing squad. It was not how anyone expected the Cold War to end; insofar as we ever imagined it could end, the images that came to mind involved mutually assured destruction and nuclear winter.

A few years ago, Daniel T. Rodgers characterized the intellectual history of the final decades of the 20th century as an “age of fracture” – an era in which the grand narratives and overarching conceptual schemata were constantly displaced by “piecemeal, context-driven, occasional, and… instrumental” ideas and perspectives in the humanities, social sciences, and public life. Fair enough; just try finding a vintage, unshattered paradigm these days. But a system of bipolar geopolitical hostilities prevailed throughout most of that period, and the contradictory structure of conflict-and-stasis seemed very durable, if not permanent.

Until, suddenly, it wasn’t. One smart and well-executed treatment of the world that came to an end a quarter-century ago is a recent television series called "The Americans," set in the early 1980s. The first season is now available in DVD and streaming video formats, and the second will be available in two weeks, just in time for binge-viewing over the holidays.

"The Americans" is a Cold War spy drama as framed by the “secret life amidst suburban normality” subgenre, the basic tropes of which were inaugurated by "The Sopranos." In it, the Jenningses, a married couple, run a travel agency in Washington, where they live with their two early-adolescent kids. But they are actually KGB agents who entered the United States some 20 years earlier. They have operated from behind cover identities for so long that they blend right in, which makes them very effective in their covert work. While gathering information on the Strategic Defense Initiative, for example, they even get access to the Advanced Research Projects Agency Network -- aka ARPANET -- which allows communication between computers, or something.

The comparison shouldn’t be pushed too hard, but the paradox of the deep-cover agent is right out of John le Carré: A divided identity makes for divided loyalties. At the very least it puts considerable strain on whatever commitment the couple started out with, back in the late Khrushchev era. We get occasional flashbacks to their life as young Soviet citizens. With the onset of “Cold War II,” the motherland is imperiled once again (not only by the American arms buildup but also by the reflexes of KGB leadership at “the Center”) and the Jenningses have decidedly mixed feelings about raising kids under rampant consumerism, even if they’ve grown accustomed to it themselves.

The moral ambiguities and mixed motives build up nicely. Life as a couple, or in a family, proves to be more than a layer of the agents’ disguise: love is another demand on their already precarious balance of loyalties. Yet the real menace of thermonuclear showdown is always there, underneath it all. Some viewers will know that things came very close to the point of no return at least once during this period, during NATO’s “Able Archer” exercise in November 1983. Whatever sympathy the audience may develop toward the Jenningses (played with real chemistry by Keri Russell and Matthew Rhys) is regularly tested as they perform their KGB assignments with perfect ruthlessness. They are soldiers behind enemy lines, after all, and war always has innocent casualties.

The conflict has gone on so long, and with no end in sight, that the characters on screen don’t even feel the need to justify their actions. The spycraft that the show portrays is historically accurate, and it gets the anxious ground-tone of the period right, or as I remember it anyway. But very seldom does "The Americans" hint at the impending collapse of almost every motive driving its core story -- something the viewer cannot not know. (Pardon the double negative. But it seems to fit, given the slightly askew way it keeps the audience from taking for granted either the Cold War or the fact that it ended.)

The focus on the family in "The Americans" takes on added meaning in the light of Margaret Peacock’s Innocent Weapons: The Soviet and American Politics of Childhood in the Cold War, recently published by the University of North Carolina Press. The scriptwriters really ought to spend some time with the book. At the very least, it would be a gold mine of nuances and points of character development. More generally, Innocent Weapons is a reminder of just how much ideological freight can be packed into a few messages rendered familiar through mass media, advertising, and propaganda.

Peacock, an assistant professor of history at the University of Alabama at Tuscaloosa, examines the hopes and fears about youngsters reflected in images from the mid-1940s through the late 1960s. The U.S. and the USSR each experienced a baby boom following World War II. But the outpouring of articles, books, movies, and magazine illustrations focusing on children was not solely a response to the concerns of new parents. It might be more accurate to say the imagery and arguments were a way to point the public’s attention in the right direction, as determined by the authorities in their respective countries.

Children are the future, as no politician can afford to tire of saying, and the images from just after the defeat of fascism were tinged with plenty of optimism. The standard of living was rising on both sides of the Iron Curtain. In 1950 President Truman promised parents “the most peaceful times the world has ever seen.” Around the same time, the Soviet slogan of the day was “Thank You Comrade Stalin for Our Happy Childhood!”, illustrated with a painting of exuberant kids delivering an armful of roses to the General Secretary, whose eyes fairly twinkle with hearty good nature.

But vows of peace and plenty on either side were only as good as the leaders’ ability to hold their ground in the Cold War. That, in turn, required that young citizens be imbued with the values of patriotism, hard work, and strong character. Sadly enough, children on the other side were denied the benefits of growing up in the best of societies.

The Soviet media portrayed American youth as aimless, cynical jazz enthusiasts facing Dickensian work conditions after years of a school system with courses in such topics as “home economics” and “driver’s education.” The Americans, in turn, depicted Soviet youth as brainwashed, stultified, and intimidated by the state. (And that was on a good day.)

By the late 1950s, the authorities and media on each side were looking at their own young people with a more critical eye (alarmed at “juvenile delinquency,” for example, or “hooliganism,” as the Soviets preferred to call it) -- while also grudgingly admitting that the other side was somehow bringing up a generation that possessed certain alarming virtues. Khrushchev-era educational reformers worried that their students had endured so much rote instruction that they lacked the creativity needed for scientific and technological progress, while American leaders were alarmed that so many young Soviets were successfully tackling subjects their own students could never pass -- especially in science and math. (The news that 8 million Soviet students were learning English, while just 8,000 Americans were taking Russian, was also cause for concern.)

The arc of Cold War discourse and imagery concerning childhood, as Peacock traces it, starts out with a fairly simplistic identification of youth’s well-being with the values of those in charge, then goes through a number of shifts in emphasis. By the late 1960s, the hard realities facing children on either side were increasingly understood as failures of the social system they had grown up in. In the U.S., a famous television commercial showed a little girl plucking the petals of a daisy as a nuclear missile counted down to launch; while the ad was intended to sway voters against Barry Goldwater, it drew on imagery that the Committee for a Sane Nuclear Policy (better known as SANE) and Women Strike for Peace first used to oppose nuclear testing a few years earlier. Nothing quite so emblematic emerged in the Soviet bloc, but the sarcastic use of a slogan from the Komsomol (the Young Communist League) became a sort of inside joke about the government’s self-delusion.

“To varying degrees,” writes Peacock, “both countries found themselves over the course of these years betraying their ideals to win the [Cold] war, maintain power, and defend the status quo…. Even images like that of the innocent child can become volatile when the people who profess to defend the young become the ones who imperil them.”



Essay on doing well in academic job interviews

Melissa Dennihy wants you to be prepared.


Essay on character sketches and a 'typology of scholars'

The Greek philosopher and scientist Theophrastus would probably have remained forever in the shadow of Aristotle, his teacher and benefactor (a very big shadow, admittedly), if not for a little volume of personality sketches called Characters, which he wrote at the age of 99. At least that’s what he claims in the preface. The first character type he portrays is called “The Ironical Man,” so it’s possible he was just putting everyone on.

After long years of people-watching, Theophrastus says, he resolved to depict “the good and the worthless among men,” although what we actually have from his pen is a rogues’ gallery of shady, annoying, or ridiculous characters – 30 in all – including the Boor, the Garrulous Man, the Superstitious Man, and the Man of Petty Ambition. It could be there was a second volume, depicting virtuous and noble personality types, which has been lost. Or maybe he intended to write one but never got started, or did but quit from boredom. When he focuses on weaknesses and foibles it is with relish. The Offensive Man “will use rancid oil to anoint himself at the bath; and will go forth into the market-place wearing a thick tunic, and a very light cloak, covered with stains.” The Patron of Rascals “will throw himself into the company of those who have lost lawsuits and have been found guilty in criminal causes; conceiving that, if he associates with such persons, he will become more a man of the world, and will inspire the greater awe.” And so forth.

It’s not hilarious, but the humor works, and the types all remain familiar. The adjustments a reader has to make between Theophrastus’s references to clothing and institutions in ancient Greece and everyday life today are pretty slight. It’s not hard to understand why Characters became a fairly popular work in antiquity and then again in the 17th century, when imitations of it became a literary fashion and influenced early novelists.

The character sketch seems to have died off as a genre some while back, apart from the occasional homage such as George Eliot’s The Impressions of Theophrastus Such, her last work of fiction. But I recently came across a sort of revival in the form of a series of sketches called “Typology of scholars” by Roland Boer, an associate professor of philosophy and theology at the University of Newcastle, in Australia.

A couple of weeks ago Boer won the Isaac and Tamara Deutscher Memorial Prize (sort of an equivalent of the Pulitzer for Marxist scholarship) for Criticism of Heaven and Earth, a study of the ongoing interaction of Marxism with theology and the Bible – the fifth volume of which just appeared from the European scholarly publisher Brill, with previous installments issued in paperback from Haymarket Press. I would be glad to write about it except for being stuck in volume two. The news that volume five brings the series to a close is somewhat encouraging, but in the shorter term it only inspired me to look around at his blog, Stalin’s Moustache. (Anyone attempting to extract ideological significance from that title does so at his or her own peril. Boer himself indicates that it was inspired by General Tito’s remark “Stalin is known the world over for his moustache, but not for his wisdom.”)

In the inaugural post Boer explains that “Typology of scholars” was inspired by a single question: “ ‘What is a university like?’ someone asked me who fortunately had no experience whatsoever with these weird places.” Originally announced as a series that would run for a few days in December 2011, it actually continued for six months, though the last few character sketches are lacking in the piss and vinegar of the first several. 

Like Theophrastus, Boer, too, has assembled a rogues’ gallery. Whatever the particularities of Australian academic culture, “Typology” depicts varieties of Homo academicus probably found everywhere.

So here’s a sampler. (I have imposed American spelling and punctuation norms.)

The Lord: “[T]he high-handed professor distributes funds, hands out favors, relies on a servile court of aspiring scholars to reinforce his own sense of superiority. And the Lord deems that the only people really worth talking to are other lords, visiting them in their domains, perhaps lecturing the local serfs.…

"[N]otice how the lord refers to ‘my’ doctoral students, ‘my’ postdocs, ‘my’ center, or even ‘my’ university. Our worthy lord may treat her or his serfs in many different ways, with benevolence, with disdain, as a source of new ideas that can then be ‘recycled’ as the lord’s own. So the serfs respond accordingly, although usually it is a mix of resentment and slavish subservience. On the one hand, the lord is a tired old hack who is really not so interesting, who may be derided in the earshot of the other serfs, and whose demise cannot come quickly enough. On the other hand, the serfs will come to the defense of their lord should an enemy appear, for they owe their livelihood and future prospects to the Lord.

“Of course, as soon as a serf manages to crawl into a coveted lordship, she acts in exactly the same manner.”

The Overlooked Genius: “The way the star system has developed in academia means that few of us are happy to remain incognito, quietly walking in the mountains and jotting in a small notebook, sending books off to a press and selling maybe four or five – like Nietzsche (although he was also pondering the advanced effects of syphilis).… In order to make it through the long apprenticeship, at the end of which someone who is forty is still regarded as ‘youthful,’ an intellectual needs to develop some survival skills, especially a belief that what he or she is doing is important, so crucial that the future of the human race depends upon it.… Our unnoticed genius spends his or her whole time asserting that everyone around him or her, in whatever context if not the discipline as a whole, is as dumb as an inbred village.”

The Borrower: “The Borrower may seize upon the papers of a colleague who has resigned in disgust and use them, unrevised, in a scintillating paper. Or the Borrower may ask a newly made ‘friend’ for a copy of his latest research paper, only to pump out something on the same topic and publishing it quickly – using established networks. Possibly the best example is a very creative head honcho at an unnamed university. Dreading solitary space, a blank piece of paper, and an empty mind, he would gather a group at his place, ply them with food and grog, and ask, ‘Now what about this question?’ After a couple of hours talk, he would say to Bill: ‘Why don’t you write a draft paper on this and then pass it around?’ Bill would do so, the others would add their revisions and he would ensure his name was on the piece.”

The Politician: “It can truly be said that the politician is one who has never had a thought without reading or hearing it somewhere else. What really sets the juices going, however, is the thought of wielding ‘power.’

“So she or he salivates at the thought of a committee, leaps at the chance for a heavy administrative role in which power can be lorded over others, sleeps with this one or that higher up the rung in order to gain crucial insights that may come in handy, who spends long hours pondering his next move to gain access to the powers that be.

"With their feudal-like structures, universities lend themselves to labyrinthine intrigue, favors done, gaining the ear of a heavyweight, eliminations carried out through humiliation and whispers, the bending of rules in order to edge ever upward.… The Politician is probably one of the saddest of all types, since the power you can accrue in a university is bugger-all.”

The Big Fish in a Slimy Pond: “A great temptation for some of us in that attractive life of academia: this is, obviously, the situation in which one may be a big shot in one’s own little circle. … [able to] hold forth on any topic with absolute abandon….”

The moment of truth for the Big Fish comes when faced with “a big conference, or perhaps a new and larger circle of scholars who actually know something, or a situation slightly more than a group of fresh-faced, worshipful students. … [O]n the one visit to the big arena, our knowledgeable scholar opines that no-one knows what they are talking about, since all those hundreds of papers from around the world are worthless, so it’s not worth going again. Or they are too traditional and I’m just too much of a radical for them all, so I’ll give that a miss….

“Instead, the apparently big fish can return to the small, stagnant pond, getting fat on pond slime and the perceived authority that comes from being the person with one eye among the blind.”

(It's worth mentioning that the example of the Big Fish that Boer gives is "the theologian who becomes an expert in, say, feminism, or cultural criticism, or Marxism, but stays purely within theology where she or he is a real 'authority.' God forbid that you should actually spend some time with real gender critics, or Marxists, or psychoanalysts." His character sketch is a self-portrait, or at least a piece of self-satire.)

As noted earlier, imitations of the character sketches of Theophrastus became popular in the 17th century, when (maybe not so coincidentally) the novel was starting to take shape as a distinct literary form. Plot was, in effect, the boiling water into which authors dissolved the little packets of crystallized personal psychology found in Characters and its knock-offs.

It occurs to me that Boer’s “Typology of scholars” reverses the process: It’s an academic novel, except in freeze-dried form. That’s not a problem, since plot is rarely an academic novel’s strong point. What the reader enjoys, typically, is character as caricature -- and Boer’s approach is arguably a lot more efficient.


Introduction of new career advice column for minority academics


Kerry Ann Rockquemore answers the questions she most often receives from minority scholars launching their careers.


Illinois trustees appear open to continued work for an adjunct with criminal past


U. of Illinois board issues statement creating a path for James Kilgore -- who served prison time for his activities with the Symbionese Liberation Army -- to resume teaching.

Essay critiques the role of theory in the humanities

Over the years, as literary studies veered into a dozen political and identitarian versions of theory, traditionalists complained accordingly, but nothing they said altered the trend. Conservatives, libertarians, and, in some cases, liberals produced government reports (William Bennett’s National Endowment for the Humanities study "To Reclaim a Legacy"), wrote best-selling books (Allan Bloom’s The Closing of the American Mind), and spoke at legislative hearings (David Horowitz and the Academic Bill of Rights campaign), but the momentum toward political and identity themes proceeded without pause. Sexuality studies are stronger today than they were 20 years ago.

One reason, I think, is that defenders of the new managed to characterize objectors in just the right way to discredit them. Voices opposing deconstruction, postcolonialism, and the rest were cast as ignorant, retrograde, threatened, resentful, out of touch, and hidebound, traits nicely keyed to decertify them for academic recognition.

Paul Jay’s essay here is a fair example. It chides the speakers at a St. John’s College gathering for “recycl[ing] an old and faulty argument that should have been set aside years ago.” Indeed, Jay says, the whole spectacle was unworthy of academic discussion: “it’s depressing to see such a thoroughly discredited argument being made in late 2014.”

The argument he deplores is that the rise of theory has brought about the downfall of English and the humanities. Race-class-gender studies, political criticism, feminism, deconstruction, and other schools of theory have turned students away, it claims, as professors abandoned the experience of beauty and greatness, thereby killing their own field.

Jay counters with statistics showing that English enrollments have held steady for decades after a precipitous fall in the 70s.  The “plight” of the humanities is real, he acknowledges, but it stems from broader shifts on campus, particularly the adoption of corporate and vocational values.  Traditionalists misconstrue the evidence because they want to “eschew critique” and “return to ‘tradition’” (note the sneerquotes).

Once again, traditionalists are backward and uninformed. We have the same set-up, one that denies them any affirming values and frames the position in terms of intellectual deficiency. It’s unfair, but it has worked.

Rather than protest this bilious characterization, then, let’s go with it and flesh it out, and emphasize a different attribute in the profile. It isn’t wrong to highlight personal factors in the traditionalist response, and in this case they certainly fueled the outcry and enmity against theory and politicization. But if we’re going to do so, let’s include a fuller range of them, not just insularity and defensiveness. 

I have in mind another condition. It applies to critics of the theory/politics/identity turn who were, in fact, quite knowledgeable about the intricacies of theory and its philosophical and historical backgrounds. Their response even derived, at times, from admiration of Discipline and Punish, A Map of Misreading, “Turning the Screw of Interpretation,” and other canonical 70s and 80s texts.

I mean the feeling of embarrassment. Not embarrassment for themselves, but for their discipline. It sounds ego-based and irrelevant, but it derived from a scholarly posture, not a personal state, and it happened again and again.  As they went about their professional work, teaching and speaking, reviewing manuscripts and candidates, reading new books and essays, they witnessed persistent lapses in learning, research, and evaluation, a series of poor performances that nonetheless passed muster. Enough of them piled up for traditionalists to count it a generalized condition — and they mourned. Decades of immersion in the field presented one breakdown after another, and they cared so much for the integrity of the discipline that it affected them as a humiliation.

We were embarrassed ...

  • When we attended lectures by professors who cited Jacques Derrida but in the follow-up Q&A couldn’t handle basic questions about Derrida’s sources.
  • By the cliques that formed around Derrida, Paul de Man, Foucault, and other masters, complete with sibling rivalries, fawning acknowledgements, and sectarian hostilities.
  • By graduate students skipping seminars in order to deliver theory-saturated conference papers, even though they needed three years of silent reading in a library carrel before stepping forward.
  • When departments dropped bibliography, foreign language, and philology requirements, but added a theory survey.
  • When Jesse Jackson & Co. pulled the “Western civ has got to go!” stunt at Stanford and English colleagues reacted with a pathetic “O.K.”
  • When Hilton Kramer and Roger Kimball penned their annual report on the Modern Language Association in The New Criterion, and the world guffawed.
  • By the Sokal Hoax, which made us a laughingstock among our science colleagues.
  • By the Bad Writing Award and cutesy titles stocked with parentheses, scare quotes, and diacritical marks.
  • When we came across reader’s reports and found them nothing more than puff pieces by cronies.
  • By Academically Adrift, which demonstrated how little reading and writing undergraduates do.

Yes, we stumbled from one chagrin to another. When Jay effuses about “the innovative role that theory has had in deepening, enriching, and challenging our understanding of the human,” we can only reply, “That’s not what we saw and heard with our own eyes and ears.” Jay treats it as transformative progress, but it impressed us as hack philosophizing, amateur social science, superficial learning, or just plain gamesmanship. Our first response wasn’t hostility or insecurity. It was dismay. 

This is why we blamed theory, and still do. We didn’t deny the genius of eminent theorists, but we found the practices they inspired dispiriting. Not Derrida’s “Différance,” a serious ontological statement, Elaine Scarry’s The Body in Pain, an eccentric but hefty study, or other such achievements, but their thousands of phony imitations and platitudinous implementations, and theory had to accept responsibility for those results.

First of all, theory called into question epistemological standards. “Objectivity,” “method,” the distinction of “primary” and “secondary” texts, and other disciplinary concepts fell prey to its critique.

Second, theory was unfamiliar, and so you could get by with half-baked expressions of it. If you referred in a gathering to a passage in Jacques Lacan’s “Rome Discourse,” chances are that few others in the room had the knowledge to assess your usage.

Third, theory (starting in the '80s) was aligned with political trends bearing a moral authority, encouraging people to think more about “doing good” than “doing well.” We didn’t criticize that young professor for his disorganized teaching, because he enacted a social good: introducing undergraduates to marginalized authors of color and outlining theories of their marginalization.

Finally, theory had a smaller corpus and broader application than existing historical fields. It saved younger people months and years of reading time.

It didn’t have to happen that way (who loved the archive more than Foucault?), but it did. Every profession has greater and lesser talents, of course, but it seemed to us that inferior knowledge, skills, and standards had become routine practice, and theory stood as an alibi for them.

So, when traditionalists speak up and the Establishment knocks them down, keep in mind the other attribute, not the stupidity that marks their failure to meet scholarly ideals. Consider, instead, their embarrassment over the decades, which originates precisely in their enduring devotion to those ideals.


Mark Bauerlein is professor of English at Emory University.


