The most famous of us all are not real. True, scholars such as Albert Einstein and J. Robert Oppenheimer were once recognized by almost any sector of the American public. In fact, they were so well-recognized that Einstein’s hair and Oppenheimer’s pork pie hat were alone representative of their celebrity.
A theoretical physicist, an astrophysicist, an applied physicist, and an engineer are now arguably as well recognized as the Einsteins and Oppenheimers of days past. The problem is that these men, Sheldon Cooper, Rajesh Koothrappali, Leonard Hofstadter, and Howard Wolowitz, are not real. They are, in fact, the stars of CBS’s "The Big Bang Theory."
Just how popular are the show and its stars? "The Big Bang Theory" begins its seventh season tonight, and in recent years it has frequently ridden atop Nielsen’s weekly sitcom ratings. Beyond sheer volume of viewers, "The Big Bang Theory" has also garnered a wide variety of awards. This year alone, for example, the show was nominated for eight Emmys and took home top honors for Outstanding Lead Actor in a Comedy Series (Jim Parsons, a.k.a. Sheldon Cooper) and Outstanding Guest Actor in a Comedy Series (Bob Newhart, a.k.a. Arthur Jeffries/Professor Proton).
As in a number of current sitcoms, the male protagonists are portrayed as afflicted by variant strands of perpetual adolescence. If they are not working, they are playing online role-playing games, hanging out at a comic book store, or ingesting successive waves of takeout. Of course, a sitcom must include a subplot of ongoing sexual frustration, and "The Big Bang Theory" does not disappoint. The lone exception is the theoretical physicist who views "coitus" – as he calls it – as a mere distraction from his work.
Given the show’s appeal, what, if anything, does it tell us about the American public’s views of the academic vocation? Speaking on behalf of what the American public thinks is risky, but I fear we all may already know the answer — they find the show humorous because, in part, it corresponds to opinions they already hold.
For example, in one of the final episodes from last season, entitled "Tenure Turbulence," a tenured slot comes open in the physics department when a colleague dies. When discussing the possibility, the theoretical physicist with coitus avoidus, Sheldon, claims "a guaranteed job for life only encourages the faculty to become complacent." The astrophysicist, Rajesh, argues "people do their best work when they feel safe and secure." Regardless, they all initially agree whoever among them receives tenure should do so because of his ability to do the work, not because of faculty politics.
Events then spin out of control as each one of them seeks to one-up the other in an arms race of university politics. The target for their outlandish behavior is Mrs. Davies, a member of the human resources office serving on the tenure committee. Leonard risks being placed on a stalker watchlist by making his way into the previously unexplored territory of the wellness center simply to “schmooze” Mrs. Davies while she exercises. Raj sends her a self-made video touting his academic abilities dating back to his early childhood. Not to be outdone, Sheldon provides Mrs. Davies, an African-American, with the DVD box set of "Roots."
Just when you think you have seen it all, the most outlandish behavior comes just prior to the deceased colleague’s funeral. Standing in the hallway, each one of them becomes aware of the depraved lengths to which the others will go in this political game. Sheldon asks his girlfriend, Amy, to remind him that an appropriate emotional response to a funeral is sadness. Perennially incapable of speaking to women, Raj is left to rely on alcohol to help him be more assertive.
Despite their antics, Mrs. Davies recommends all three candidates for further review as a result of their considerable credentials. In a mere half-hour, however, a number of possible cultural stereotypes of the life of university faculty members are brought to light. One possible stereotype held by the larger public has to do with skepticism over the possibility of someone having access to a job for life. The second has to do with how such a job is earned.
Unfortunately, the best available data confirms the existence of both forms of skepticism. Although somewhat dated, arguably the most authoritative study of its kind remains the survey of "Americans’ Views of Political Bias in the Academy and Academic Freedom" that Neil Gross and Solon Simmons conducted for the AAUP back in 2006. A more recent iteration of this line of work is now found in Neil Gross’s Why Are Professors Liberal and Why Do Conservatives Care? (Harvard University Press, 2013).
In general, Gross and Simmons found "Americans are generally supportive of the tenure system.... At the same time, about 80.7 percent think that tenure sometimes protects incompetent faculty, while 57.9 percent believe that giving professors tenure takes away their incentive to work hard." As a result, "only about 17.9 percent of respondents say the tenure system should remain as it is.”
Over six-going-on-seven seasons, tenure is but one of the important issues "The Big Bang Theory" has portrayed. Part of the reason we laugh, though, is the way the show mirrors views held by the American public and possibly even by some of us. Tenure and other practices like it are too critical to the work we do to be portrayed so unquestioningly.
The challenge facing us, those of us who are real, is to ensure that our efforts persistently challenge such perceptions. Perhaps one day a sitcom will climb the Nielsen ratings portraying tenure as a practice so revered that it inspires nothing but the highest devotion to teaching, research, and service.
Todd C. Ream is professor of higher education at Taylor University and a research fellow with the Institute for Studies of Religion at Baylor University. He (along with Drew Moser) is currently working on a cultural biography of Ernest L. Boyer.
About this time of year one invariably reads fulsome, even orgiastic essays by academics professing the exhilaration and sense of joy they feel on the first day of class each August or September. In so doing, they often blather on about limitless possibilities and rituals of renewal, etc., and wax on about frissons and epiphanies on the quad.
I must admit that my experience is quite different. Whereas for many professors the beginning of the academic year is a time of excitement and anticipation, for me it is — indeed, has been for the 30-plus years I’ve been teaching at the university level — a time of melancholy, even gloom. Indeed, late August/early September marks the peak period of my annual bout of SAD. To most clinicians, SAD denotes "seasonal affective disorder," a condition in which normally well-adjusted people experience a range of depressive symptoms, but for me SAD means "student affective disorder." Same symptoms, different etiology.
Around the beginning of August -- even earlier now -- I begin to suffer the symptoms: heightened anxiety; enervation; difficulty concentrating; social withdrawal; increased irritability; nausea. Over time, I’ve found that the reason for the onset of such conditions is the looming return of STUDENTS into my life.
It is not August, but the end of the exam period in May that elicits in me a sense of joy and limitless possibilities. Only when my grades are turned in, the seniors graduated, and the dorms emptied out do I begin to feel a sense of excitement and anticipation and the possibility for renewal. For it is only then that I can focus on research and writing without the threat of being interrupted by tedious office hours, middle-of-the-night phone calls, and "urgent" e-mails ("Can I get an extension on my book review?"), not to mention lectures, seminars, grading, meetings, committee work, etc., etc.
Rather, with May comes "summer break" and the tantalizing possibility of finally honoring long-overdue writing commitments, of making headway on a scholarly monograph, and of thinking deeply about new projects down the line. If sufficiently lucky, it might mean a trip or two to an archive to immerse oneself in source materials one has waited months, if not years, to dive into. And it might even give one a chance to attend a conference, present a paper, and get some useful feedback from experts in one’s field. Talk about renewal!
But, alas, before one knows it, August comes around. David M. Shribman recently wrote a beautiful essay in The Wall Street Journal entitled "Whatever Happened to August?," an elegiac piece lamenting that August, once the Platonic ideal of summer, has been turned into a "month of work, school and calendars run amok." Nowhere is this more true than at colleges and universities across the land, where the ecological system in town becomes student-centric earlier and earlier each year, with suck-up seniors, jaded juniors, sophomoric sophs, and eager-beaver, fresh-faced frosh transforming the placid summer landscape on campus into a crowded, cacophonous mob scene.
Even worse, by then the seemingly “limitless possibilities” for summer, the best-laid plans, the hopes and dreams have all been dashed. Some of the overdue commitments are still outstanding. Progress has been made on the unfinished monograph, but it still sucks. The trips to the archives brought disappointing results. The professional meetings were as boring as ever. And now the students are back. Any wonder that I get depressed?
To make things even worse, it seems more and more as though “summer break” is over just after the 4th of July. That’s when the first, vague symptoms of SAD begin to appear. They accelerate through July and peak about the third week of August when classes begin, at which time I feel like I’m about to embark on the academic equivalent of a death march.
Funny, though, every year the symptoms recede. Gradually, I adapt to the new ecology and again find my niche. By mid-to-late September — usually a few weeks after Labor Day — the symptoms are gone and I begin to feel like myself. The "new" landscape has been naturalized. I again begin to appreciate students — the putative causes of my seasonal plague — suck-up seniors, jaded juniors, sophomoric sophs, fresh-faced frosh all.
Peter A. Coclanis is Albert R. Newsome Distinguished Professor of History and director of the Global Research Institute at the University of North Carolina at Chapel Hill.
As pervasive as it is perilous, the recurrent use of two words — "real world" — crystallizes many problems confronting the academy today.
The term gestures toward all spheres beyond the so-called ivory tower; an advertisement in the New York City subways lauded the "real world" experience of teaching in the New York Police Academy. But often this expression more specifically refers to the world of business. When it simply serves as shorthand to distinguish those realms from the university, the reference may be innocuous. And yes, professors and academic administrators indubitably benefit from learning from and collaborating with their counterparts outside those proverbial ivy-covered walls. As a faculty representative, I worked closely with the trustees of Carleton College on a presidential search; these interactions repeatedly demonstrated to me their shrewdness in evaluating people and the practical needs of any organization, thus dissipating lingering prejudices about the business world and reminding me that its variety complicates generalizations about it.
More often, though, contrasting the "real world" outside the academy with its putatively unreal counterpart within is pernicious for three interlocking reasons. First, the two words in question often reflect and encourage self-denigration, even abnegation. Many people outside the academy regard its denizens in the way nuns are sometimes dismissively seen -- as exemplars of a life that in theory one may respect but in practice one greets with bemused condescension. Academics themselves sometimes refer to the "real world" because they have internalized such judgments. The strategic use of those two words in influential studies of higher education can reinforce these prejudices and insecurities. Thus Louis Menand’s Marketplace of Ideas tellingly defends pre-professional and vocational courses, in contrast to the traditionally defined liberal arts curriculum, in terms of their fulfilling "real-world goals."
Second, by implying that alternative values are unrealistic — indeed, naive -- these two words are likely to justify the increasing importation of certain troubled and troubling "real world" business practices. This shift has been tellingly encapsulated as the recent corporatization of the university, notably in Frank Donoghue’s The Last Professors: The Corporate University and the Fate of the Humanities. The lamentable reliance on adjuncts is all too reminiscent of the emphasis on outsourcing in the business world. It is equally dangerous uncritically to copy hierarchies prevalent though not universal in business communities, as the trustees at the University of Virginia learned to their cost. Higher education’s star system went to school on Wall Street (and quite possibly in Hollywood as well). And "scorecards" that rate universities by how much money their graduates earn similarly impose the worst values of the corporate milieu.
Third, distinguishing the "real world" of business from the unreal world of the academy misrepresents for better and for worse the longstanding workings of our institutions of higher education themselves. The very term "real" is clearly slippery ("reality TV"? "The Real Housewives of Orange County"?); but many connotations — not all of them grounds for rejoicing — do in fact already apply to the academy. To the extent that the adjective gestures toward the competition among ambitious people, many academics and leaders of their institutions not only read books about those issues but also, so to speak, wrote the book on them. The frequent references to "branding" within the academy demonstrate that marketing executives could teach certain admissions officers and other administrators nothing they have not long known about the half-truths that practice can foster.
But in fact the university is also a world committed to, indeed exemplary of, the "real" in more positive respects. Arguably our attention to using language carefully — teaching writing is surely a significant part of the mission of institutions of higher education — in fact encourages conveying a real picture, expressing what one really intends to say. Our emphasis on critical thinking, notably the marshaling of evidence, trains students to distinguish the real from the specious and self-serving. Alternatively, even if one subscribes to the poststructuralist credo that language can never express reality, we can still encourage those students to discern and distinguish positions along a spectrum between reality and deceit. In so doing, we achieve one goal central to a liberal arts education: building the very faculty of discernment — a capacity that, besides its many other potentialities, can and should encourage a re-evaluation of the expression "real world."
Heather Dubrow is the John D. Boyd SJ Chair in the Poetic Imagination at Fordham University. Among her publications are six single-authored monographs, a co-edited collection of essays, an edition of As You Like It, and a volume of her own poetry.
Two years into my doctoral program in English at the University of California at Santa Barbara, I left to spend a year in Paris doing research. When I returned home, I was, among other things, a year behind in the doctoral colloquium, a course required by our department to prepare graduate students for the wondrous, painstaking world of dissertation prospectus writing.
I found myself in a classroom with our department chair, who ran the class, and six other students, all in the cohort one year junior to me. Because so many different professors in our department had taken turns teaching the doctoral colloquium, our chair would sometimes come to class with a large file folder, filled with papers of advice, samples, and notes from former professors. It was a neat little archive, and I imagined that it probably revealed a lot about the changing trends in academe and the shifting job market.
One day in class, our chair at the time, Professor Alan Liu, picked up a Post-It note in the file and smiled. He looked up at the class and said softly, his voice full of warmth, "It’s a note from Richard. Look, his handwriting..." and lifted the Post-It for the class to see. I felt an immediate lurch inside me, and I blinked, smiled weakly, and looked around the classroom expecting to meet a similar expression on the faces of my classmates. But it was at that moment that Alan and I realized, possibly simultaneously, that no one else in the class had had Richard as a professor. He had died my first year of graduate school, and my original cohort was the last group of new graduate students lucky enough to be in his classroom.
Upon entering graduate school at barely 22 years of age, I was still much like an undergraduate. I was often critically unaware of what I was saying, but managed to chatter on quite a lot (probably to the delight of the older graduate students). I took Richard Helgerson’s class in the winter of 2008, the last class he taught before he died. Students were required to take a Renaissance literature course, and as a 20th-century person, I was excited about the excursion into Shakespeare and Milton (not so much Spenser). Before I left for graduate school, a friend of mine, an early modern scholar at the University of Wisconsin at Madison, professed his jealousy when he heard I would be attending UCSB. "I can’t believe you’re going to get to take classes with Richard Helgerson," he gushed. This was lost on me at the time, but I soon became aware of Richard’s celebrity as a scholar and teacher. I was set to take his class and find out firsthand what made this man such a literary legend.
But the first day of class was something unexpected. Richard came to class with a bucket. He said he might need it as he was ill. He was very matter-of-fact, reasonable, even apologetic. He had pancreatic cancer, he explained. Despite his condition, Richard’s mental sharpness and brilliance in the classroom made that course one of my most memorable experiences of graduate school. He was endlessly curious about our ideas, and I could tell that he valued the discussion of the texts above all else. The class had a kind of intensity to it, the most presence I’ve felt in a graduate seminar. Everybody read the work. Everybody participated. Everybody listened.
Most surprisingly, Richard maintained such a wonderful sense of humor throughout the course, especially with the younger, more inexperienced graduate students (ahem). As a modernist, I felt such a freedom to speak in that class. Once, I even told Richard that Spenser’s calendar was sort of like the Beach Boys’ "Pet Sounds," "you don’t want to admit it, but there is a weak link... and that link is November," I said with a smug face, thinking I had said something quite associative and brilliant. He didn’t laugh, but rather looked quite amused and pleased. This seemed to be a bold thing for a young graduate student to say with so much conviction. He was even more tickled after an older graduate student explained to him more slowly what had just transpired, what exactly "Pet Sounds" was, etc.
The next week in class when he handed out discussion question prompts, he wrote "Last week, Megan said X, which has led me to think about Y and its implications for Z." I am completely unashamed to say that I was beaming, so thrilled to be included in a Helgerson prompt. In truth, I had said nothing interesting or meaningful or provocative, but I think he saw in me someone who needed a little encouragement, who had a lot of ideas and not yet the ability to articulate or connect them. I still have that slip of paper.
What was mesmerizing about Richard in those last few months before he died was his grace in the classroom, the absolute thoughtfulness with which he considered every question or comment, the elegance with which he chose to talk about his illness. He was calm. I had never in my young life met someone so calm and soft-spoken in the face of what was, to me at the time, something so monstrous and unfair. It was almost unnerving.
One day in class, a graduate student baked all sorts of treats for Richard and the rest of us. She was an excellent and avid baker, and the aroma of rich chocolate brownies and sugar cookies filled the classroom. When Richard entered, he sat down and looked at the dessert on the table. He was delighted and said, "Thank you, Annie. That looks so good; I wish I could eat it," and then proceeded to tell us that so many wonderful people in his life had been bringing food to the house that smelled so good and that ... he just couldn’t eat it because ... he just couldn’t eat much anymore.
We felt sick (poor Annie, most of all). Nobody said anything, but I will never forget that day in class. For three hours, the food sat at the center of our graduate table like a looming presence, and in an unspoken solidarity, nobody ate a crumb. There were other small moments like this during that quarter, all equally painful.
Sometimes I wanted to scream in class, Why are you here?! Why are you in a classroom teaching silly first-years about Shakespearean sonnets when you should be making the most of your time left?! Go to the beach! Go whale watching! Once, in his office, I stumbled into this thought, and he asked if I was O.K., if I felt uncomfortable in his class. The selflessness of his concern was almost unbearable. I said of course not, but the truth was that I was scared. It hurt to watch him deteriorate week to week.
But Richard was exactly where he should have been. He was with some of the great loves of his life: Milton, Shakespeare, Spenser. And I suppose one does not tire of one’s great loves but rather, in the face of death, devotes oneself to them mercilessly, reads the lines that have kept one spellbound in wonder since a young age.
Richard died a few weeks after our last class. When I went to Richard’s memorial at UCSB, many peers and former students spoke of his generous nature, his kind eyes, his trips to Italy, his devotion as a father, husband, teacher, and scholar. I sat there knowing that if I felt so terrible after knowing Richard for only a few months, how devastated his own graduate students and colleagues, who had worked with him for so much longer, must have been. I still think about Richard very often, perhaps too often for how long I knew him. But in my sixth and last year of graduate school, I reflect with affection on that difficult, transformative first year. I think Richard had the gift of being the mentor that each student needed at whatever stage of their academic development they had reached, and he met his students there.
For me, at the risk of romanticizing (but what do I care, really), he taught me not only about the power of mentorship, but also adulthood, how to be a real person in a classroom. And whenever I become cynical about academe, the early professionalization, the politicization of the humanities, the defunding of foundational departments characterized as irrelevant, I think of Richard. I think of sitting on the beach in Santa Barbara, reading Shakespeare’s “young man” or “fair youth” sonnets for the first time, marveling over them. I think of Richard as someone who studied literature, first and foremost, because he loved language, and who, I hope, went gently into that good night. I remember thinking something odd in Richard’s class. I thought, I want to die like Richard. This is how a good person learns to die: brave, thoughtful, with gratitude.
Megan Fernandes is adjunct assistant professor of modern culture and media at Brown University.