Humanities

Essay questions the push to put presidential libraries on campuses

At the recent dedication of the $500 million George W. Bush Presidential Center at Southern Methodist University, President Clinton called it "the latest, grandest example of the eternal struggle of former presidents to rewrite history." In 2004, the Clinton Center and Foundation stunned observers with its price tag of more than $200 million, and less than a decade later Bush has doubled that figure, once the endowment for the Bush Institute is counted. When the Barack Obama center opens around 2020, perhaps on the campus of the University of Chicago, could it be the first billion-dollar presidential center? Possibly. A total of $1.4 billion was raised for Obama’s two successful presidential campaigns, and for a center dedicated to his final campaign, the one for a better place in history, it seems likely that he’ll surpass previous records.

Although the final decision on the location of the Obama center is probably a couple of years away, professors and administrators at the University of Chicago (where he once taught) and the University of Hawaii (where his mother studied and his sister taught) are thinking about what it might mean if it lands on their campus. Chicago State University also wants to be considered. For universities, presidential centers present both opportunities and significant costs and challenges. Academics should consider carefully before getting into a bidding war over a presidential library, and weigh how much these centers promote spin in addition to scholarship.

Prime campus real estate is sometimes sacrificed for these presidential temples. Although they house valuable historical records impartially managed by the National Archives, they also contain museums that even high school students who have passed the Advanced Placement U.S. History exam would likely find biased, as well as foundations or institutes with agendas that the host university does not control.

Clinton was right in saying that these centers are attempts by former presidents to write their own history and polish their reputations. And to a significant degree they work. President Carter’s reputation was tarnished when he left office in 1981, but as The New York Times put it in a nearly prescient headline in 1986: "Reshaped Carter Image Tied to Library Opening" — and today, Carter is one of the more respected former presidents.

But Clinton exaggerated when he said that the struggle by former presidents to remake their images stretches back to the beginning of American history. Until the 20th century, former presidents rarely even wrote memoirs, and the first president to have a presidential library run by the federal government was Franklin D. Roosevelt. The Roosevelt Library, which opened on his estate at Hyde Park, New York, in 1941, was modest compared with succeeding presidential libraries. Its initial cost was about $7 million in today’s dollars, but critics still accused FDR of building a "Yankee pyramid." There was more than a grain of truth in the charge. When FDR first saw Egypt’s pyramids, he said, "man’s desire to be remembered is colossal." Although what Roosevelt said may not be true for everyone, it certainly was true for FDR and his successors.

Most succeeding presidential libraries dwarf FDR’s: The Harry S. Truman Library in Independence, Missouri, evokes Queen Hatshepsut’s Temple in Egypt and was the first to feature a full-scale Oval Office replica (something copied by most of the others), while the Dwight D. Eisenhower Library in Abilene, Kansas, is a complex of buildings whose park takes up an entire city block.

The first president to affiliate his library with a university was President Kennedy. JFK envisioned his library on the grounds of his alma mater, Harvard University. After Kennedy’s death, some at Harvard decided they didn’t like the idea of common tourists on their campus (99 percent of the visitors to presidential libraries are tourists; only 1 percent are researchers), and architecture critic Ada Louise Huxtable lampooned their fear of "Goths overwhelming the intelligentsia." Harvard did establish the Kennedy School of Government, but the Kennedy Library itself was located on a campus of the University of Massachusetts, on a spectacular site overlooking Boston Harbor.

The Kennedy Library was also the first to have a "starchitect," when Jackie Kennedy chose I.M. Pei — who later designed the East Building of the National Gallery of Art, as well as the expansion of the Louvre — to design her husband’s memorial. Originally, the Kennedy Library was going to be a large pyramid with the top cut off — representing JFK’s tragically truncated achievement — but eventually that plan was scrapped, and Pei reimagined that design as the glass pyramid at the Louvre. Pei’s final design for the Kennedy Library and Museum was a futuristic glass, steel, and concrete edifice that still looks like it could be used in a Star Trek movie.

President Lyndon Johnson, with Lady Bird Johnson’s help, also hired a star architect for his monument to himself. Gordon Bunshaft of the famous Skidmore, Owings, and Merrill firm had designed such modernist icons as Yale University’s beautiful Beinecke Library with its translucent marble walls. Bunshaft’s design for the Johnson Library on the campus of the University of Texas at Austin has, as Ada Louise Huxtable wrote, "a Pharaonic air of permanence" that "puts Mr. Johnson in the same class as some Popes and Kings who were equally receptive clients for architects with equally large ideas." The Johnson Library looks like a cross between an Egyptian pylon temple and a space-age bureaucracy.

We could talk about award-winning architect James Polshek’s design for the Clinton Center, or the renowned Robert A. M. Stern’s imposing design for the Bush Center at SMU, but you get the idea. All presidents since FDR have an edifice complex. Becoming a patron of a huge architectural project dedicated to yourself is one of the perks of being an Imperial Ex-President. Another perk is becoming a museum curator. Initially, the exhibits in presidential libraries are campaign commercials in museum form, designed with a lot of help from the former president. Eventually these exhibits become more balanced and complete, but it’s usually 30-50 years after a president leaves office before the National Archives installs decent exhibits. The former president and many of his supporters need to die before their power to spin subsides.

As Wayne Slater of The Dallas Morning News writes, the new Bush museum is "a vivid justification for why certain decisions were made," rather than a balanced examination of the real options involved and the costs of presidential choices — such as the decision to invade Iraq. Bush avoids presidential mistakes in his museum, which means, as columnist Maureen Dowd of The New York Times writes, "You could fill an entire other library with what’s not in W’s." Bush is just the latest in a long line of presidents to create self-serving exhibits seen by millions. President Obama will likely follow this tradition.

Supporters of presidential libraries hail their archives, with their raw materials of history open to scholars, journalists, and even school kids. But these records would be available anyway, because by law they are owned by the American people and must be impartially administered and released by the National Archives. If a president didn’t have a presidential library, the records would be housed in an equally accessible facility (probably in Washington); it just wouldn’t be so architecturally grandiose.

It was Jimmy Carter who first morphed the presidential library into a presidential center. The Carter Center, which is next to but administratively separate from the Carter Library and Museum in Atlanta, has been so effective at living up to its mantra of "Waging Peace. Fighting Disease. Building Hope" that President Carter won the Nobel Peace Prize in 2002. But Carter has also generated considerable controversy over the years because of his views on Israel. If the Carter Center had been located on the campus of nearby Emory University (with which it is loosely affiliated), that institution’s reputation might have been affected, but since the Carter Center is geographically separate from Emory, the university was largely shielded.

There is not as much shielding for SMU from former President Bush and his views on such issues as enhanced interrogation techniques. The Bush Institute was inspired in part by the Hoover Institution on the campus of Stanford University, which is considered one of the nation’s leading conservative think tanks. The Hoover Institution has long offered a platform for high-profile Republicans such as George Shultz, Condoleezza Rice, and Donald Rumsfeld.

The Hoover Institution is to a large degree administratively separate from Stanford, and so although it effectively leverages the prestige of its host university to expand its influence, Stanford does not have a corresponding control over it. It’s possible that President Obama will seek a similar arrangement with a host university for a future Obama Center, or whatever he might choose to call it.

And the bottom line here is the bottom line: Although the price tag for the actual building of the Bush Library, Museum, and Institute was a cool quarter of a billion dollars, an equal amount was raised to endow the Bush Institute. And Bush and his supporters will continue their aggressive fund-raising for the foreseeable future, which could push the ultimate price tag and influence of the Bush Center into the billion-dollar range sometime in the next decade or two.

When President Johnson helped found the LBJ School of Public Affairs at the University of Texas at Austin, he gleefully anticipated breaking what he called "this goddamned Harvard" hold on top government positions. But like the Kennedy School of Government at Harvard, the Johnson School is run by its university, not by a self-perpetuating board largely independent of the university that seeks, in part, to enhance the reputation of the president whose name is on the building. In other words, as presidential centers have evolved and grown they have become a better and better deal for former presidents, but it’s less certain that they are a good deal for the universities that might host them.

What would make a presidential center a better deal for a university and the public? For the 99 percent who will visit the future Obama museum as tourists, it would be useful to involve history professors at the host university in creating exhibits with rigorous content. This content should be of a quality that would actually help future high school students pass the relevant portion of a future AP U.S. History exam, rather than merely stocking a museum of spin.

For a future Obama foundation or institute, it would be worthwhile for the university to have a significant number of faculty members from a variety of departments on the governing board. The university should have more than token input into a foundation that will be a big player on campus for many decades, perhaps even centuries. For, as some have noted, these presidential centers have become the American equivalent of the temples and tombs of the pharaohs. If professors, students, and the general public are to be more than bystanders or even would-be political worshippers, the host university needs to negotiate for the best interests of not just the university but the American public. Universities should not simply acquiesce to the desire that Clinton spoke of (only half-jokingly) that presidents have to rewrite their own history in self-glorifying memorials.

And President Obama himself would need to be involved in the process of reforming the presidential center. He has already taken on this role to a degree: on his first full day in office in 2009, he revoked President Bush’s infamous Executive Order 13233, which restricted access to presidential records for political reasons. Obama and the university he partners with should continue this work so that presidential centers cease to remind us of the lines of Percy Shelley’s poem: "My name is Ozymandias, King of Kings, / Look on my works, ye Mighty, and despair!"

Ben Hufbauer is an associate professor of fine arts at the University of Louisville. He is the author of the book Presidential Temples: How Memorials and Libraries Shape Public Memory (University Press of Kansas).

Essay challenges advice given to would-be graduate students

Category: 
Tyro Tracts

Nate Kreuter questions the conventional wisdom of telling would-be graduate students who want to be professors to enroll only "if you can’t imagine yourself doing anything else."

Parody of a job ad for a faculty position

ME Studies

The department of English invites applications for a tenure-track assistant professor in ME Studies, starting Fall 2014. Applicants should demonstrate a sustained scholarly engagement with ME. Demonstrated expertise in one or more of the following areas is preferred: research I care about, topics I've been focusing on for years, theories I am familiar with, practices I approve of, and debates already settled by ME.

Successful applicants will be less successful than I am but not so unsuccessful that it reflects poorly on ME. The lucky chosen one will have the opportunity to work with ME. Candidates must have a Ph.D. from an institution I approve of and have recommendation letters from people I know and respect but am not threatened by. Please send just the names of people you know I know by October 15th.

My university is an Affirmative Action/Equal Employment Opportunity Employer and does not discriminate against any individual on the basis of age, color, disability, gender, national origin, religion, sexual orientation, veteran status or genetic information. However, applicants who cite ME are particularly encouraged to apply.

Mead Embry is a pseudonym for an English professor.

review of Ted Anton, 'The Longevity Seekers: Science, Business, and the Fountain of Youth'

Standing in line at the drugstore a couple of weeks ago, I spied this month’s issue of National Geographic on the magazine rack nearby -- conspicuous as one of the few titles without a celebrity on the cover. Instead it showed a photograph of an infant beneath a headline saying "This Baby Will Live to Be 120."

The editors must have expected disbelief, because there was a footnote to the headline insisting that the claim was not hype: "New science could lead to very long lives." When was the last time you saw a footnote in a popular periodical, on the cover, no less? It seemed worth a look, particularly after the septuagenarian in front of me had opened complex, in-depth negotiations with the pharmacist.

The headline, one learns from a comment on the table of contents, alludes to a traditional Jewish birthday wish or blessing: "May you live to be 120." This was the age that Moses was said to have reached when he died. The same figure appears -- not so coincidentally perhaps -- at an important moment in the book of Genesis. Before sending the Flood, Jehovah announces that man’s lifespan will henceforth peak at 120 years. (I take it there was a grandfather clause for Noah. When the waters recede, he lives another 350 years.)

The cap on longevity, like the deluge itself, is ultimately mankind’s own fault, given our tendency to impose too much on the Almighty’s patience and good humor. He declares, in so many words, that there is a limit to how much He must endure from any single one of us. Various translations make the point more or less forcefully, but that’s the gist of it. Even 120 years proved too generous an offer -- one quietly retracted later, it seems. Hence the Psalmist’s lament:

“The days of our years are threescore years and ten; and if by reason of strength they be fourscore years, yet is their strength labor and sorrow; for it is soon cut off, and we fly away.”

Nursing homes are full of people who passed the fourscore marker a while ago. If you visit such places very often, as I have lately, “May you live to be 120” probably sounds more like a curse than a blessing. Not even a funeral obliges more awareness of mortal frailty. There is more to life than staving off death. The prospect of being stranded somewhere in between for 30 or 40 years is enough to make an atheist believe in hell.

Meanwhile, in science…. The medical and biological research surveyed in that NatGeo article promises to do more than drag out the flesh’s “labor and sorrow” a lot longer. The baby on the magazine cover will live his or her allotted span of six score years with an alert mind, in a reasonably healthy body. Our genetic inheritance plays a huge but not absolutely determinative role in how long we live. In the wake of the mapping of the genome, it could be possible to tinker with the mechanisms that accelerate or delay the aging process. It may not be the elixir of youth, but close enough.

Besides treating the same research in greater depth, Ted Anton’s The Longevity Seekers: Science, Business, and the Fountain of Youth (University of Chicago Press) emphasizes how profound a change longevity research has already wrought. It means no longer taking for granted the status of aging as an inescapable, biologically hardwired, and fundamentally irreversible process of general decline. Challenging the stereotypes and prejudices about the elderly has been a difficult process, but longevity engineering would transform the whole terrain of what aging itself entails.

Anton, a professor of English at DePaul University, tells the story in two grand phases. The first bears some resemblance to James Watson’s memoir The Double Helix, which recounts the twists and turns of laboratory research in the struggle to determine the structure of DNA -- work for which he and Francis Crick received a Nobel Prize in medicine in 1962. Watson’s book is particularly memorable for revealing science as an enterprise in which personalities and ambitions clash as much as theories ever do. (And with far more rancor, as Watson himself demonstrated in the book’s vicious and petty treatment of Rosalind Franklin, a crystallographer whose contribution he downplayed as much as possible.)

A practitioner of long-form journalism rather than a longevity researcher, Anton writes about conflicts in the field with some detachment, even while remaining aware that the discoveries may change life in ways we can’t yet picture. The initial phase of the research he describes consisted largely of experiments with yeast cells and microscopic worms conducted in the 1990s. Both are short-lived, meaning that the impact of biochemical adjustments to their genetic “thermostats” for longevity would register quickly.

During the second phase of Anton’s narrative, lab research involved more complex organisms. But that was not the most important development. The public began hearing news flashes that scientists had discovered that the key to a longer life was, say, restricted caloric intake, or a chemical called resveratrol found in red wine. Findings presented in scientific journals were reported on morning news programs, or endorsed on Oprah, within days or even hours of publication. Hypotheses became hype overnight.

This generated enthusiasm (more for drinking red wine than restricting calories, if memory serves) as well as additional confidence that biotechnological breakthroughs were on the way. Everybody in longevity research, or almost everybody, started a company and ran around looking for venture capital. Models, evidence, and ideas turned into proprietary information -- with the hurry to get one’s findings into professional journals looking more and more like the rush to issue a press release.

So far, no pharmaceutical has arrived on the market to boost our lifespans as dramatically as those of the worms and yeast cells in the laboratory. “The dustbin of medical breakthroughs,” Anton reminds us, “bears the label ‘It Worked in Mice.’ ” On the other hand, the research has been a boon to the cosmetics industry.

As it is, we’re nowhere near ready to deal with the cumulative effect of all the life-extending medical developments from the past few decades. The number of centenarians in the world “is expected to increase tenfold between 2010 and 2050,” the author notes, “and the number of older poor, the majority of them women,” is predicted “to go from 342 million today to 1.2 billion by that same year.”

But progress is ruthless about doing things on its own terms. Biotech is still in its infancy, and its future course -- much less its side effects -- is beyond imagining. The baby on the magazine cover might well live to see the first centenarian win an Olympic medal. I wish that prospect were more cheering than it is.

Essay on crowdsourcing the humanities curriculum

Undergraduate students should join professors in selecting the content of courses taught in the humanities.

This is the conclusion I came to after teaching Humanities on Demand: Narratives Gone Viral, a pilot course at Duke University that not only introduced students to some of the critical modes humanists employ to analyze new media artifacts, but also tested the viability of a new, interactive course design. One semester prior to the beginning of class, we asked 6,500 undergraduates -- in other words, Duke’s entire undergraduate student body -- to go online and submit materials they believed warranted examination in the course.

Submissions could be made regardless of whether a student planned on enrolling in the course. In response, hundreds of students from a variety of academic disciplines, including engineering, political science, religion, foreign languages, anthropology, public policy and computer science, submitted content for the class.

This interactive approach, which I call Epic Course Design (ECD) after German playwright Bertolt Brecht’s theory of epic theater, represents a radical break with traditional course-building techniques. Generally, humanities instructors unilaterally choose the content of their syllabuses -- and rightly so. After all, we are the experts. But this solitary method of course construction does not reflect how humanists often actually teach.

Far from being viewed as passive receptacles of instructional data, humanities students are often engaged as active contributors. With this in mind, ECD offers a student-centered alternative to traditional course-building methods. Importantly, ECD does not allow students to dictate the content of a course; it invites them to contribute, with the instructor ultimately deciding which (if any) student-generated submissions merit inclusion on the syllabus.

Nevertheless, when a colleague of mine first heard about my plans to allow students to determine what was to be examined in Narratives Gone Viral, he was deeply skeptical: "But students don’t know what they don’t know," he objected. In my view, that is not a problem -- that is the point; or at least part of it. For crowdsourcing the curriculum not only invites students to submit material they are interested in, but also invites them to choose material they believe they already understand. Student-generated submissions for Narratives Gone Viral included popular YouTube videos like "He-Man sings 4 Non Blondes," "Inmates Perform Thriller" and "Miss Teen USA 2007- South Carolina answers a Question." While my students were already exceedingly familiar with these videos, they clearly didn’t always see what was at stake in them.

All of these works are worthy of academic scrutiny: the "He-Man" piece is interesting because it confronts preconceived notions of masculinity; "Inmates Perform Thriller" prompts questions of accessibility to social media; "Miss Teen USA" is notable because it reveals how viral videos often appeal to a viewer’s desire to feel superior to others.

I am not proposing that all humanities courses should integrate this approach. What I am suggesting, however, is that ECD represents a viable alternative to more familiar course-building methodologies. This includes classes that do not focus on social media and/or popular culture. Importantly, whether students will be interested in suggesting texts for, say, a course on medieval German literature is not the crucial question; in my view, the crucial question is: Why should we refrain from offering motivated students the opportunity to do so, if they wish?

There was relatively little repetition in student submissions for Narratives Gone Viral, an indication that students were reviewing posts made by their peers, weighing their options, and responding with alternative suggestions.

To put a finer point on the matter, students were not merely submitting course content: they were discussing the content of a course that -- in every traditional sense -- had yet to even begin.

Michael P. Ryan is a visiting assistant professor of German studies and the American Council of Learned Societies new faculty fellow at Duke University.

Essay on the importance of admitting when you mess up

Nate Kreuter writes that the best thing you can do when you mess up is to admit it and ask for help.

Essay on how to keep humanities vibrant by rejecting elite universities' models

In "Howl," a blistering poetical rant and perhaps the most important poem of the ’60s counterculture, Allen Ginsberg anatomizes the minds of his generation. They are young men and women who "studied Plotinus Poe St. John of the Cross telepathy and bop kabbalah because the cosmos instinctively vibrated at their feet in Kansas." When students come to our offices to consider studying the humanities, we can all recite the litany of reasons for doing so. It provides them with the critical thinking skills needed for success in any career; it endows them with the cultural capital of the world’s great civilizations; and it helps them explore what it means to be human.

But for those of us who have spent our lives studying the humanities, such reasons are often just the fossilized remains of the initial impulse that set us on our educational journey -- the feeling that Kansas was vibrating at our feet, and that to chart our futures we desperately needed to understand the meaning of that vibration.

The main challenge for the humanities teacher has always been to show how the great works of philosophy, literature, religion, history, and art answer to the good vibrations in our young people. But at the dawn of the 21st century the academic scaffolding of the humanities thwarts this fundamental goal. The central problem is that the Harvard University model of humanistic study dominates academia.

The Harvard model sees the humanities as a set of distinct and extensively subdivided disciplines, overseen by hyper-specialized scholars who produce disciplinary monographs of extraordinary intellectual subtlety and technical expertise. Though the abstruse work produced with this model periodically makes it the butt of media jokes, no one with an appreciation for good scholarship would want to eliminate the rigorous discipline represented by the work of scholars at Harvard and institutions like it. But neither should it be allowed to dominate the agenda of all higher education, which it now incontestably does, to the detriment of both the humanities and the students who want to understand the meaning of their unique vibration.

The disciplining of knowledge was central to the creation of the modern research university. In the second half of the 19th century, Harvard and then schools across the academic landscape dropped their common curriculum, creating instead departments and majors. Beginning with the natural sciences of physics, chemistry, and biology, this flowering of disciplines issued in countless discoveries and insights with repercussions far beyond the university. Flush with this success, the triumph of knowledge production, together with the 19th-century scientific methodology that was its seed, spread to the examination of society. The newly invented social sciences -- economics, sociology, anthropology and the like -- grabbed hold of the explosive new problems that followed in the wake of modern industrial life. But at the same time they marginalized the traditional questions posed in the humanities. The social sciences raised "humanistic" questions within the strictures of 19th-century positivist assumptions about scientific "objectivity," and they have been doing so, despite post-modern blows dealt to claims of objectivity, ever since.

As the natural and social sciences divided the world between themselves the humanities threatened to become a mere leftover, a rump of general reflections and insights that lacked the rigor of the special sciences. Eager to be properly scientific themselves, and thereby forestall such a humiliating fate, the humanities disciplined themselves. They sought to emulate the success of the sciences by narrowing their intellectual scope, dividing and subdividing their disciplines into smaller and ever smaller scholarly domains, and turning themselves into experts.

The norm became the creation of inward-looking groups of experts who applied a variety of analytic approaches to sets of increasingly technical problems. In short, the humanities found themselves squeezed by the demands for professionalization and disciplinization, the need to become another regional area of study analogous in form, if not in content, to the other special sciences. And the humanities have been content to play this disciplinary game ever since.

In the last 30 years, the rise of Theory promised to breathe a new, post-modern life into this disciplinary game. By the mid-20th century, the sterility of old-fashioned explication de texte was becoming apparent. The linguistic turn opened up a new way for the humanists to ape the rigor of the sciences while simultaneously extending their scholarly turf. In their zeal for technical rigor, they discovered to their delight that texts marvelously shift shape depending upon the theoretical language used in their analyses. Into the moribund body of the humanities flowed the European elixirs of psychoanalysis, phenomenology and hermeneutics, structuralism and post-structuralism, all of which boasted technical vocabularies that would make a quantum physicist blush. With these languages borrowed from other disciplines, the great books of the Western tradition looked fresh and sexy, and whole new fields of scholarship opened up overnight.

At the same moment, however, scholars of the humanities outside the graduate departments of elite universities suddenly found themselves under-serving their students. For the impulse that drives young people to the humanities is not essentially scholarly. The cult of expertise inevitably muffles the jazzy, beating heart of the humanities, and the students who come to the university to understand their great vibration return home unsatisfied. Or worse, they turn into scholars themselves, funneling what was an enormous intellectual curiosity through the pinhole of a respectable scholarly specialty.

Indeed, their good vibrations fade into a barely discernible note, a song they recall only with jaded irony, a sophisticated laugh at the naiveté of their former selves, as if to go to school to learn the meaning of their own lives were an embarrassing youthful enthusiasm. The triumph of irony among graduate students in the humanities, part of the déformation professionnelle characteristic of the Harvard virus, exposes just how far the humanities have fallen from their original state. As they were originally conceived, the humanities squirm within the research paradigm and disciplinary boxes at the heart of the Harvard model.

The term "humanities" predates the age of disciplinary knowledge. In the Renaissance, the studia humanitatis formed part of the attempt to reclaim classical learning, to serve the end of living a rich, cultivated life. Whether they were contemplative like Petrarch or engaged like Bruni, Renaissance humanists devoted themselves to the study of grammar, rhetoric, logic, history, literature, and moral philosophy, not simply as scholars, but as part of the project of becoming a more complete human being.

Today, however, the humanities remain entrenched in an outmoded disciplinary ideology, wedded to an academic model that makes it difficult to discharge this fundamental obligation to the human spirit. Despite the threat of the Great Recession, the rise of the for-profit university, and a renewed push for utility, the humanities continue to indulge their fetish of expertise and drive students away. Some advocate going digital, using the newest techno and cyber techniques to improve traditional scholarly tasks, like data-mining Shakespeare. Others turn to the latest discoveries in evolutionary psychology to rejuvenate the ancient texts. But both of these moves are inward looking — humanists going out into the world, only to return to the dusty practices that have led the humanities to their current cul-de-sac. In so doing, colleges and universities across the country continue to follow the Harvard model: specialize, seek expertise, and turn inward.

When Descartes and Plotinus and Poe and St. John of the Cross created their works of genius, they were responding not to the scholar’s task of organizing and arranging, interpreting and evaluating the great works of the humanistic tradition, but rather to their own Kansas. Descartes and Rousseau were latter-day Kerouacs, wandering Europe in search of their souls. These men and women produced their works of genius through a vibrant, vibrating attunement to the needs of their time.

The Humanities! The very name should call up something wild. From the moment Socrates started wandering the Greek market and driving Athenian aristocrats to their wits' end, the humanities' place has always been out in the world, making connections between the business of living and the higher reaches of one's own thought, and drawing out implications from all that life has to offer. The genius of the humanities lies in the errant thought, the wild supposition, the provocation — in Ginsberg's howl at society. What this motley collection of disciplines is missing is an appreciation of the fact that the humanities have always been undisciplined, that they are essentially non-disciplinary in nature. And if we want to save them, they have to be de-disciplined and de-professionalized.

De-disciplining the humanities would transform both the classroom and the curriculum. Disengaging from the Harvard model would first and foremost help us question the assumption that a scholarly expert in a particular discipline is the person best suited to teaching the subject. The quality that makes a great scholar — the breadth and depth of learning in a particular, narrow field — does not make a great teacher; hungry students demand much more than knowledge. While the specialist is hemming himself in with qualifications and complications, the broadly educated generalist zeroes in on the vital nub, the living heart of a subject that drives students to study.

While a scholarly specialist is lecturing on the ins and outs of Frost's irony, the student sweats out his future, torn between embracing his parents' dream of having a doctor in the family or taking the road less traveled and becoming a poet. The Harvard model puts great scholars in charge of classrooms that should be dominated by great teachers. And if the parents who are shelling out the price of a contemporary college education knew their dollars were funding such scholarly hobbyhorses, they would howl in protest.

De-disciplining the humanities would also fundamentally change the nature of graduate and undergraduate education. At the University of North Texas Department of Philosophy and Religious Studies, located in the Dallas Metroplex, we are training our graduate students to work with those outside their discipline — with scientists, engineers, and policy makers — to address some of the most pressing environmental problems the country faces. We call it field philosophy: taking philosophy out into the world to hammer out solutions to highly complex and pressing social, political, and economic problems. Graduate students participate in National Science Foundation grants and practice the delicate skill of integrating philosophic insights into public policy debates, often in a "just-in-time" manner. In class they learn how to frame and reframe their philosophical insights into a variety of rhetorical formats, for different social, political, economic purposes, audiences and time constraints.

At Calumet College of St. Joseph, an urban, Roman Catholic commuter college south of Chicago that serves underprepared, working-class Hispanic, African-American, and Anglo students, we are throwing the humanities into the fight for social justice. Here the humanities are taught with an eye toward creating not a new generation of scholars, but a generation of humanely educated citizens working to create a just society. At Calumet, students are required to take a social justice class.

In it they learn the historical and intellectual roots of Catholic social justice teaching within the context of performing ten hours of community service learning. They work in a variety of social service fields — with children, the elderly, and the homeless, for example — which exposes them to the real-life, street-level experience of social challenges. Before, during, and after their service, students bring this experience back to the classroom to deepen it through reflective papers and class discussion.

High-level humanistic scholarship will always have a place within the academy. But to limit the humanities to the Harvard model, to make scholarship rather than, say, public policy or social justice, the highest ideal of humanistic study, is to betray the soul of the humanities. To study the humanities, our students must learn textual skills, the scholarly operations of reading texts closely, with some interpretive subtlety. But the humanities are much more than a language game played by academic careerists.

Ultimately, the self-cultivation at the heart of the humanities aims to develop the culture at large. Unless they end up where they began — in the marketplace, alongside Socrates, questioning, goading, educating, and improving citizens — the humanities have aborted their mission. Today, that mission means finding teachers who have resisted the siren call of specialization and training undergraduate and graduate students in the humanities in the art of politics.

The humanist possesses the broad intellectual training needed to contextualize social problems, bring knowledge to bear on social injustice, and translate insights across disciplines. In doing so, the humanist helps hold together an increasingly disparate and specialized society. The scholasticism of the contemporary academy is anathema to this higher calling of the humanities.

We are not all Harvard, nor should we want to be.

Chris Buczinsky is head of the English program at Calumet College of St. Joseph in Whiting, Indiana. Robert Frodeman is professor of philosophy and director of the Center for the Study of Interdisciplinarity at the University of North Texas.
Essay on students who are engaged

Books abound about student disengagement. We read about their apathy and indifference to the world around them. Data, sadly, support these claims. Youth voting rates are low, especially when President Obama isn't on the ballot, and while some students do participate in community activities, critics have noted that some of this engagement is the product of high schools "mandating" volunteerism as a graduation requirement.

My experiences – both as a political scientist and as a dean of the school of liberal arts at the Savannah College of Art and Design – suggest that we administrators and professors doth protest too much. Give our students a compelling text and topic, and they will engage.

I recently visited a philosophy class in which Plato's Republic was assigned. The students were tackling Book Six, where questions spill off the pages about who should rule, and what qualities make for a viable ruler. Can a "rational" person, removed from impulses and passions, command and lead? How can, or should, one remove oneself from temptation and emotion? Can the rational and emotive be separated? Do citizens trust those who are like them? How much of leading and governing is about the rational, and how much is about appearances and images?

As the professor and I raised these questions, I noticed immediately that the students had done the reading. We administrators read about how today’s students do not read. But these students – all of whom were non-liberal arts majors – had immersed themselves in the text. They were quoting passages and displaying keen interest, both in the text itself and the questions that were being raised. It is not surprising that Plato enlivened the classroom. But these future artists and designers recognized the power of the text. They appreciated how the words had meaning, and the questions were worth exploring.

This experience, and others like it, gave me pause. We administrators may need to tweak our conceptions of our students. Sure, Academically Adrift is an important book, and yes, the data show that the degree of reading comprehension has declined. But we should not misconstrue those data as tantamount to disengagement, nor should we assign fewer readings simply because there are data showing that many students do not complete reading assignments. This recommendation — assigning less reading and teaching it in greater depth — was one of the suggestions made by José Antonio Bowen, author of Teaching Naked, in his dynamic and imaginative keynote address at this year's annual meeting of the Association of American Colleges and Universities.

The point here is not to debate Bowen's recommendation — that is for another time and place. Similarly, I am well aware that this experience in Philosophy 101 may be unique, and may not generalize. (I should add that I have also encountered students excited about discussing big ideas in other classrooms I have visited — in photography and art history, for example.)

This enthusiasm is not a recipe for assigning Plato in every class, although that is an idea that most definitely would generate discussion. That written, I believe that we should reconsider how we administrators and educators think about student engagement. It is more than knowledge about civics and current events. It is bigger and deeper than service learning, or a passion to work in one’s community.

Provide students with a compelling text and a professor who knows how to raise thought-provoking questions, and students will ponder, debate and imagine the world in new and different ways. They will learn how to think critically and creatively. Cultivating that form of student engagement is no easy task, but it begins by exposing students to great texts and great ideas. Engagement is more than a form of political participation. It is the core of the liberal arts.
 

Robert M. Eisinger is dean of the School of Liberal Arts at the Savannah College of Art and Design.

