Charleston Southern University has been facing widespread criticism online for firing a professor for, he says, allowing his image to be used on a beer can for a fund-raiser, apparently offending the anti-alcohol stance of some at the Baptist university. On Tuesday, the university issued a statement criticizing the coverage of the dismissal of Paul Roof, whose former students and others say he was among the best faculty members at the university, The Post and Courier reported. Most articles have suggested that it was the image of a professor on a beer can that caused the problem, not that Roof was fired either for drinking or having a beard (he is known for his beard). The Charleston Southern statement doesn't say why Roof was fired, but says that "it should be clear that the matter involving Dr. Roof is, in no way, premised upon the actual consumption of alcohol. Second, the matter has nothing to do with the presence of facial hair. Thirdly, charitable activities among faculty and staff are encouraged by the university."
Roof responded on his Facebook page, saying: "You are old school friends, colleagues, students, business owners, church members, beard community friends, and much more from all across the city, the state and the country. Community is about helping each other in times of need. You have been here for me and I will be there for you."
Nietzsche’s injunction is terse and direct, but simple it isn’t. Just about the most hopelessly off-target paraphrase possible would be that familiar bit of advice to anybody facing a socially anxious situation: “Relax and just be yourself!” The philosopher has something altogether more strenuous in mind: an effort in which “what you are” includes both raw material and the capacity to shape it. The athlete, musician, or artisan is engaged in such a process of becoming -- the strengthening, testing, and refining of “potentials” that can barely be said to exist unless strengthened, tested, and refined.
Nietzsche’s influence on Sigmund Freud has always been a vexed matter. (Perhaps especially for Freud himself, who always denied that there was one, despite abundant evidence to the contrary.) Adam Phillips avoids the question entirely in Becoming Freud: The Making of a Psychoanalyst, a new title in the Yale University Press series Jewish Lives. The omission seems doubly odd given that Phillips himself is a psychoanalyst: Freud’s repeated but not quite credible insistence that he'd never been able to read more than half a page of the philosopher’s work sure does look like a symptom of, to borrow Harold Bloom’s expression, the anxiety of influence.
Originally presented as the Clark Lectures at the University of Cambridge earlier this year, Becoming Freud makes no claim to compete with the major biographies by Ernest Jones and Peter Gay. The annual lecture series (begun in the 19th century to honor a Shakespeare scholar who was a fellow at Trinity College) is dedicated to aspects of literature. But the book touches on Freud’s literary interests only intermittently.
Warrant for discussing the founding patriarch of psychoanalysis in the same venue where T. S. Eliot lectured on metaphysical poetry lies, rather, in the status of Freud’s work. It is “of a piece," Phillips says, "with much of the great modernist literature, all of which was written in his lifetime; a literature in which — we can take the names of Proust, Musil, and Joyce as emblematic — the coherent narratives of and about the past were put into question … [during] a period of extraordinary energy and invention and improvisation.”
At the same time, Freud’s participation in the upheaval was not a matter of choice or preference. He showed “little interest in contemporary art, and was dismissive of Surrealism, which owed so much to him; he had no interest whatsoever in opera or music, something of a feat in the Vienna of his time.”
The case studies he published bore proper medical titles (e.g., “Notes Upon a Case of Obsessional Neurosis” or "Analysis of a Phobia in a Five-Year-old Boy”) and presented what Freud considered rigorous methods for a scientific understanding of the human psyche. But they read like short stories or novellas, and are now usually remembered for the pseudonyms assigned to the patients (“the Rat Man” and “Little Hans,” respectively) whose stories Freud tells and interprets. He wrote the papers as technical literature, not “creative nonfiction,” and the blurring of genres troubled him. Getting the ideas taken seriously by his peers was hard enough without his being taken for an experimental author as well.
A fluent and renowned essayist in his own right, Phillips has a knack for aphorisms and apothegms that, after a few pages, tends toward a rather oblique mode of accessibility. It’s been said that while his work always feels brilliant while you’re reading it, that’s the only thing you can remember about it afterward. And there is something to the complaint, much of the time. Becoming Freud is an exception, I think. The chapters add up in a way that his essays, when collected between covers, generally do not.
The book assumes at least some familiarity with Freud’s own life and work, as well as an immunity to caricatures of them. That thins out the potential audience considerably. But for the reader with a little traction, Becoming Freud is one of the more suggestive books on its subject to come along in a while.
The author takes as a central point Freud’s hostility to biography -- expressed in his late 20s, well before establishing himself professionally, let alone developing new ideas. A biographer gathers up documents and recollections, and assembles them into causal sequences revealing the shape and coherence of someone’s life. Which is not just a presumptuous task but one vulnerable to all the tricks of memory and private agendas (acknowledged and otherwise) of everyone involved.
“This, for Freud, would be faux psychoanalysis,” writes Phillips. “Freud revealed to us that when it comes to motive no one can speak for anyone else. And that more often than not people resist speaking on their own behalf.” What they do instead is to come up with stories, explanations, and assumptions that seem to make life coherent, at the risk of trapping them into "buried-alive lives” — both driven and burned out by "the inextricability of their ambitions and their sexuality.”
The alternative, of course, is analysis. Just for the record, I am not quite persuaded by that claim. (Karl Kraus’s remark that psychoanalysis is the very disease that it pretends to cure seems a lot more on the money, pardon the expression.) But Freud's fundamental insight retains its force: people are, in Phillips’s words, “the only animals that [are] ambivalent about their development,” that “longed to grow up” but "hated growing up, and sabotaged it.”
Freud's patients came from that portion of the population which could not find a practical way to combine desire, frustration, and misery in socially acceptable ways. And as a Jew working in Vienna (the city that elected a candidate from the Anti-Semitic League as mayor in 1896, while Freud was deep in struggle with his own emotions following his father’s death) he may have been at the perfect vantage point to develop his understanding of modern life as a process that, Phillips writes, "selected out the parts and versions of the individual that were unacceptable to the state and left the individual stranded with whatever of himself didn’t fit in.” The personality becomes a regime "in which vigilant and punitively repressive authorities are in continual surveillance.”
Becoming Freud doesn’t narrate the development of psychoanalytic ideas or try to put them in social and cultural context; or rather, it does so only incidentally. It is primarily a book about how Freud became someone able to think such thoughts, in such a context (how he became what he was) despite all the resistance that effort always generates. The book ends with its subject at the age of 50, with most of 35 more difficult and productive years ahead of him. I hope the author finds an occasion to write about those later decades -- about how Freud occupied and managed what he had become.
In today’s Academic Minute, David Kaplan, professor of biomedical engineering at Tufts University, discusses the potential benefit of replacing metal with silk in surgeries.
To: Dean of College of Liberal Arts and Social Sciences
Subject: Trigger Warnings
In order to anticipate potential liability issues arising from the teaching of humanities and social science courses, we have reviewed the syllabuses across your college’s departments, with particular attention given to the impacting of racial and ethnic themes on our clientele’s (aka students’) emotional well-being. We have provisionally concluded that the English department can continue to teach The Adventures of Huckleberry Finn and The Merchant of Venice, while taking into careful consideration the sensibilities of African-American, Jewish and related niche audiences.
But in the course of our investigation, we found other reasons to anticipate future legal and public relations challenges for the university. With the support of the offices of student services and marketing and communications, which coordinated several focus groups, we found several books that could become the subject of class action suits. Please find below five examples from our full list that, if present campus trends continue, will raise red flags.
Homer's The Iliad and The Odyssey
Students were disturbed by Homer’s “relentless” depiction of mayhem and gore: “Like the X-Men franchise, but Wolverine is definitely a more likable mutant than Achilles,” concluded one respondent. Several students objected to the treatment of women -- mostly relegated to domestic activities or war booty -- and demanded to know if there were other epic poems by blind Archaic Greek bards that offered examples of female empowerment.
Also, a small but vocal number of students wearing PETA t-shirts protested the “inhumane” treatment of the dog Argos, left to die on a dung heap. Given the youthful impressionability of our customer base, we find potential problems with the Lotus-eater episode, as well as the character Helen’s liberal use of pharmacological agents.
Anonymous' "The Book of Job"
“Are you sure this is part of the Bible?” asked many respondents, who also exhibited intense unease with God’s actions, as they did with Job’s questions. The mounting suspense in waiting for God to reply adversely impacted many students (as did the irritation factor supplied by Job’s friends).
While the groups’ expectations were raised when a voice came from the whirlwind, they were deflated by the voice’s answers -- which, according to one respondent, weren’t answers at all. (“Like my parents, only worse.”) At the end of the session, a palpable sense of dread, along with isolated cases of fear and trembling, was in evidence -- all matters of concern for our office.
Virgil's The Aeneid
Though we were informed this work combines the two “Homeric” poems in one, the focus groups concluded it was somehow longer. Respondents were disturbed by the negative depiction of the character Dido -- “If she, like, died ‘before her time,’ how fair is that?” -- while the character Juno also elicited negative comments: “Clearly the product of a harsh patriarchal society determined to depict independent women as hysterical and dangerous.”
More generally, respondents were disoriented by Virgil’s habit, in the words of one participant, “to undermine the Roman values he pretends to uphold.” We find sufficient grounds for concern that students might argue they cannot be expected to give clear answers on their final exam if Virgil could not give any in his final poem. Our staff also suggests that more litigious individuals will claim that if Virgil could leave his poem unfinished, they could do the same with their exam.
Machiavelli's The Prince
Several students spoke of their emotional distress after reading the author’s claim that for a ruler, “something resembling good will lead to his ruin, while something resembling vice will lead to power.” Other students, however, announced their decision to run for president of their fraternity and sorority chapters.
Significant liability potential resides in the author’s use of Cesare Borgia as a role model: his praise of Borgia’s public “dicing and slicing” (in one participant’s phrase) of a subordinate does not reflect the “brand” values of our university.
Beckett's Endgame
Our office for students with special needs signaled its concern over the presence of two characters with disabilities -- they lost “their shanks in the Ardennes” -- who are confined to garbage pails. The office also worries that two other characters -- one who cannot sit down, the other who cannot stand up -- appear indifferent to this situation.
We cannot decide which is more problematic for the university: those respondents left despondent by the play’s existential desolation, epistemological doubts and ethical despair, or those respondents who kept giggling. In general, it remains to be seen whether, when it comes to the trigger warning controversy, we can’t go on or must go on.
Rob Zaretsky is a professor of French history at the University of Houston's Honors College and author, most recently, of A Life Worth Living: Albert Camus and the Quest for Meaning.
“What would the United States look like if we really gave up on liberal education and opted only for specialized or vocational schools? Would that really be such a bad thing?”
The interviewer was trying to be provocative, since I had just written a book entitled Beyond The University: Why Liberal Education Matters. What exactly would be the problem, he went on, if we suddenly had a job market filled with people who were really good at finance, or engineering, or real estate development?
Apart from being relieved that he hadn’t included expertise in derivatives trading in his list of specializations, I did find his thought experiment interesting. Would there be real advantages to getting students to hunker down early into more specific tracks of learning? In that way they would be “job ready” sooner, contributing more quickly to the enterprises of which they are a part, and acquiring financial independence at the same time. Would that really be such a bad thing?
The debate between those who want students to specialize quickly and those who advocate for a broad, contextual education is as old as America itself. The health of a republic, Thomas Jefferson argued, depends on the education of its citizens. Against those arguing for more technical training, he founded the University of Virginia, emphasizing the freedom that students and faculty would exercise there. Unlike Harvard University and its many imitators, devoted to predetermined itineraries through traditional fields, he said, Virginia would not prescribe a course of study to direct graduates to “the particular vocations to which they are destined.”
At Mr. Jefferson’s university, “every branch of science, useful at this day, may be taught in its highest degree.” But who would determine which pursuits of knowledge would prove useful?
Jefferson, a man of the Enlightenment, had faith that the diverse forms of learning would improve public and private life. Of course, his personal prejudices limited his interest in the improvement of life for so many. However, his conception of “useful knowledge” was capacious and open-ended – and this was reflected in his design for the campus in Charlottesville. He believed that the habits of mind and methods of inquiry characteristic of the modern sciences lent themselves to lifelong learning that would serve one well whether one went on to manage a farm or pursue a professional career. It is here we see the dynamic and open-ended nature of Jefferson’s understanding of educational “usefulness.”
His approach to knowledge and experimentation kept open the possibility that any form of inquiry might prove useful. The sciences and mathematics made up about half of the curriculum at Virginia, but Jefferson was convinced that the broad study of all fields that promoted inquiry, such as history, zoology, anatomy and even ideology would help prepare young minds. The utility was generally not something that could be determined in advance, but would be realized through what individuals made of their learning once outside the confines of the campus. The free inquiry cultivated at the university would help build a citizenry of independent thinkers who took responsibility for their actions in the contexts of their communities and the new Republic.
Jefferson would have well understood what many business leaders, educators and researchers recognize today: that given the intense interconnection of problems and opportunities in a globalized culture and economy, we require thinkers who are comfortable with ambiguity and can manage complexity. Joshua Boger, founder of Vertex Pharmaceuticals (and chair of the board at Wesleyan University), has pointed out how much creative and constructive work gets done before clarity arrives, and that people who seek clarity too quickly might actually wind up missing a good deal that really matters. Boger preaches a high tolerance for ambiguity because the contemporary world is so messy, so complex.
Tim Brown, CEO of IDEO, one of the most innovative design firms in the world, has lamented that many designers “are stuck with an approach that seems to be incapable of facing the complexity of the challenges being posed today.” He calls for a flexible framework that leaves static blueprints behind, since “open-ended, emergent, evolutionary approaches to the design of complex systems can result in more robust and useful outcomes.” Like many CEOs across the country, Brown recognizes that more robust and useful outcomes will come from learning that is capacious and open-ended -- from liberal education.
At the Drucker Forum last year, Helga Nowotny, president of the European Research Council, described what she called the “embarrassment of complexity” – efforts based in data analysis to dissolve ambiguity that lead to more conformity and less creativity. She called for an ethos among business and government leaders that would instead “be based on the acknowledgement that complexity requires integrative thinking, the ability to see the world, a problem or a challenge from different perspectives.” That’s a call for integrative thinking based in liberal learning.
In America, liberal education has long been animated by the tension between broad, open-ended learning and the desire to be useful in a changing world. Calls for dissolving this tension in favor of narrow utilitarian training would likely produce just the opposite: specialists unprepared for change who will be skilled in areas that may quickly become obsolete.
So, what would America look like if we abandoned this grand tradition of liberal education? Without an education that cultivates an ability to learn from the past while stimulating a resistance to authority, without an education that empowers students for lifelong learning and inquiry, we would become a cultural and economic backwater, competing with various regions for the privilege of operationalizing somebody else’s new ideas. In an effort at manic monetization without critical thinking, we would become adept at producing conformity rather than innovation.
The free inquiry and experimentation of a pragmatic liberal education open to ambiguity and complexity help us to think for ourselves, take responsibility for our beliefs and actions, seize opportunities and solve problems. Liberal education matters far beyond the university because it increases our capacity to shape a complex world.
The California State University System is planning to hire 700 full-time tenure-track faculty members, reversing a decline in the number of positions in recent years, The Los Angeles Times reported. From 2008 to 2013, the number of faculty members either tenured or tenure-track fell from 10,700 to 9,800 -- while enrollment and the use of adjuncts increased. With a better state budget picture, Cal State hopes to reverse that trend, although the hiring would not restore 2008 levels, even though enrollment is up.