Over the last few years, there has been no shortage of news coverage and commentary remarking on the death -- real or greatly exaggerated -- of the liberal arts in American higher education.
We are not alone in thinking that the debate about the relevance of the liberal arts is tired and simplistic. To our minds, the liberal arts are as relevant as ever -- as a means of enriching lives, developing engaged citizens and nurturing foundational professional skills.
But if the public, rightly or wrongly, is becoming increasingly skeptical of the value of the liberal arts -- and enrollment trends at certain institutions suggest that it may be, at least in some measure -- then schools of liberal arts will have to accept some share of the blame themselves.
Undoubtedly, public pronouncements arguing that we need “more welders and less philosophers,” as former presidential candidate Marco Rubio claimed late last year, irk many in the liberal arts -- and not solely because of Rubio’s poor use of grammar. This notion that liberal arts graduates are terminally unemployable is achieving a kind of -- to borrow Stephen Colbert’s famous neologism -- truthiness. And that kind of misinformation can be particularly frustrating to faculty members and students who have devoted their energy and enthusiasm to these fields of study, and enjoyed successful careers as a result of it.
The fact is, as researchers in the field of employability will tell you, a great many organizations have a real interest in hiring college graduates possessing communication and reasoning skills blended with technical expertise and strong character. In an Inside Higher Ed commentary from early 2016, Burning Glass CEO Matthew Sigelman argued that his firm’s research on labor demand has shown that many of the fastest-growing jobs are hybrid in character, requiring “people who can bridge domains and synthesize ideas.” Few would argue that the liberal arts don’t have a contribution to make in producing these sorts of graduates.
Still, frustrating though they may be, news headlines and political commentary aren’t the real obstacles to sustaining the future of the liberal arts. That challenge has less to do with media perceptions or careless politicizing than with the traditional organizational structures and curricular approaches of schools of liberal arts themselves. Here’s what we mean.
Departmental structures can be inflexible and inhibit creative responses to changing market expectations. At a number of liberal arts institutions we work with, faculty express great interest in interdisciplinary work and other forms of innovation. In some respects they find organizational structures -- the proliferation of schools, departments, divisions, units -- just as frustrating and inhibiting as administrators do. But when faculty become uneasy with the tenor of the public debate about the contribution of the liberal arts and feel threatened, they often rely on these structures as a bulwark against change. Others may resist on principle any move that might be perceived as drifting toward vocationalism, or any focus on work readiness associated with linking the liberal arts to professional programs.
In both cases, the result can be the same: faculty hunker down. They look at the growth of faculty lines in engineering or business and argue that their departments would grow, too -- if only similar investments were made in their faculty. Of course, increasing capacity doesn’t automatically increase enrollments. Yet for those individuals, the fight for resources is viewed as a zero-sum game, and some faculty members and department chairs would seek to preserve the structures that they know rather than risk reorganizing in ways that merge departments or explicitly require collaboration with the professional disciplines -- even if such changes might deliver more value to students. But of course, such mergers and collaborations are possible where adjacent disciplines complement one another -- such as writing and English programs or communications and performing arts. Restructurings of these sorts can not only avoid unnecessary redundancies in staff positions and other organizational overhead, but also foster the development of a more contemporary curriculum and enrich the student experience.
Departmental structures can constrain the evolution and effectiveness of general-education curricula. As the number of majors in the liberal arts disciplines continues to fluctuate, general-education programs may be seen as an increasingly powerful mechanism for promoting traditional liberal arts values. But they can also offer students new forms of interdisciplinary intellectual exposure via minors or other ways of bundling sequences of courses.
For departments with declining majors, general-education course enrollments are frequently seen by faculty as crucial evidence of their value. As a result, faculty members and department chairs often resist restructuring general-education programs in ways that might deviate from the more immediately measurable performance models based on numbers of department majors -- even if such restructurings may lead to more relevant and flexible curricula for students. For example, while the contemporary student may derive significant value from experiential learning components and interdisciplinary capstone courses, their inclusion in general-education programs is often met with resistance by faculty because such courses fall outside the traditional disciplinary or departmental structure.
Departmental structures can necessitate organizational workarounds, such as the creation of interdisciplinary liberal arts centers or institutes, to find a home for innovation. While interdisciplinary centers or institutes can serve as vital catalysts for innovation and collaboration across the disciplines, merely establishing them will not necessarily overcome the force of decades of departmentally focused priorities. As a result, these interdisciplinary centers can sometimes evolve into isolated interdisciplinary silos. Indeed, the lack or perceived lack of incentives for faculty involvement, a misalignment with departmental promotional criteria and the absence of clear expectations with respect to the roles that particular departments or disciplines are meant to play in these centers can all contribute to their eventual marginalization and failure -- which can make it even more challenging to recruit and retain high-potential faculty. Paying lip service to interdisciplinarity isn’t sufficient. In fact, it just exacerbates tensions between units and can make numerous departments less productive. What’s required is a commitment to interdisciplinarity and the centers that promote it as hubs of cross-discipline engagement, for faculty and students alike.
Our view is that the liberal arts matter. Why? Because they prepare students to reason and solve problems, because they develop critical communication skills, and because they teach students how to engage in a process of discovery -- whether it be intellectual discovery, self-discovery or professional discovery. If schools of liberal arts put these same skills to work in examining their own efforts and organizational structures, the liberal arts might well flourish.
Such schools would be more apt to bring together data analytics and the study of literature, or revolutionize the way they think about the role and contribution of general-education programs, or promote liberal arts minors for engineers and biologists in lieu of fighting for more majors within the liberal arts. They might, in other words, rethink the longstanding organizational structures that have housed -- and for many years nurtured -- the liberal arts, but which have now begun to constrain and limit their impact.
Peter Stokes is a managing director and Chris Slatter is a manager in the higher education practice at Huron Consulting Group.
Around this time 20 years ago, I met an elderly gentleman who’d had what sounded like an exceptionally interesting and unusual dissertation-writing experience. A couple of recent coincidences bring the encounter to mind and so inspired this little causerie.
His name was Harmon Bro, and he was in his late 70s when we met. He’d spent the better part of 50 years as an ordained minister and Jungian psychotherapist. If anyone ever looked the part of a Jungian archetype, it was Harmon, who personified the Wise Old Man. In 1955, the University of Chicago Divinity School awarded him a Ph.D. after accepting a doctoral thesis called “The Charisma of the Seer: A Study in the Phenomenology of Religious Leadership.”
It was based in part on work Harmon did in his early 20s as an assistant to Edgar Cayce, “the sleeping prophet.” Despite minimal education, Cayce, it is said, could give long, extemporaneous discourses in response to questions posed to him while he was in a trance state. Among these “readings” were medically sophisticated diagnoses of people miles or continents away, as well as detailed accounts of ancient history and predictions of the future.
Cayce died in 1945, but he left a vast mass of transcripts of his “readings.” By the 1960s, publishers were mining them to produce a seemingly endless series of paperback books extolling Cayce’s powers. Insofar as the New Age can be said to have founding figures, he was one of them.
Harmon was clearly a believer in Cayce’s miraculous powers. I was not (and am not) but have always enjoyed the legends by and about Cayce. As a schoolboy, for example, Cayce would put a textbook under his pillow and absorb its contents while asleep. He graduated (so to speak) to the Akashic Records -- an ethereal library documenting life on Atlantis and in ancient Egypt, and much else besides. He could also see into the future, but the track record is not impressive: China did not convert to Christianity in 1968, nor did Armageddon arrive in 1999. Cayce also predicted that an earthquake in the 1960s would cause California to sink into the Pacific Ocean. It remains attached to the continental United States as of this writing.
Harmon didn’t take skepticism as a threat or an insult, and anyway I preferred listening to arguing. He stressed how very improbable Cayce had been as a subject for serious scholarly attention in the 1950s -- at the University of Chicago, no less. It took three or four tries to get his topic approved; by the time the dissertation was finished and accepted, it felt like every faculty member concerned with the history and psychology of religion had weighed in on it. He happily lent me a copy (when anyone expresses interest in a decades-old dissertation, its author will usually have one of two responses: pleasure or horror), and from reading it, I could see that the scrutiny had been all for the best. It obliged him to practice a kind of methodological agnosticism about Cayce’s powers, and he demonstrated a solid grounding in the social-scientific literature on religion -- in particular, Max Weber’s work on prophetic charisma.
But by 1996, Harmon Bro was not at all happy with the institutions routinizing that charisma. The man he’d known and studied had an ethical message -- “love thy neighbor as thyself,” more or less. The New Age ethos amounted to “love thyself and improve thy karma.” You didn’t have to share his worldview to see his point.
The timing was fortunate: we grew acquainted during what proved to be the final year of Harmon Bro’s life. His obituary in the Chicago Tribune in 1997 made no reference to Cayce, but looking it up just now leaves me with a definite feeling of synchronicity: Harmon died on Sept. 13, which is also the date I’m finishing this piece. A message from Harmon, via the cosmic unconscious?
Probably not, although it was another and even more far-flung coincidence that reminded me of him in the first place. On Friday, the journal Nature Communications published a paper called “Terahertz time-gated spectral imaging for content extraction through layered structures,” which the science-news website EurekAlert kindly translates into laymanese as “Researchers prototype system for reading closed books.” Not by putting them under a pillow and sleeping on them, alas, but it’s impressive even so.
Researchers at the Massachusetts Institute of Technology and the Georgia Institute of Technology collaborated in developing a system that uses bursts of terahertz radiation (“the band of electromagnetic radiation between microwaves and infrared light,” says EurekAlert) to create images of the surfaces of individual pieces of paper in a stack. Ink in a printed letter absorbs the radiation differently from the blank page around it; the contrast between the signals reflecting back is fed into an algorithm that identifies the letter on the page. The prototype can “read” the surfaces of up to nine pages in a pile; with more work, reading at greater depths seems possible. The story quotes one of the researchers as saying, “The Metropolitan Museum in New York showed a lot of interest in this, because they want to, for example, look into some antique books that they don’t even want to touch.” The signal-sorting algorithm may yet enable spambots to defeat captchas. (Which arguably represents grounds for halting research right away, though that is unlikely.)
The train of association between breaking technological news from last week and the memory of one of the more generous and unusual people to cross my path is admittedly twisty and random. On the other hand, reading by terahertz radiation seems like another example of Clarke’s Third Law: “Any sufficiently advanced technology is indistinguishable from magic.”
For graduating high school seniors who are entering college this fall, it is an exciting time. Possibilities have been opened! Yet now new concerns arise: Have they chosen the right college? Will they thrive?
These are hard questions for any young adult, but for those with autism, the stakes are especially high. A 2015 Autism Speaks report found that only 30 percent of high school graduates with autism ever attend a two- or four-year college, and those who do tend to fare poorly. Research suggests that 80 percent of them never graduate. Furthermore, only 32 percent of high school graduates with autism find paying work within two years of graduating high school. This need not be. Half of all individuals with autism have average or above-average intelligence. They can do the work. The problem is not the students. It’s the colleges.
We come to this issue from an unusual perspective. One of us, Elizabeth, studies at Pasadena City College and has autism. The other, Margaret, teaches at California State University at Los Angeles, and -- in addition to being Elizabeth’s mother -- has worked with students on and off the spectrum. Together, we have seen the many ways that colleges fail students with autism.
Federal legislation, including the Americans With Disabilities Act, mandates that colleges provide reasonable accommodations for disabled students. But common accommodations, such as providing a quiet exam setting, don’t adequately address the problems faced by many students with autism.
As autism scholars Ernst VanBergeijk, Ami Klin and Fred Volkmar note, autism is a social disability. The inherent qualities of autism -- resistance to change, sensitive sensory systems, weakness at reading social cues and a tendency to take language literally -- interfere with communication and social engagement. A quiet exam room will not help students overcome those barriers. The problems students with autism face are more insidious.
Elizabeth, for example, struggles with understanding if professors are being sarcastic or rhetorical. Uncertain, she often responds too much or too little. When one professor expressed frustration at her eager hand raising, she asked privately if he would signal her when he wasn’t being serious or didn’t require a response. “No,” he said. “I don’t need to change my teaching for you, and you need to learn sarcasm.”
It would be easy to regard Elizabeth’s experience as exceptional, the product of one unsympathetic professor. Yet research out of Australia by Ru Ying Cai and Amanda L. Richdale confirms how common such experiences are. In focus groups, autistic college students told story after story about metaphorical or abstract language leading to confusion, as well as loud, active classrooms challenging their abilities to focus on learning. For many, the frustrations became too great, leading to stress, anxiety and regrettable outcomes. However, when students felt their social needs were met -- in particular when faculty members proved willing to modify their teaching style -- students had much more positive experiences.
But American professors are not required to modify their teaching style for disabled students, and colleges are not required to think about the social, communicative needs of any students, let alone those with autism. Those things are not considered reasonable accommodations. But if autism is indeed a social disability, then denying the social needs of autistic students is inherently unreasonable.
It would help if faculty members understood how autism affects learning. But professors are busy. They juggle many demands, and professional development is often low on their to-do lists. At Margaret’s university -- which houses an outstanding center for teaching and learning development -- professional development seminars are often poorly attended, especially those focused on helping students with special needs. At one seminar on working with hearing-impaired students, Margaret was one of three instructors to show up, and if our conversations with colleagues and peers are indicative, then Margaret’s experience is a common one. Even when given the opportunity to learn more about the needs of disabled students, professors turn those choices down.
Some positive changes are underway. More than 100 colleges now offer programs for students with autism, but most of them are private, expensive, residential programs. Meanwhile, research suggests that up to 80 percent of college students with autism pass through community colleges at some point, where students, often still highly dependent on family support, can live at home. Those institutions generally offer fewer resources for students with autism. If we are to meet the needs of neurodiverse students, public community colleges will need to lead the way.
At a time when most community college disability offices are underfunded -- Elizabeth’s community college does not even provide note takers -- meeting the needs of students with autism may seem daunting. But meaningful institutional changes do not need to strain budgets. For Elizabeth, the greatest support has often come from students who have chosen to act as social interpreters. A whispered word or two is often all she needs to better and more appropriately engage with her curriculum. Colleges like California State University at Fullerton already have mentorship programs that pair neurotypical and neuroatypical classmates.
We recommend expanding such programs so that peer mentors -- perhaps those offered the coveted privilege of priority registration -- work side by side with autistic students in the classroom. Of course, that raises privacy concerns. Peer mentors can only work with students who are willing to self-identify in the classroom as having autism, which is why autistic students themselves must also be involved in making campuses more responsive to their needs -- and that will only happen when students with autism bring neurodiversity into conversations about campus diversity.
Until that happens, faculty can do a lot to foster feelings of safety and inclusion for all students -- both with autism and without. Elizabeth advocates for simple kindness, acceptance and the understanding that some disabilities are invisible. In Margaret’s classes, she announces on day one that students registered with the school’s disability office should feel free to talk to her about not just the accommodations they may legally require but also about other things she can do to make her courses work for them. She shares -- with Elizabeth’s permission -- the struggles Elizabeth has faced in education, and she urges students to see her as someone who really wants to help them succeed.
Work by Nicholas Gelbar, Isaac Smith and Brian Reichow offers faculty members other suggestions for helping students on the spectrum: incorporate universal design into curricula and assignments. As much as possible, use concrete language in both lectures and the syllabus. Break tasks down into more steps, and provide greater organizational support. Realize that group work, public speaking and active classrooms (such popular buzzwords in today’s curricular development) may pose particular challenges for students who struggle socially and who do not thrive in environments demanding rapid transitions. In other words, when dealing with students whose disability makes flexibility extremely difficult, faculty members must be the flexible ones. They must also take responsibility for educating themselves about neurodiversity, and if that seems too hard, they can do one last thing. They can defer to autistic students who do understand their own needs, and they can give those students the support they ask for.
One thing is undeniable: without significant changes, the traditional gateway to greater community inclusion and financial security will remain closed to people with autism. And that’s a tragedy, because those with autism have a lot to offer -- not just to our colleges, but also to our nation’s economy. We all win when everyone can compete and contribute.
Elizabeth Finnegan is a student at Pasadena City College. Margaret Finnegan teaches at California State University at Los Angeles. She is the author of Selling Suffrage: Consumer Culture and Votes for Women (Columbia University Press, 1999), and her work has appeared in College Communication and Composition, American Quarterly and other publications.
Doubleday published Aaron James’s thought-provoking little treatise Assholes: A Theory of Donald Trump in early May, but I have not seen a single reference to the book since the candidate clinched the Republican nomination later that month.
In the meantime, several million pundit-hours of commentary have gone to assessing the presidential horse race, mainly by people who live at the track. James, by contrast, is a professor and chair of philosophy at the University of California, Irvine. His major work of scholarship to date, Fairness in Practice: A Social Contract for a Global Economy (Oxford University Press, 2012), was well received by his peers, though it has been largely overshadowed by his pioneering work in asshole studies.
Let us first define terms. What, then, O Socrates, is an asshole? And how does the asshole differ from someone who is just a jerk?
The distinction is important. “The asshole,” James writes, “is the guy (they are mainly men) who systematically allows himself advantages in social relationships out of an entrenched (and mistaken) sense of entitlement that immunizes him against the complaints of other people.” His sense of entitlement is absolute; his self-aggrandizing behavior is spontaneous and noticeably lacking in inhibition. The asshole may recognize that violating certain norms of acceptable behavior may cause pain or give offense but feels no conflict over that possibility.
The jerk, by contrast, is aware it is normal to apologize or express embarrassment -- and does so, sincerely or not. Someone parking in a handicapped space without the appropriate plates or sticker may be either a jerk or an asshole, but only the jerk will feel the need to come up with, at least, an excuse.
More important, the asshole will, James writes, often “feel indignant when questions about his conduct are raised. That, from his point of view, shows he is not getting the respect he deserves.” Just such an escalation -- from habitual, self-centered indifference toward the feelings of others to rage at even the perception of being slighted -- became familiar as part of Trump’s debating style throughout the Republican primary debates.
It proved effective, and that is the puzzle, which only deepened in the course of the summer. Somehow the candidate’s incessant and tireless asshole behavior (he has been at it for more than a year now, full time; even from this side of the process, it feels like 10) has never seriously damaged his base of support.
H. L. Mencken once defined a demagogue as someone “who preaches doctrines he knows to be untrue to men he knows to be idiots.” Trump has commanded the national stage with greater success than any demagogue since the 1930s, and yet Mencken’s quip is, as James points out, doubly insufficient in characterizing the candidate. For one, Trump is not so much dishonest as completely uninterested in whether or not what he says is true. (See Harry G. Frankfurt's On Bullshit [Princeton University Press, 2005].) Nor are Trump supporters all idiots. For many, James theorizes, “Trump’s value is mainly as a stratagem of asshole management: when stuck with heaps of assholes, turn to an even bigger, better asshole, in hopes of bringing order for public benefit …. In a system where officials routinely thwart the public interest, capitalizing on their position for power and profit, only an asshole so skilled as to school the other assholes properly, and so to awe them into submission, would restore order and peace, for the greater good of everyone.”
The asshole, so elevated and empowered, sounds quite a bit like the sovereign in Leviathan, which is no accident. Assholes: A Theory of Donald Trump offers quick tutorials on Hobbes and Rousseau to suggest that the candidate’s rise makes a certain amount of sense in the context of a republic collapsing under strain.
Support for Trump, by this reading, is the perverse and rather paradoxical effect of 30 years (arguably more) of growing economic inequality and cultural atomization. Whatever communitarian spirit may have once glued the country together, the collage has been coming unstuck for a while now. Sustained growth over the first two or three decades following World War II made it seem at least possible that 21st-century American citizens would take stability, security and opportunity as birthrights. Economic crises would be the stuff of history lectures. The biggest problem would be managing all our free time.
The sense of having gone off course somehow runs deep. Yet we have largely lost any language for framing an alternative. The notion of the general welfare has grown quaint, if not suspect. The individual self is engaged in a zero-sum game with the rest of the world; for anything to count as a good, it must have the potential to generate invidious comparisons. “Each [of us] needing to affirm his or her own value,” says James, “we devolve into a destructive contest for rank and superiority.”
We live, it seems, in an asshole oligarchy. Nobody thinks of Trump as an exception. But he is the one guy saying -- over and over, between the insult tweets and explosive ranting -- that the status quo is bad, folks, you have no idea how bad, trust me. The whole thing must be put into bankruptcy, after which he’ll negotiate a new social contract for us. What have you got to lose?
James is under no illusions about the candidate’s sincerity, competence, self-control or emotional stability. He calls Trump’s campaign rallies “the modern version of executions for public entertainment; it’s the dynamics of crowds and power that, with the help of technology, made the 20th century the bloodiest in human history.” So, not an endorsement. The idea that putting Trump in office represents a “strategy of asshole management … a last-ditch effort at taming a corrupt political system” can be explained rationally. That doesn’t make it a rational idea, though, and patience with the thought experiment will probably decrease as the election draws closer.
Whatever Trump’s candidacy may reveal about the state of the social fabric, he’s torn a few more holes in it already. James quotes a line from Rousseau that arguably sums up the spirit of his book: “The manner in which public affairs are conducted gives a sufficiently accurate indication of the moral character and state of health of the body politic.” The implications of that sentence are almost as horrifying as the thought of Donald Trump with the nuclear launch codes.
The brief summer respite from controversies surrounding free speech on campus ended last week when the University of Chicago sent a letter to incoming students affirming its bedrock commitment to academic freedom, while decrying trigger warnings, “safe spaces” and censorship. The letter went viral, prompting impassioned responses ranging from full-throated endorsements to charges that it reeked of “arrogance, of a sense of entitlement [and] of an exclusionary mindset.”
Get ready for another contentious academic year on the free speech front. A recent Gallup poll concluded that college students support First Amendment rights “in the abstract” but “many are also comfortable shuttering free speech and impeding a free press” in order to restrict “offensive or biased speech.” Taking the measure of campus debates about free expression from this past academic year, the survey results provide additional evidence that real issues are at stake beyond the scorching, end-is-nigh headlines such as “The Death of Free Speech on College Campuses.”
Increasing skepticism about the importance of free expression is turning a significant -- and vocal -- contingent of students into cynics who regard free speech as nothing more than a weapon of the rich, the powerful and the privileged. That trend poses a threat to the development of robust critical thinking skills as well as to the health and vitality of participatory democracy.
For those students who imagine that First Amendment rights are monopolized by the “entitled,” free speech is seen as little more than a license to offend and oppress historically marginalized groups, especially people of color. If this sounds fanciful, you haven’t spent enough time reading through the editorial pages of college newspapers from the past few years. Here is a representative excerpt from a March 2016 op-ed from the Bates College newspaper: "Advocating for unlimited free speech privileges a certain group of people who already have the opportunity for their voices to be heard. It advocates for unlimited acts of violence and aggression towards marginalized people with little to no consequence. For this reason, it is hard for me not to argue for the censorship of what we say, to ensure that marginalized people have a verbal space to inhabit safely in public, as it is obvious that they do not always have safe physical spaces to inhabit in this country."
On the issue of censorship, the author has lots of company. The “Free Expression” Gallup poll reported that more than two out of three students say colleges should be allowed to “establish policies that restrict slurs and other language that is intentionally offensive to certain groups.” Setting aside epithets, more than one in four say colleges should be able to restrict speech “expressing political views that are upsetting or offensive to certain groups.”
Those numbers signal that many students are suspicious of -- or even downright reject -- the premise that the best antidote to offensive speech is always more speech, an idea that has long been a basic tenet of free expression. Among these students, the survey revealed, are men and women, whites and blacks as well as Democrats, Republicans and Independents; so let’s put to rest the charge leveled by some on the right that it’s just those “pesky” women, minorities and bleeding hearts who are calling into question First Amendment rights.
The Gallup survey also investigated the extent to which students feel comfortable articulating their opinions. Fifty-four percent of the students say the “climate on my campus prevents some people from saying things they believe because others might find them offensive.” Senior administrators and faculty members bear some responsibility for this troubling state of affairs. According to a report from the Foundation for Individual Rights in Education (FIRE), more than half of colleges and universities have restrictive speech codes -- that is, “policies prohibiting student and faculty speech that would, outside the bounds of campus, be protected by the First Amendment.”
In addition, more than 100 colleges and universities (and counting) have Bias Response Teams, which are tasked with investigating and responding to complaints about so-called “bias incidents.” At Syracuse University, “name calling,” “avoiding or excluding others” and “making comments on social media about someone’s political affiliations/beliefs” are all potential instances of bias. In principle and practice, Bias Response Teams communicate to students that “no incident is too small to report.”
Regarding faculty members, under the powerful influence of the “linguistic turn,” we scholars in the humanities -- and occasionally those in the social sciences -- have been banging on for decades about the awesome power of language (or discourse, in its formal dress), outlining in exquisite detail the ways in which it may serve to coerce, subjugate and oppress. In this kind of environment where speech is oftentimes regulated and the capacity of words to inflict damage is frequently underscored, it’s no wonder that some students of all backgrounds are in favor of eliminating speech that might insult or offend.
Unless I’m gravely mistaken, the overwhelming majority of students who are afraid to share their ideas, opinions and beliefs are not closeted bigots. Even so, they are understandably reluctant to have frank conversations -- in classrooms and in proverbial late-night bull sessions -- about questions that might veer into controversial territory. Questions like: Is sexual orientation hard-wired or a personal choice? How do you tell the difference between cultural mixture and cultural appropriation? And is the Black Lives Matter movement achieving its objectives?
It was, in fact, an earnest attempt to reckon with the last question that sent Wesleyan University into a tailspin last September when a student named Bryan Stascavage wrote an op-ed in the campus newspaper challenging some of the rhetoric and tactics associated with Black Lives Matter. Judge for yourself, but the Washington Post seemed to get it right when the paper said his analysis was “no more radical than the conservative commentary you might see on mainstream op-ed pages” in national papers. Many Wesleyan students, however, were deeply offended by the piece -- in the midst of a campus uproar about the “frustration, anger, pain and fear that members of the student body felt in response to the op-ed,” stacks of the paper were stuffed into recycling bins, the student government slashed the newspaper budget in half and Stascavage was tarred a “racist.”
The calamity at Wesleyan set the tone for an academic year filled with troubling incidents of campus censorship and threats to free speech, including growing concerns about the “tension between academic freedom and Title IX enforcement,” an activist push to restrict or ban news-media coverage of student protests at several colleges and universities, and numerous attempts -- some successful, some not -- to disinvite “objectionable” speakers from coming to campus. Calls for trigger warnings about “disturbing” classroom content continued to proliferate, to the point where some students are now requesting trigger warnings -- or even alternative assignments -- for readings about the Holocaust.
The academic year came to a fitting end with the ACLU filing a lawsuit against the University of California at San Diego in May to “enforce core First Amendment rules against targeting the press.” Earlier in the year, the student government, aided and abetted by administrators, cut funding for all student media in an attempt to shut down the Koala, a raunchy and irreverent satirical paper in the tradition of the Harvard Lampoon and the Onion. The precipitating event was a column called “UCSD Unveils New Dangerous Space on Campus,” which ridiculed trigger warnings and safe spaces. In numerous complaints submitted to the UCSD Bias Response Team about the paper’s “sexist and racist comments masked under cruel humor,” students called for “an end” to the Koala or, at the very least, a system for “administrative approval of the content.”
Beyond the campus green, you cannot just shut down the presses when confronted by speech that offends you. “In a democracy,” the late philosopher Ronald Dworkin wrote in the wake of the 2005 Danish cartoon controversy, “no one, however powerful or impotent, can have a right not to be insulted or offended.” It’s not unreasonable to expect that a reluctance to engage with “distasteful” or "scary" ideas will render students defenseless when they step into the sometimes rough-and-tumble civic arena after they graduate. On too many campuses, widely held political positions that aren’t “progressive” -- such as being pro-life or against gun control -- are summarily dismissed as intolerable.
“Part of being an American is the obligation to listen to language that makes you uncomfortable,” criminal lawyer and staunch First Amendment champion David Baugh said in a recent interview. “If you’re going to be a citizen, if you’re going to speak freely, you have to be able to tolerate bad ideas.” For Baugh -- who is African American, the son of a Tuskegee fighter pilot -- his declarations on tolerance are not sanctimonious abstractions. Working with the ACLU in the late 1990s, Baugh volunteered to help defend Barry Elton Black, an imperial wizard in the Ku Klux Klan who had been arrested for cross burning, an outcome that Baugh considered a violation of Black’s First Amendment rights. (The Supreme Court, in its 2003 Virginia v. Black decision, agreed.)
Elaborating on how to address “bad ideas” such as racial supremacy, Baugh explained, “In a true free society, every idea has to be discussed if for no other reason than saying that’s a stupid, damn idea, we ought to throw it away.” This is a more colloquial version of the phrase from Yale University’s 1975 Woodward Report that intellectual growth and discovery require the freedom to “think the unthinkable, discuss the unmentionable, and challenge the unchallengeable.” In an educational setting, playing devil’s advocate to consider unpopular or minority positions is an indispensable teaching tool. Beyond the classroom, tolerance for ideas we find misguided or repellent does not mean, as many students appear to believe, that we condone them. It’s possible to condemn ideas, broadcasting our misgivings to the high heavens, without censoring them.
Critical Thinking and Citizenship Rights
If colleges and universities shrink from engaging with materials students find too sensitive, controversial or offensive, the growth of their critical thinking skills will be severely stunted. We already have a tendency to misrepresent ideas that we disagree with. And that’s when we actually expose ourselves to them. Only 16 percent of college students say Americans do a good job at “seeking out and listening to differing viewpoints from their own.” A “just say no” approach to “objectionable” materials will turn us into intellectual sloths. Without the stimulation to interrogate our basic assumptions or to consider alternatives to our preferred explanations, our own ideas will devolve into pathetic caricatures. If you are in favor of affirmative action, for instance, how sophisticated can your position really be if you refuse to engage with the claims and evidence advanced by its critics?
Scholars on the left rightfully challenge our most cherished national ideals such as “freedom,” “equality” and “opportunity,” showing how they have not applied to far too many groups of people based on their race, national origin or gender. (Women, for example, have been eligible to cast votes in less than half of our presidential contests.) Since the election of Barack Obama, the idea that we live in a post-racial society has been subjected to withering criticism from left-leaning academics. Supported by a raft of empirical data, professors like legal scholar Michelle Alexander of The New Jim Crow fame cogently argue that the notion of a post-racial United States is an illusion, a self-congratulatory lie we tell ourselves in order to justify gross social and economic inequalities. As Princeton University professor of religion and African American Studies Eddie Glaude Jr. acidly observes, “We have a black man in the White House and nearly one million black men and women in the Big House.”
The left’s attention to power dynamics and structural inequalities sometimes becomes a fixation. Consider the response of the University of Minnesota’s Council of Graduate Students to a draft faculty statement on free speech released this past March. “People found [the text] offensive,” a spokesperson for the Council of Graduate Students reported. In a letter objecting to the “Four Core Principles” document, the executive committee of the Council called some of the language used “tone-deaf” and “ill-advised,” dismissing as “deplorably patronizing” the proposition that “the most effective response to offensive ideas is to rebut them with better ideas.” It also roundly rejected the principle that “free speech cannot be regulated on the ground that some speakers are thought to have more power or more access to the mediums of speech than others.” Instead, the graduate student group argued, the university should give “special consideration to otherwise marginalized speakers,” a kind of affirmative action for speech that would provide distinct forums for “those who are not well-spoken or who use English as a second language.” (Whatever you think of this idea, it would be a logistical nightmare to implement.)
It would be unpardonably naïve for free speech proponents to ignore the fact that some voices -- the rich, the powerful, the white, the male -- have been amplified, while others have been tuned out or muted. Even in the age of social media where the public square is a click away, we need to be mindful that the speech of some individuals, rightly or wrongly, has more currency in the marketplace of ideas. Nonetheless, this marketplace is teeming and vibrant, energized by a multitude of different points of view. As I write, the book that has been on The New York Times hardcover nonfiction list the longest is Ta-Nehisi Coates’ Between the World and Me, a text that argues, “‘White America’ is a syndicate arrayed to protect its exclusive power to dominate and control [black] bodies.”
It would also be disingenuous for strong advocates of free expression to dismiss the charge that ringing the free speech alarm bell sometimes serves as an excuse to malign student protesters and deflect attention from pressing conversations about racism or other social problems. This “diversion” thesis is not without merit -- just take a few minutes to skim through the contemptuous stories about campus activism on websites such as The College Fix, the Daily Caller or Heat Street.
But free speech, we are obliged to acknowledge, has been at the heart of every single successful movement devoted to expanding citizenship rights and enlarging the charmed circle of “we the people,” from abolitionism and women’s suffrage to marriage equality. So it’s especially ironic that, while some students compare unfettered free speech to lynching, today’s most energetic social movement is devoted to ending violence against black people. Emerging from a hashtag, Black Lives Matter would not be a household name or a substantial political force without First Amendment rights, including the freedom of speech, the press and assembly.
Rue the day that “free speech” starts to appear more regularly in scare quotes. If we encourage the same kind of sneering disdain for free speech that some reserve for ideas like colorblindness, meritocracy and the “American dream,” we will be in deep trouble. Our democracy will be impoverished and so too will our minds. Taking any kind of stand that undercuts free speech is like launching a vendetta on the air we breathe. If it’s successful, we will all suffocate.
Jeffrey Aaron Snyder is an assistant professor in the department of educational studies at Carleton College.
In late spring, I had to endorse a number of legal documents using a digital rendering of my name in a cursive script, chosen from a menu of simulated handwriting styles. It was like my signature, except legible. Beneath its surface, so to speak, was my identifying data -- confirmed by what the company producing the application calls “robust authentication methods.”
Indeed, if it were necessary to prove the authenticity of the “signature” in court, there is no question that the digital glyph could be verified rigorously, whereas a handwriting expert would be hard-pressed to find much uniformity between versions of the scrawl that I make on paper. The visible part of an electronic signature, with its imitation of penmanship, is just a formality. It accommodates our lingering sense that entering a binding legal obligation really ought to include the act of affirming one’s identity by one’s own hand. (With the whole hand, that is, not just the finger used to click “I accept.”) At this stage, the feeling is vestigial. Given another generation or two of kids who grow up knowing how to type before they can ride a bicycle, it could disappear entirely, and the practice of writing by hand could become an antiquarian hobby, like churning your own butter or making horseshoes.
But not necessarily. Anne Trubek covers a great deal of interesting ground in The History and Uncertain Future of Handwriting (Bloomsbury), though much of the comment it has received concerns the "uncertain future" part, in an elegiac mode. I found rather that many of the book's points are made most clearly at the start, when she discusses the earliest known system of writing, Sumerian cuneiform, which emerged around 3000 B.C.E. (Formerly an associate professor of rhetoric, composition and English at Oberlin College, Trubek is the founder of Belt magazine, devoted to urban life in the Rust Belt.)
Encouraged by a curator to pick up some cuneiform tablets for a closer look, Trubek is struck by how compact they are: roughly palm-sized, covered with tiny marks made on wet clay with a stylus. While the writing instruments were simple, mastering them was not: “The tip of the stylus was triangular, and turning it caused different slants and directions for the marks -- some went deeper than others, some went horizontally or vertically, and the bottom of the stylus would make a round mark -- each with a distinct meaning.” To develop competence required years of study at the edubba (the Sumerian word for school: literally, “tablet house”) and also, one assumes, regular offerings to Nabu, the god of scribes, wisdom and literature.
“By 1600 B.C.E.,” writes Trubek, “no Sumerian speakers were alive,” but Sumerian continued to be taught as a classical language, while cuneiform remained in use for another thousand years. For all its difficulty, cuneiform was easier to learn than the Egyptian writing system; it was relatively utilitarian (often used for business contracts and tax records), while hieroglyphics “were as much an art form as they were a means of information storage.” And clay tablets “continued to be used even after most people had shifted to papyrus.”
It is a lesson in the durable habits of the late adopter -- and a reminder that “the uncertain future of handwriting” (or of any other aspect of literacy) will not be decided by the automatic workings of progress and obsolescence.
As she roams across the centuries, Trubek points out two or three factors that have largely set the terms for the development of handwriting, quite apart from the qualities of the tools we use. One, of course, is that only a small fraction of the population has been able to acquire the skill throughout most of history. In addition, people who could read did not always learn to write, as the skill was difficult and slow to master. Finally, people in authority have long tried to impose norms on how words should appear on the page. Charlemagne in the ninth century might not otherwise have much resembled American schoolteachers in the 20th century, but they shared a passion for standardized penmanship.
It’s easy to see how these three difficult realities -- widespread illiteracy, tortuous pedagogy, and the demand for uniformity -- would tend to reinforce one another. Trubek’s story is in part one of rapid changes followed by stubborn inertia. One side effect of Gutenberg’s invention of the printing press was that, by putting scribes out of work, it compelled many of them to take up a new career: they became writing instructors. That can only have encouraged more efficient pedagogy -- and with it a broadening and deepening of the pool of those people able to write, as well as read, more books.
But the drive to standardize and to reinforce social hierarchy seems, if anything, to have intensified: handwriting became an index of status, rank and moral uprightness. A 17th-century writing manual recommended a particular script for women, “for as much as they (having not the patience to take any great paines, besides phantasticiall and humorsome) must be taught that which they may instantly learn.” Immigrants who practiced the Palmer Method -- one of the predominant forms of handwriting instruction available a hundred years ago -- would be more readily assimilated under its “powerful hygienic effect.” Good penmanship -- for a purer America!
The invention of the typewriter was one giant leap for standardization, and Trubek quotes one researcher’s conclusion that “familiarity with the typewriter makes students better penmen, not worse.” But it also provoked worries that teachers were neglecting handwriting instruction. In time, “the universal typewriter may swallow all.”
It didn’t, of course. What actually happened (and this is one of the most striking points in a book full of them) was that another attitude towards handwriting came into focus: a sense of it being like one’s fingerprints, with distinct qualities visible only to the trained eye. Graphologists make the still larger claim that handwriting analysis can reveal aspects of the personality. Because graphology ranks somewhere between dowsing and astrology in scientific rigor, I had assumed its origins were lost in the mists of antiquity. But it turns out the idea took shape no later than the 17th century, with efforts to systematize it really getting underway only in the 19th century, amidst worries about individuality being crushed by the march of progress.
Trubek’s history of handwriting is a story of metamorphosis, not of decline. Given my experience with “signing” digital documents a few months ago, I was interested and amused to learn it was a variation on a theme: A century or so back, one manufacturer “marketed -- unsuccessfully, it seems -- a typewriter whose letter keys were formed from handwriting of the buyer.” If the future of handwriting is uncertain, that’s in part because no one can tell what uses and meanings we may find for it yet.