Humanities

Essay on working 40 hours a week as an academic

It's possible to be a successful academic without working more than 40 hours a week, writes Trish Roberts-Miller.

Essay on technology issues facing students and faculty members

Regular readers of the higher education press have had occasion to learn a great deal about digital developments and online initiatives in higher education. We have heard both about and from those for whom this world is still terra relatively incognita. And, increasingly, we are hearing both about and from those commonly considered to be “digital natives” – the term “native” conveying the idea of their either having been born to the culture in question or being so adapted to it that they might as well have been.

When we think of digital natives, we tend to think of students. But lest we think that things are easy for them, let us bear in mind their problems. Notably, they share the general difficulty of reputation management or what we might consider the adverse consequences of throwing privacy away with both hands when communicating on the internet. More to the point in the world of higher education, many suffer from the unequal distribution of online skills most relevant to academic success – yet another factor in the extreme socioeconomic inequality that afflicts our nation’s system of higher education.

But let us turn our attention to the faculty, and first to those relatively unschooled in new information technologies. At the extreme, there are those who view the whole business with fear and loathing. We must find ways to persuade them that such an attitude is unworthy of anyone who has chosen education as a vocation and that they would do well to investigate this new world with an explorer’s eye – not uncritically, to be sure, given the hype surrounding it – in order to reach informed positions about both the virtues and the limitations of new information technologies.

Others are more receptive, but also rather lost. They are fine with what Jose Bowen calls “teaching naked” (i.e., keeping technology out of the classroom itself), since they have been doing it all their working lives, but are unable to manage the other major part of the program (that is, selecting items to hang in a virtual closet for their students to try on and wear to good effect, so that they come to class well-prepared to make the most of the time together with one another and their instructor). What these faculty members need is the right kind of support: relevant, well-timed, and pedagogically effective – something far less widely available than it should be.

Digitally adept faculty have challenges of their own, some of which are old problems in new forms. There is, for example, the question of how available to be to their students, which has taken on a new dimension in an age in which channels of communication proliferate and constant connectedness is expected.

And then there is the question of how much of themselves faculty members should reveal to students. How much of their non-academic activities or thoughts should they share by not blocking access online or perhaps even by adding students to some groups otherwise composed of friends?

Many of us have worked with students on civic or political projects – though not, one hopes, simply imposing our own views upon them. Many of us have already extended our relationship into more personal areas when students have come to us with problems or crises of one sort or another and we have played the role of caring, older adviser. We have enjoyed relatively casual lunches, dinners, kaffeeklatsches with them that have included discussion of a variety of topics, from tastes in food to anecdotes about beloved pets. The question for digital natives goes beyond these kinds of interaction: To what extent should students be allowed in on the channels and kinds of communications that are regularly – in some cases, relentlessly and obsessively – shared with friends?

Not all of this, to be sure, is under a faculty member’s control. Possibilities for what sociologists call “role segregation” hinge on an ability to keep the audiences for different roles apart from one another – hardly something to be counted on in these digital times. But leaving aside the question of how much online information can be kept from students, how much of it should be kept from them?

Will students be better served, as some faculty members seem to believe, if they see ongoing evidence that their teachers are people with full lives aside from their faculty roles? Should students be recipients of the kinds of texts and tweets that faculty members may be in the habit of sending to friends about movies, shopping, etc.? Given how distracting and boring some of this may be even to friends, one might well wonder. Some students will perhaps get a thrill out of being in a professor’s “loop” on such matters, but do we need to further clutter their lives with trivia? This is an area in which they hardly need additional help.

To put this issue in a wider context: In her 1970 book Culture and Commitment, anthropologist Margaret Mead drew a distinction among three different types of culture: “postfigurative”, in which the young learn from those who have come before; “cofigurative”, in which both adults and children learn a significant amount from their peers; and “prefigurative”, in which adults are in the position of needing to learn much from their children. Not surprisingly, Mead saw us as heading in a clearly prefigurative direction – and that years before the era of parents and grandparents sitting helplessly in front of computer screens waiting for a little child to lead them.

Without adopting Mead’s specific views on these cultural types, we can find her categories an invitation to thinking about the teaching and learning relationship among the generations. For example, should we just happily leap into prefigurativeness? 

Or, to put it in old colonialist terms, should we “go native”? Colonial types saw this as a danger, a giving up of the responsibilities of civilization – not unlike the way the Internet-phobic see embracing the online world. The repentant colonizers who did decide to “go native”, motivated either by escapism or by a profound love and respect for those they lived and worked with, sometimes ended up with views as limited by their adopted culture (what is called “secondary ethnocentrism”) as they had been by their original one. And this is quite apart from the fact that attempts to go native are not always successful and may even seem ridiculous to the natives themselves.

Perhaps it is helpful to think of ourselves first as anthropologists. We certainly need to understand the world in which we ply our trade, not only so that we can do our work, but also because we are generally possessed of intellectual curiosity and have chosen our vocation because we like working in a community. We believe that we have much to learn from the people we study and, at the same time, know that we can see at least some things more clearly because we have the eyes of outsiders.

But we are also missionaries, since we feel we have something of value to share – to share, to be sure, not simply to impose. What might that something be?

In the most basic sense, it is the ability to focus, to pay attention, to take time to learn, and to look back at least as often as we look forward. Most of our students live in a noisy world of ongoing virtual connectedness, relentless activity, and nonstop polytasking (how tired are we of the word “multitasking”?). Like the rest of us, they suffer from the fact that too much information is the equivalent of too little. Like the rest of us, they live in a world in which innovation is not simply admired, but fetishized.

So, even as we avail ourselves of the educational benefits of new information technologies, we might think of complementing this with a Slow Teaching movement, not unlike the Slow Food movement founded by Carlo Petrini in 1986 with the goal of preserving all that was delicious and nutritious in traditional cuisine.  We have such traditions to share with our students even as we become more knowledgeable about the world in which they move.

Our students and junior colleagues don’t need us to be them; they need us to be us. Or, as Oscar Wilde so engagingly put it: Be yourself; everyone else is already taken.

Judith Shapiro is president of the Teagle Foundation and former president of Barnard College.

Essay on using or ignoring teaching innovations

Just because a teaching idea is hot doesn't mean you need to embrace it, writes Rob Weir.

Colleges award tenure

The following individuals have recently been awarded tenure by their colleges and universities:

Bloomsburg University of Pennsylvania

The humanities strengthen the study of science (essay)

“Would you like to see the brain collection?” my guide asked, as we finished our tour of the Yale School of Medicine. What scientist could resist?

I was expecting an impersonal chamber crammed with specimens and devices. Perhaps a brightly lit, crowded, antiseptic room, like the research bays we had just been exploring. Or an old-fashioned version, resembling an untidy apothecary’s shop packed with mysterious jars. 

But when we entered the Cushing Center in the sub-basement of the Medical Library, it was a dim, hushed space that led through a narrow opening into an expansive area for exploration and quiet reflection. As my guide noted, it looked remarkably like a posh jewelry store, with lovely wooden counters, closed cabinets below and glass-enclosed displays above. 

And such displays! Where I had envisioned an imposing, sterile wall of containers, with disembodied brains floating intact in preservative fluid, there was instead a long sinuous shelf of jars just above eye level, winding around the room. Each brain lay in thick slices at the bottom of its square glass container, the original owner’s name and dates on a handwritten label. Muted light glinting off the jars, and lending a slight glow to the sepia-toned fluid within, gave the impression of a vast collection of amber. 

In frames leaning from countertop to wall or resting in a glass-topped enclosure set within the counter were collages of photos and drawings. Surprised, I stepped closer, glimpsed human faces, and found extraordinary science therein.

I had anticipated spectacle: materials displayed in a manner that entertains, yet distances the audience and makes what is viewed seem exotic and alien. Instead, I experienced science in its most human manifestation: specimens arranged to emphasize the reason they were of interest to their original owners, those who had studied them, and those now viewing them.

A typical collage showed photographs of an individual living human being alongside Cushing’s exquisite drawings of the person’s brain, as dissected during surgery or after death. The photographs were posed to show the whole person as a unique individual – and also, in many cases, revealed the presence of the brain tumor they were then living with, through the shape of the skull or as a lump beneath the skin. The drawings revealed the location and anatomical details of the tumor. The very brain that had animated the person and suffered the tumor reposed in its jar nearby.

One could not walk away unmoved.

On the personal level, I was reminded of various individuals I have known whose deaths were caused by brain tumors. The first, decades ago: an admired college mentor. The most recent pair, within the last year: the vivacious wife of one colleague, the young child of another. I remember them as people who enriched others’ lives with their grace and strength of character and I am grateful for the medical advances that gave them extra time to be part of their families and communities.

As a scientist, I was reminded viscerally that this is exactly what we mean when we say all science exists within a human context. Cushing’s work, memorialized so effectively in this small museum, began at a time when neurosurgery was crude and ineffectual, and hope for those with brain tumors was practically nonexistent. By his career’s end, he had introduced diagnostic and surgical techniques that lowered the surgical mortality rate for his patients to an unheard-of ten percent, a rate nearly four times better than others achieved.

The human patients on whom Cushing operated were everything to him, simultaneously providing motivation, subject, object, and methods for his research. In endeavoring to find cures for their conditions, he studied their lives and symptoms, operated on and sketched their tumors, and used what he learned from each case to improve his effectiveness. The purely scientific aspect of his work (advancing the surgical treatment of brain tumors) was inextricably linked with its humanistic aspects (understanding the histories and fates of the individual members of his clinical practice). Indeed, it was his methodical linking of the clinical and human sides of medicine that made his contributions of such lasting significance. Cushing himself stressed that “a physician is obligated to consider more than a diseased organ, more than even the whole man – he must view the man in his world.” 

Seen in this light, the juxtaposition of images inside the museum’s frames carries dual meanings. 

First, the combined images document the course of medical history, forming what the biographer Aaron Cohen-Gadol calls “the diary of neurological surgery in its infancy.” The very format of these still photographs, hand-drawn sketches, and carefully stained glass slides reminds us that Cushing worked in an era before radiological methods for brain imaging and, initially, an era when even still photography was rather cumbersome. Indeed, his own artistic talent and training were crucial for accurately recording the outcomes of his surgeries. The contents of the images capture the conditions of patients when they came to see Cushing, the treatment, and the aftermath. Collectively, they show how neurology and neurosurgery were practiced in Cushing’s day and how these fields evolved year by year throughout his career.

Second, the combined images directly influenced the course of medical history. Cushing deliberately correlated, through the information in the photographs, anatomical sketches, and medical records, the external indicators of otherwise hidden medical problems within the skull. This led to improvements not only in how neurosurgeons operated but also in how readily other doctors could recognize early external indications of brain tumors and send patients for prompt treatment. As Cushing’s biographers note, “Each patient is of historical significance now because our discipline of neurological surgery evolved through his or her care.” Moreover, because he trained a generation of neurosurgeons in these methods, Cushing helped ensure the continuing development of the field; a number of these junior colleagues, in turn, were instrumental in the creation of the museum that now makes the images publicly visible.

The juxtaposition of Cushing’s images therefore represents the very essence of how the humanities and sciences are intertwined: achieving his medical breakthroughs depended directly on his active depiction and analysis of human experience. 

As an educator, I find that the displays in the Cushing Center encapsulate why young scientists need to study their fields in historical and social context. Isolated technical proficiency would not have enabled Cushing to become the originator of modern neurosurgery; his intense focus on the human condition was essential. Indeed, Cushing mused in a letter to a fellow physician that he “would like to see the day when somebody would be appointed surgeon somewhere who had no hands, for the operative part is the least part of the work.” Similarly, to fully prepare for careers in science, it is essential that students grasp how the impetus for scientific work arises from the world in which the scientist lives, often responds to problems the scientist has personally encountered, and ultimately impacts that society and those problems in its turn. 

A very few scientists may be largely self-taught and spend their entire careers working on abstract problems in isolated research institutes without ever teaching a course, writing a grant or giving a public lecture. Even they, however, are influenced in their selection of research problems by the results that other individuals have previously obtained. And even they must communicate their results to other people in order to impact their field. Most of us interact far more directly with other people in our scientific endeavors:  they inspire our choices of major or thesis topic, pay taxes that support grants for our facilities and students, run companies that underwrite our applied investigations, propose legislation that regulates how we share data and maintain lab safety.

Some might argue that these considerations apply mainly to the life sciences, where the human connections are most tangible. They might think, for instance, that my own work as a theoretical physicist is too abstract to be influenced by societal context. After all, the field-theoretic equations I manipulate have no more race or gender or politics than the subatomic particles they describe. Yet my choice of research questions has unquestionably been affected by the contingent historical details of my own professional life: the compelling lectures that enticed me to switch fields during graduate school, the inspiring discussions with my doctoral adviser that established symmetry as a guiding principle, the discovery of certain subatomic particles at the start of my career and the decades-delayed confirmations of others. My sense of how science operates on both philosophical and practical levels has also unmistakably been influenced by my long-ago experiences as a graduate teaching assistant for History of Science courses and my ongoing conversations with scholars in Science Studies.

This is why programs that deliberately train scientists in the humanities are so essential to educating scientists effectively.  Every nascent scientist should read, think, and write about how science and society have impacted one another across cultural and temporal contexts. Not all undergraduates will immediately appreciate the value of this approach. The first-year students in my own college have been known to express confusion about why they must take that first course in the history, philosophy, and sociology of science. But decades later, our alumni cite the “HPS” curriculum as having had a profound impact on their careers in science or medicine. They remember the faculty members who taught those courses vividly and by name. They tell me the ethical concepts absorbed in those courses have helped them hew more closely to the scientific ideal of seeking the truth.

In the wake of C.P. Snow’s famous Rede Lecture on the Two Cultures of the sciences and humanities, academic programs were founded in the late 1960s and early 1970s (e.g., Michigan State University’s Lyman Briggs College and Stanford University’s Science, Technology and Society program) with the express aim of immersing students in the deep connections between science and society. Decades later, those programs are thriving – and the impact of the ideas they espouse may be seen in changes that pre-professional programs in medicine and engineering have been embracing.

For example, the newest version of the Medical College Admission Test (MCAT2015) incorporates questions on the psychological, social, and biological determinants of behavior to ensure that admitted medical students are prepared to study the sociocultural and behavioral aspects of health. Similarly, in 2000, ABET modified its accreditation criteria to emphasize communication, teamwork, ethical and professional issues, and the societal and global context of engineering decisions. An evaluation in 2002 found a measurable positive impact on what students learned and their preparation to enter the workforce.

While pre-medical and engineering students are being required to learn about issues linking science and culture, most students in science fields are still not pushed to learn about the human context of their major disciplines. We faculty in the natural sciences have the power to change this. Many of us already incorporate “real world” applications of key topics in our class sessions or assignments; introductory textbooks often do likewise. But we can extend this principle beyond the classroom into the world of intellectual discourse and practice. As colloquium chairs and science club mentors, we can arrange regular departmental talks on topics that stress the interdependence of science and society: STEM education, alternative energy, medical technology, gender and science. 

As academic advisers we can nudge science students towards humanities courses that analyze scientific practice or towards summer internships with companies and NGOs as well as traditional REU programs. As directors of undergraduate or graduate studies, we can highlight science studies topics, interdisciplinary organizations, and non-academic career paths on the department website. Making these connections part of the life of the department can better prepare our students for their futures as capable scientists responsible to and living within society.

In the end, Cushing’s brain collection vividly reminds us why it is crucial to immerse natural science students in interdisciplinary science studies that incorporate the social sciences and humanities. It is not merely because hot new fields are said to lie at the unexplored intersections of fields whose borders were arbitrarily codified decades or centuries ago (though that is true).  It is not merely because the terms interdisciplinary, cross-disciplinary, and trans-disciplinary are presently in vogue (though that is also true).  It is because such cross-training produces scientists who are both more capable of extraordinary breakthroughs and more mindful of their broader impacts. The humanities truly strengthen science.

Elizabeth H. Simmons is dean of Lyman Briggs College, acting dean of the College of Arts and Letters, and University Distinguished Professor of Physics at Michigan State University.

How technology can help save the liberal arts (essay)

A rash of articles proclaiming the death of the humanities has dominated the higher education press for the past couple of years. Whether it’s The New York Times, The New Republic or The Atlantic, the core narrative seems to be that liberal arts education will be disrupted by technology -- it’s just a question of time, and resistance is futile. But I am convinced that the “death of the humanities at the hands of technology” is not only wildly exaggerated; it’s directionally wrong.

This month on Inside Higher Ed, William Major wrote an essay, “Close the Business Schools/Save the Humanities.” I loved it for its provocative frame, and because I’m a strong proponent of the humanities. But it positioned business and the humanities as an either/or proposition, and it doesn’t have to be so.

If John Adams were alive today, he might revise his famous quote:

I [will start with the] study [of] politics and war... then mathematics and philosophy... [then] natural history and naval architecture, navigation, commerce and agriculture [in order to give myself a right] to study painting, poetry, music.

What would take generations in Adams’s day can be done in a single lifetime today because of technology.

Full disclosure: I was Clay Christensen’s research assistant at Harvard Business School, and am now CEO of a Silicon Valley-based technology company that sells a Learning Relationship Management product to schools and companies.

Perhaps the above might be considered three strikes against me in a debate on the humanities -- perhaps I’m already out in the minds of many readers, but I hope not. Please hear me out.

I think that technology will actually enhance liberal arts education, and eventually lead to a renaissance in the humanities, from literature to philosophy, music, history, and rhetoric. Not only will technology improve the learning experience, it will dramatically increase the number of students engaging in liberal education by broadening consumption of the humanities from school-age students alone to a global market of 7 billion people.

It might be overstating the case to say that this will happen, but it can happen if those of us who care about the humanities act to make it so. To do so, we need to accept one hard fact and make two important strategic moves.

The hard fact is that despite its importance, economic value is the wrong way to think about the liberal arts -- and the sooner we accept that reality, the sooner we can stop arguing for the humanities from a position of weakness and instead move on with a good strategy to save them.

Of course, it should be noted that there is certainly considerable economic value in attending elite and selective colleges, from Colgate to Whittier to Morehouse. The currency of that economic value is the network of alumni, the talent signal that admission to and graduation from such institutions confer, and the friendships formed over years of close association with bright and motivated people. But the economic value accrues regardless of what the people study, whether it is humanities or engineering or business.  

Moreover, the effort to tie the humanities to economic outcomes cheapens the non-economic value of the humanities. Embracing their perceived lack of economic value allows us to be affirmative about the two things that technology can do to save them: (1) supplementing the liberal arts with career-focused education and (2) defining the non-economic value of the liberal arts so that we can extend their delivery to those who make more vocational choices for college.

Supplementing the liberal arts with career-focused education such as a fifth-year practical master’s degree, micro-credentials, minors and applied experience is critical to their survival. It doesn’t matter whether the supplements are home-grown or built in partnership with companies like Koru or approaches like Udacity’s Nanodegrees.  What matters is that your students see a way both to study what they love and to build a competitive advantage to pursue a meaningful career.  

The right technology can be a major part of conferring that advantage by helping students to figure out their long-term career ambitions, connect with mentors in industry, consume career-oriented content, earn credentials, and do economically valuable work to prove their abilities.

But the true promise of technology to save the liberal arts is precisely its ability to lower the cost of delivery -- and in so doing to allow everyone on earth to partake in a liberal education throughout their lifetime. Students shouldn’t have to choose between philosophy and engineering, music and business, rhetoric and marketing.  And by lowering the costs, you enable increased consumption -- that is the very nature of disruptive innovations.

Given that my education in economics and business leaves me woefully inadequate to the task of defining the non-economic value of the liberal arts, I’ll leave that task to Robert F. Kennedy instead, who said:

“[Economic value] does not allow for the health of our children...or the joy of their play. It does not include the beauty of our poetry or the strength of our marriages; the intelligence of our public debate or the integrity of our public officials. It measures neither our wit nor our courage; neither our wisdom nor our learning; neither our compassion nor our devotion to our country; it measures everything, in short, except that which makes life worthwhile.”

It is for those things that do make life worthwhile that the liberal arts must be saved.

Gunnar Counselman is the founder and CEO of Fidelis.

Essay criticizing U. of Illinois for blocking a controversial faculty hire

The dismissal of Steven Salaita by the University of Illinois at Urbana-Champaign, just days before he was scheduled to start teaching classes, is a serious threat to academic freedom because it was based solely upon Salaita’s extramural utterances on Twitter about Israel.

One thing should be clear: Salaita was fired. I’ve been turned down for jobs before, and it never included receiving a job offer, accepting that offer, moving halfway across the country, and being scheduled to teach classes.

This is not the first time the University of Illinois has fired a professor for his extramural utterances. In 1960, the university fired an assistant professor of biology, Leo Koch, because he wrote a letter to the student newspaper in which he denounced “a Christian code of ethics which was already decrepit in the days of Queen Victoria,” attacked “the widespread crusades against obscenity,” and urged the university to condone sex among mature students.

The AAUP was unified in opposing the lack of due process in Koch’s firing, and censured the University of Illinois. But the AAUP in 1960 was deeply divided about whether extramural utterances should receive the full protection to which all citizens are entitled, or whether they must meet the standards of “academic responsibility.” Eventually, the AAUP reached a strong consensus: the 1964 Committee A Statement on Extramural Utterances declared that “a faculty member’s expression of opinion as a citizen cannot constitute grounds for dismissal unless it clearly demonstrates the faculty member’s unfitness to serve.”

This is an extremely high standard, and the arguments against Salaita don’t come anywhere close to meeting it. The best that Salaita’s critics can come up with is the belief that Salaita’s pro-Israel students might feel uncomfortable (by that standard, no professor could ever take a public stand on anything), or that criticizing a foreign government makes you guilty of hate speech (which is a slogan, not a category of prohibited speech), or proves you are uncivil (whatever that means), or that swearing on Twitter means you are evil (remember those “crusades against obscenity”).

Now the University of Illinois and Cary Nelson, a longtime faculty member there, a past AAUP president, and now a critic of Salaita, are marking the 50th anniversary of that important statement by trying to take academic freedom backward to a half-century ago, when extramural utterances that offend the public could justify the firing of a professor.

Academic freedom means that scholars are hired, promoted, and fired based upon purely academic criteria, and not for their political opinions. There are not different kinds of academic freedom for hiring and for tenure. Nelson, by adding that consideration of extramural comments is legitimate before a hire, is attempting to draw a line in academic freedom that doesn’t exist, between hiring and promotion decisions.

Of course, the requirements are different for tenure denials. Every professor being fired deserves due process and a full explanation for the dismissal, and that’s not possible for the hundreds of applicants rejected for every academic job.

But the standards of academic freedom do not change, only the thoroughness of the procedures. If a university president decreed that no socialists could be hired for faculty positions, would Nelson (or anyone else) accept this principle if it only affected hiring decisions? Clearly, academic freedom does not change for hiring decisions; it is simply harder to identify violations that occur in the hiring process.

But the violations of academic freedom in Salaita’s firing are easy to see because he was already hired. The arguments used to justify Salaita’s dismissal do not withstand scrutiny. According to Nelson, “Salaita’s extremist and uncivil views stand alone.” The fact that a professor is deemed more “extreme” than the rest is no basis for dismissal. If it were acceptable for universities to fire the professor with the most “extreme” views on a particular topic, then dozens of faculty members could be purged for their political views each year on every college campus.

The AAUP has never endorsed the firing of faculty members on grounds of “incivility.” The only AAUP statement I can find that mentions civility is “On Freedom of Expression and Campus Speech Codes” (1992), which declares, “On a campus that is free and open, no idea can be banned or forbidden.”

Academic decisions, including job offers, must be based upon academic criteria, and not a judgment about an individual’s tweeting decorum. It is also essential that academic decisions are made by qualified academics. Even if civility were a valid consideration in hiring (and it isn’t), the people who would have to consider it are the faculty experts examining the full academic record and qualifications of a professor, not an administrator who chooses to fire a professor based solely upon public disapproval of extramural utterances. It is noteworthy that Nelson expressed support for Salaita’s firing based entirely upon tweets, and without any consideration of Salaita’s entire record of teaching, research, or service.

All the evidence indicates that the firing of Steven Salaita was purely a political decision, not an academic one, and it violates every principle of academic freedom.

John K. Wilson is the co-editor of AcademeBlog.org, editor of Illinois Academe (ilaaup.org), and the author of seven books, including Patriotic Correctness: Academic Freedom and Its Enemies.

Essay defends University of Illinois decision not to hire Steven Salaita

This month, my campus, the University of Illinois at Urbana-Champaign, was widely expected to welcome Steven Salaita as a new faculty member. He was to be a tenured professor in the American Indian studies program. But a decision not to present the appointment to the Board of Trustees was made by the chancellor. Although I was not involved in the process and did not communicate my views to the administration, I want to say why I believe the decision not to offer him a job was the right one.

Salaita has written credibly on fiction by Arab Americans and is, so I am told, knowledgeable about Native American studies. But Salaita’s national profile — and the basis of his aspirations to being a public intellectual — is entirely based on his polemical interventions in debates over the Arab/Israeli conflict. Those interventions include his 2011 book Israel’s Dead Soul, which I read last year, and his widely quoted and prolific tweeting. Israel’s Dead Soul is published by Temple University Press, so it is part of his academic profile. His tweets cover precisely the same territory. This more public side of his persona would be widely available to his students; indeed his tweets would be better-known to students than his scholarly publications. His inflammatory tweets are already being widely read. I have been following his tweets for some months because I have been writing about the Israeli/Palestinian conflict and co-editing a collection of essays titled The Case Against Academic Boycotts of Israel. I try to follow the work of all prominent pro-boycott leaders, Salaita among them.

Although I find many of his tweets quite loathsome — as well as sophomoric and irresponsible — I would defend without qualification his right to issue most of them. Academic freedom protects him from university reprisals for his extramural speech, unless he appears to be inciting violence, which one retweeted remark, suggesting that a story by a well-known American reporter “should have ended at the pointy end of a shiv,” appears to do. His June 19 response to the kidnapping of three Israeli teenagers — “You may be too refined to say it, but I'm not: I wish all the fucking West Bank settlers would go missing” — also invokes a violent response to the occupation, since “go missing” refers to kidnapping.

But his right to make most of these statements does not mean I would choose to have him as a colleague. His tweets are the sordid underbelly, the more frank and revealing counterpart, to his more extended arguments about Middle Eastern history and the Israeli/Palestinian conflict. They are likely to shape his role on campus when 2015’s Israeli Apartheid Week rolls around. I am told he can be quite charismatic in person, so he may deploy his tweeting rhetoric at public events on campus. Faculty members are well within their rights to evaluate someone as a potential colleague and to consider what contributions a candidate might make to the campus community. It is the whole Salaita package that defines in the end the desirability and appropriateness of offering him a faculty appointment.

I should add that this is not an issue of academic freedom. If Salaita were a faculty member here and he were being sanctioned for his public statements, it would be. But a campus and its faculty members have the right to consider whether, for example, a job candidate’s publications, statements to the press, social media presence, public lectures, teaching profile, and so forth suggest he or she will make a positive contribution to the department, student life, and the community as a whole. Here at Illinois, even the department head who would have appointed Salaita agreed in Inside Higher Ed that “any public statement that someone makes is fair game for consideration.” Had Salaita already signed a contract, then of course he would have to have received full due process, including a full hearing, before his prospective offer could be withdrawn. But my understanding is that he had not received a contract.

Salaita condenses boycott-divestment-sanctions wisdom into a continuing series of sophomoric, bombastic, or anti-Semitic tweets: “UCSCdivest passes. Mark Yudoff nervously twirls his two remaining hairs, puts in an angry call to Janet Napolitano” (May 28, 2014); “10,000 students at USF call for divestment. The university dismisses it out of hand. That’s Israel-style democracy” (May 28, 2014); “Somebody just told me F.W. DeKlerk doesn’t believe Israel is an apartheid state. This is what Zionists have been reduced to” (May 28, 2014); “All of Israel’s hand-wringing about demography leads one to only one reasonable conclusion: Zionists are ineffective lovers” (May 26, 2014); “Universities are filled with faculty and admins whose primary focus is policing criticism of Israel that exceeds their stringent preferences” (May 25, 2014); “‘Israel army’ and ‘moral code’ go together like polar bears and rainforests” (May 25, 2014); “Keep BDS going! The more time Israel spends on it, the fewer resources it can devote to pillaging and plundering” (May 23, 2014); “So, how long will it be before the Israeli government starts dropping white phosphorous on American college campuses?” (May 23, 2014); “Even the most tepid overture to Palestinian humanity can result in Zionist histrionics” (May 21, 2014); “All life is sacred. Unless you’re a Zionist, for whom most life is a mere inconvenience to ethnographic supremacy” (May 20, 2014); “I fully expect the Israeli soldiers who murdered two teens in cold blood to receive a commendation or promotion” (May 20, 2014); “Understand that whenever a Zionist frets about Palestinian violence, it is a projection of his own brute psyche” (May 20, 2014); “I don’t want to hear another damn word about ‘nonviolence.’ Save it for Israel’s child-killing soldiers” (May 19, 2014); “I stopped listening at ‘dialogue’ ” (May 27, 2014). The last example here presumably advises BDS students how interested they should be in conversations with people holding different views.

More recently he has said “if Netanyahu appeared on TV with a necklace made from the teeth of Palestinian children, would anyone be surprised” (July 19, 2014) and “By eagerly conflating Jewishness and Israel, Zionists are partly responsible when people say anti-Semitic shit in response to Israeli terror” (July 18, 2014). The following day he offered a definition: “Zionists: transforming ‘anti-Semitism’ from something horrible into something honorable since 1948” (July 19).

It is remarkable that a senior faculty member chooses to present himself in public this way. Meanwhile, the mix of deadly seriousness, vehemence, and low comedy in this appeal to students is genuinely unsettling. Will Jewish students in his classes feel comfortable after they read “Let’s cut to the chase: If you’re defending Israel right now you’re an awful human being” (July 8), “Zionist uplift in America: every little Jewish boy and girl can grow up to be the leader of a murderous colonial regime” (July 14), or “No wonder Israel prefers killing Palestinians from the sky. It turns out American college kids aren’t very good at ground combat?” (July 23)? The last of these tweets obviously disparages the two young American volunteers who lost their lives fighting with the Israeli Defense Forces in Gaza. What would he say if the Arab/Israeli conflict were to come up in a class he was teaching on Arab-American fiction? Would he welcome dissent from his views? Would students believe him if he appeared to do so? As Salaita says of his opposition, in an accusation better applied to himself, he has found in Twitter “the perfect medium” in which to “dispense slogans in order to validate collective self-righteousness” (May 14, 2014).

While universities need to study all positions on an issue, even the most outrageous ones, I see no good reason to offer a permanent faculty position to someone whose discourse crosses the line into anti-Semitism. I also do not believe this was a political decision. There are many opponents of Israeli policy on the faculty here and many faculty as well who publicly or privately support the boycott movement. If some faculty expressed their view to the chancellor that Salaita’s recent tweets — tweets published long after the search committee made its recommendation — justify not making the appointment, they had a right to do so. I believe this was an academic, not a political, decision.

Were I to have evidence to the contrary, my view would be different. I regret that the decision was not made until the summer, but then many of the most disturbing of Salaita’s tweets did not go online until the summer of 2014, no doubt provoked by events. That is the time frame in which the statements in question were made. That alone made this an exceptional case. I do not think it would have been responsible for the university to have ignored the evolving character of his public profile. For all these reasons I agree that Salaita’s appointment is one that should not have been made.

Cary Nelson served as national president of the American Association of University Professors from 2006 to 2012. He teaches at the University of Illinois at Urbana-Champaign.

Adjunct leaders talk about long-term strategies

Adjunct leaders -- even as they push votes on collective bargaining -- are talking about how to maintain engagement with the rank-and-file well into the future.

Essay on the most important advice for a tenure-track professor

When it comes to earning tenure, you need to forget all the fights about who is responsible for higher education's problems and focus on one simple piece of advice, writes Chuck Rybak.
