Nearly 700 faculty members at the University of Wisconsin at Madison have signed a letter to the editor of The Milwaukee Journal Sentinel opposing a proposed state law that would bar the use of fetal tissue in research in the state. Wisconsin is one of a number of states where anti-abortion politicians have responded to the recent videos of Planned Parenthood officials by pushing new measures to limit the sale or use of fetal tissue. Many university researchers, however, use fetal tissue.
"We wonder whether legislators have considered the ethical implications of denying current and future patients the benefits of the research that would be blocked by this legislation," the letter says. "The cell lines derived from fetal tissue are commonly used for research in laboratories worldwide. Other tissues and cells, such as those derived from miscarriages, cannot be substituted for this research, despite the claims of the proponents of this ban."
A regional National Labor Relations Board office said Wednesday that adjuncts at Manhattan College may count their union election votes. The ballots have been impounded since 2011, when the Roman Catholic college objected to NLRB jurisdiction over its campus, citing its religious affiliation. The case was pending before the NLRB in Washington until earlier this year, when the board sent the Manhattan adjunct union case and a handful of others involving would-be adjunct unions at religious colleges back to their regional NLRB offices for re-evaluation based on the recent Pacific Lutheran University decision. In that case, the NLRB said that adjuncts who wanted to form a Service Employees International Union-affiliated collective bargaining unit could do so, because their service to the institution was not sufficiently religious in nature to conflict with the National Labor Relations Act, which gives workers the right to organize.
The Pacific Lutheran decision included criteria by which other adjunct union bids at religious colleges were to be assessed. In her decision regarding Manhattan, Karen P. Fernbach, director of the NLRB’s regional office in New York, said the college “failed to establish that it holds out the petitioned-for adjunct faculty members as performing a specific role in maintaining” its religious educational environment. For example, she said, the college's faculty application materials say there is “no intention on the part of the [governing] board, the administration or the faculty to impose church affiliation and religious observance as a condition for hiring or admission, to set quotas based on religious affiliation, to require loyalty oaths, attendance at religious services, or courses in Catholic theology."
The proposed Manhattan adjunct union is affiliated with New York State United Teachers, which is in turn affiliated with the American Federation of Teachers and the National Education Association. Paul E. Dinter, a visiting professor of religious studies, said that as "an educator, a Catholic and a social justice advocate, I have to be pleased that the NLRB decision supports the clear Catholic moral teaching that workers have a right to organize. All of us who love Manhattan College and its social justice mission are heartened by this fair and long-delayed decision.”
In a statement, Brennan O'Donnell, Manhattan's president, said, “We are disappointed, but not surprised, by the ruling. We continue to assert our position that the NLRB does not have the right to define what constitutes the Catholic identity and mission of the college.” Manhattan has the option to appeal the ruling. The college said in a statement that it's considering how it will respond.
George Orwell opened one of his broadcasts on the BBC in the early 1940s by recounting how he’d learned history in his school days. The past, as his teachers depicted it, was “a sort of long scroll with thick black lines ruled across it at intervals,” he said. “Each of these lines marked the end of what was called a ‘period,’ and you were given to understand that what came afterwards was completely different from what had gone before.”
The thick black lines were like borders between countries that didn’t know one another’s languages. “For instance,” he explained, “in 1499 you were still in the Middle Ages, with knights in plate armour riding at one another with long lances, and then suddenly the clock struck 1500, and you were in something called the Renaissance, and everyone wore ruffs and doublets and was busy robbing treasure ships on the Spanish Main. There was another very thick black line drawn at the year 1700. After that it was the Eighteenth Century, and people suddenly stopped being Cavaliers and Roundheads and became extraordinarily elegant gentlemen in knee breeches and three-cornered hats … The whole of history was like that in my mind -- a series of completely different periods changing abruptly at the end of a century, or at any rate at some sharply defined date.”
His complaint was that chopping up history and simplistically labeling the pieces was a terrible way to teach the subject. It is a sentiment one can share now only up to a point. Orwell had been an average student; today it would be the mark of a successful American school district if the average student knew that the Renaissance came after the Middle Ages, much less that it started around 1500. (A thick black line separates his day and ours, drawn at 1950, when television sales started to boom.)
Besides, the disadvantages of possessing a schematic or clichéd notion of history are small by contrast to the pleasure that may come later, from learning that the past was richer (and the borders between periods more porous) than the scroll made it appear.
Must We Divide History Into Periods? asked Jacques Le Goff in the title of his last book, published in France shortly before his death in April 2014 and translated by M. B. DeBevoise for the European Perspectives series from Columbia University Press. A director of studies at L'École des Hautes Études en Sciences Sociales in Paris, Le Goff was a prolific and influential historian with a particular interest in medieval European cities. He belonged to the Annales school of historians, which focused on social, economic and political developments over very long spans of time -- although his work also exhibits a close interest in medieval art, literature and philosophy (where changes were slow by modern standards, but faster than those in, say, agricultural technique).
Le Goff’s final book revisits ideas from his earlier work, but in a manner of relaxed erudition clearly meant to address people whose sense of the past is roughly that of young Orwell. And in fact it is that heavy mark on the scroll at the year 1500 -- the break between the Middle Ages and the Renaissance -- that Le Goff especially wants the student to reconsider. (I say “student” rather than “reader” because time with the book feels like sitting in a lecture hall with a memorably gifted teacher.)
He quotes one recent historian who draws the line a little earlier, with the voyage of Christopher Columbus in 1492: “The Middle Ages ended, the modern age dawned, and the world became suddenly larger.” Le Goff is not interested in the date but in the stark contrast that is always implied. Usually the Middle Ages are figured as “a period of obscurity whose outstanding characteristic was ignorance” -- happily dispelled by a new appreciation for ancient, secular literature and a sudden flourishing of artistic genius.
Calling something “medieval” is never a compliment; the image that comes to mind is probably that of a witch trial. By contrast, “Renaissance” would more typically evoke a page from Leonardo da Vinci’s notebooks. Such invidious comparison is not hard to challenge. Witch trials were rare in the Middle Ages, while the Malleus Maleficarum appeared in “the late fifteenth century,” Le Goff notes, “when the Renaissance was already well underway, according to its advocates.”
Given his expertise on the medieval city -- with its unique institutional innovation, the university -- Le Goff makes quick work of demolishing the notion of the Middle Ages having a perpetually bovine and stagnant cultural life. The status of the artist as someone “inspired by the desire to create something beautiful” who had “devoted his life to doing just this” in pursuit of “something more than a trade, nearer to a destiny,” is recognized by the 13th century. And a passage from John of Salisbury describes the upheaval underway in the 12th century, under the influence of Aristotle:
“Novelty was introduced everywhere, with innovations in grammar, changes in dialectic, rhetoric declared irrelevant and the rules of previous teachers expelled from the very sanctuary of philosophy to make way for the promulgation of new systems …”
I can’t say that the name meant anything to me before now, but the entry on John of Salisbury in the Stanford Encyclopedia of Philosophy makes it sound as if Metalogicon (the work just quoted) was the original higher ed polemic. It was “ostensibly written as a defense of the study of logic, grammar and rhetoric against the charges of the pseudonymous Cornificius and his followers. There was probably not a single person named Cornificius; more likely John was personifying common arguments against the value of a liberal education. The Cornificians believe that training is not relevant because rhetorical ability and intellectual acumen are natural gifts, and the study of language and logic does not help one to understand the world. These people want to seem rather than to be wise. Above all, they want to parlay the appearance of wisdom into lucrative careers. John puts up a vigorous defense of the need for a solid training in the liberal arts in order to actually become wise and succeed in the ‘real world.’”
That's something an Italian humanist might write four hundred years later to champion “the new learning” of the day. And that is no coincidence. Le Goff contends that “a number of renaissances, more or less extensive, more or less triumphant,” took place throughout the medieval era -- an elaboration of the argument by the American historian Charles H. Haskins in The Renaissance of the Twelfth Century (1927), a book that influenced scholars without, as Le Goff notes, having much effect on the larger public. The Renaissance, in short, was a renaissance -- one of many -- and in Le Goff’s judgment “the last subperiod of a long Middle Ages.”
So, no bold, clean stroke of the black Magic Marker; more of a watercolor smear, with more than one color in the mix. Le Goff treats the Middle Ages as having a degree of objective reality, insofar as certain social, economic, religious and political arrangements emerged and developed in Europe over roughly a thousand years.
At the same time, he reminds us that the practice of breaking up history into periods has its own history -- deriving, in its European varieties, from Judeo-Christian ideas, and laden with ideas of decline or regeneration. “Not only is the image of a historical period liable to vary over time,” he writes, “it always represents a judgment of value with regard to sequences of events that are grouped together in one way rather than another.”
I'm not entirely clear how, or whether, he reconciled the claim to assess periodization on strictly social-scientific grounds with its status as a concept rooted in the religious imagination, but it's a good book that leaves the reader with interesting questions.
When I moved into administration after being a professor, a colleague who had made the same move years before told me to brace for the loss of my faculty friends.
Impossible, I argued -- we attended regular Friday cocktail hours, had fought and won battles across campus, supported each other across the thorny paths leading to tenure and promotion. We’d been through it all, and those are precisely the kinds of experiences that make for lasting relationships.
I was wrong. My colleague was right.
About this time in my career, I began noticing for the first time the term “incivility” in higher ed news. Perhaps I noticed it because for the first time, it rang true. Where once I had been respected as a caring teacher and a hardworking colleague, I was now viewed with suspicion.
Now perceived as someone out for personal glory and set on bungling things for everyone else, I began finding it difficult to interact with my department (where I still taught one course a semester). After my move to the administration building, returning to my home department was like returning to the house of ex-in-laws after a bad divorce -- everyone froze, smiled stiffly and waited for me to leave. This office had been my home for over 15 years.
Inside Higher Ed recently reprinted a letter to the Financial Times advice column “Dear Lucy,” in which a disgruntled faculty member inquired, “Should I plot the downfall of our dean?” The offense that inspired this angry faculty member to ask the question was this: while traveling to Asia with a special delegation from his university’s business school, he had been forced to sit in coach. Directly in his line of sight was his dean, seated in the much more comfortable business class. For this, the dean must be destroyed.
While not all of us in the academy engage in this kind of career terrorism, we have all at least witnessed or been privy to such schemes and dark plots. I have myself experienced deep frustration as a faculty member, feeling underpaid, overworked and underappreciated. Certainly I would not have been pleased to be dragging an administrator along to a professional meeting, either. And I would have especially been dreading the stiff small talk at baggage claim, or the forced chat at the evening’s cocktail hour.
Also quite familiar was the willingness on the part of a faculty member to assign the worst motives to that seat placement and to wish on the dean the disaster of losing his job. The language “plot the downfall” contains a kind of professional violence too often at play in relationships between faculty and administrators. It’s an aspect of academic life of which we should not be proud.
Over lunch I once asked a professor friend whom I admired a great deal -- with impeccable scholarship, this superb teacher made one of the highest faculty salaries and enjoyed research release time and summers away -- if s/he really thought “all administrators were bad people.” The answer came back an unblinking, “Yes.”
I was an administrator at the time.
“Administrator” does not signify “human being” in scenarios like the one I just described. In such cases of deep suspicion administrators are assumed to be self-centered mismanagers or, worse, bloodless careerists. And certainly not committed to the institutions they serve. This is precisely the objectifying of others that we teach our students to recognize and reject. It is a powerful tool for excusing or justifying hostility toward an identity position. And it was certainly at play inside that airplane.
As an administrator I am precisely the same person I was when I was on the faculty. Same strengths, same weaknesses, same commitments to my family, identical professional goals. And to be fair, when I made the move to administration, I did not lose every friend, and my next administrative post included some very nice relationships with faculty colleagues.
But there is no question that once I became an administrator, simply by virtue of being an administrator, I fell under the suspicion of many. Was I seeking power? Would I continue to value teaching? Had I lost my mind? Was it mere greed driving my decision?
The truth is, on most campuses, there are not pots of money squirreled away under deans’ desks; we don’t enjoy giant travel budgets or outsize benefits packages. And we continue to possess whatever powers of ratiocination we enjoyed before.
What’s different: we carry responsibilities across a college or unit that force difficult decisions that are quite visible and affect many people, and that will often result in deep disappointments for some while satisfying (even rewarding) others. Many deans/provosts admit that these are jobs few would actually want if they knew beforehand what they were getting into, because it can be difficult to exist happily on a campus under a cloud of suspicion, making decisions that will destroy your credibility with one half of the campus one week and render you despised by the other half the next.
Yet I chose administration because of these difficulties; they suit me. I love higher education, am committed to student success, deeply respect faculty and research. Me? I’m a fine (not great) scholar and a respected teacher, but my heart is in the institutional project and has been for a long time.
And while I have endured much incivility in my new administrative life, it would not be fair to attribute the loss of true friendships to mere malice or professional pettiness. In truth, we tried to stay close at first, but whatever suspicions my faculty friends harbored about administration in general hung between us, squashing conversation.
Friday night debriefs over cocktails became impossible if I attended. There were things I couldn’t share, gossip I could no longer indulge in -- guessing other deans’ motives, parsing the language of the president’s latest missive. We have all known the fun of harmless gossip with colleagues; it’s part of being close friends at work. Sadly, the nature of academe itself, with its intractable tension between faculty and administration, had rendered me an outsider, and I could never go back.
Academe has become known for its internecine warring, as a place fraught and gossipy and deeply bifurcated. And often not a little ridiculous. Certainly the gentleman asking Lucy’s advice above works in a place like that.
If you are a career academic reading this, you recognize what I’m saying. You’ve been on one or both sides of this business. As an administrator who still believes in the higher education enterprise, I’d ask faculty to think twice the next time the impulse to plot strikes, and to remember that many of us in administration are just as competent as we were as faculty, and no matter where we are seated on a plane, still as human.
After 15 years as a professor of English, Kellie Bean became an associate dean and then provost.
A significant share of research papers have "guest" authors who didn't perform significant research, or "ghost" authors who did but aren't named, a study finds. Are grad students losing out? Are conflicts of interest hidden?