It was a fairly typical lunch at an academic conference in the East after the New Hampshire primary in 2008. There was a smattering of endowed professorships and international reputations at the table, perhaps eight academics in all.
Along with the sweet tea and penne pasta came the inevitable skewering of George W. Bush.
"Never has a president experienced such horrible poll approval numbers in the midst of a war," one professor quipped.
"That is, if you overlook Harry Truman," I interjected into an uncomfortable silence.
It was going to be that kind of meal.
Dessert made its appearance and talk turned to the relative merits of the developing college basketball season and presidential candidates. Hillary Clinton and Barack Obama were hotly debated – the state’s primary promised to be a pivotal one. Then it was onto the Republicans, and Mitt Romney’s name popped up.
"I couldn’t vote for a Mormon," one professor said. There was some polite (or perhaps impolite) head-bobbing. "It’s a cult. Very intolerant, and their opinions about women, and, well ... ” and his voice trailed off.
I mentioned I had just been hired at a college in the West with a sizeable student and local population of Mormons -- Idaho State University, in Pocatello. I wondered rhetorically whether anyone said the same thing in 1960 about voting for John F. Kennedy because he was Roman Catholic. Or for then-Senator Obama because he is African-American. There was that same uncomfortable silence again. I think they felt sorry for me.
I’ve attended numerous scholarly conferences since that lunch where Mormonism has been discussed, and it is astonishing to confront snide and disdainful comments, and even overt prejudice, from intelligent and sophisticated academics. And it seems perfectly acceptable to express this bias. Mormons are abnormal, outside the mainstream; everybody knows that. They don’t drink alcohol or coffee. Their women are suppressed. They don’t like the cross, and their most holy book seems made up. And there’s that multiple-wives thing. At one session involving a discussion of Utah’s history, several dismissive comments were made, rather blithely and without any sense of embarrassment. Belittling comments were made about Mormons' abstemiousness, and there was a general negative undercurrent. The LDS Church was referred to as the Mormon Church, something many members object to. They don’t mind being called Mormons, but their church is the Church of Jesus Christ of Latter-day Saints, or LDS Church. At least some of the professors making these remarks knew that.
Yes, Mormons do not embrace the cross as a symbol of Christianity, but that is because they consider it a representation of state-sanctioned execution and intense suffering. I regard it as a symbol of sacrifice on my behalf. Who’s right? Various Christian denominations believe that during communion the wine and wafers actually are transformed into the body and blood of Christ – and over the centuries Christians have been derided as cannibals for it. I was raised to believe that the Eucharist represents the sacrifice of Jesus. These are nothing more than different perspectives and beliefs.
Mormons are excoriated in popular culture (see: "The Simpsons") for the way their church was created by someone who was kind of a con man. And the translation of the Book of Mormon was accomplished with a hat. And the Golden Tablets have been lost. Hmmm. The stone tablets of the Ten Commandments were misplaced, too. And a burning bush talking? Really? It comes down to faith, as it should. Not some sort of ignorant bigotry.
Many of the academics consider themselves liberal, socially responsible, and broad-minded individuals, the repository of the best in America. They’re proud of themselves for voting for Barack Obama (a bit too smug, maybe?). They would splutter and bluster and be generally outraged to be considered prejudiced. None would consider saying anything similar about African-Americans, Muslims, Jews, Native Americans . . . well, you get the idea. But anti-Mormonism is part of the same continuum that contains discrimination against any group. Why, then, is it allowable to publicly express bias against Mormons?
In 2009, The Daily Beast compiled a listing of the top 25 safest and 25 most dangerous college campuses in America, based on two-year per capita data from 9,000 campuses with at least 6,000 students. The two states with the highest proportion of Mormons did pretty well in the safest category: #5 was Idaho State University, Pocatello, where I work; #13 was Utah State University, Logan, and #17 was Brigham Young University, Provo, Utah. No Utah or Idaho schools were on the most dangerous list.
And yet, nestled in the midst of all the good publicity, was this comment about BYU: "Joseph Smith’s golden plates would have been safe at Brigham Young." Would the Daily Beast have said this: “The tablets of the Ten Commandments would have been safe at Brandeis University" or "at Notre Dame University?” Not very likely. But this sort of flippant and biased comment about Mormons is somehow socially acceptable. Responsible people don’t use "Indian giver" anymore (and we shouldn't). But we Welch on deals and get away Scot-free. I have a sprinkling of Welsh and Scottish blood in me, and I don't appreciate those comments.
So what, exactly, is so awful about being Mormon?
Utah is about 72 percent Mormon, so it's a pretty good representation of Mormonism. Among the 50 states, Utah has the lowest child poverty rate, the lowest teen pregnancy rate, the third-lowest abortion rate, the third-highest high school graduation rate at 94 percent, the highest scores on Advanced Placement exams, the fewest births to unwed mothers (along with the highest overall birthrate), the lowest cancer rate, the lowest smoking rate, the lowest per capita rate of alcohol use, and, arguably, the most comprehensive and universal state health insurance system in the U.S.
Furthermore, Mormons as a group have the lowest rates of violence and depression among religious groups, are seven times less likely to commit suicide (if active church members), and have the lowest divorce rates of any social-religious group. Sixty-five percent of Utah residents have personal computers, the highest penetration rate in the country. Crime has decreased in the state of Utah by anywhere from 15-18 percent over the past 10 years.
Mormon women are more likely to be employed in professional occupations than Catholic or Protestant women (similar to Jewish women) and more likely to graduate from college than Catholic or Protestant women (but less than Jewish women). One survey indicated Mormon women experience more orgasms and are more satisfied with their married lives than non-Mormons.
Plenty of religious groups – from Orthodox Jews to Orthodox Muslims to various mainstream Christian denominations – do not allow women full participation in the life of their church and communities. But disparaging Roman Catholics, for instance, because their church does not allow female clergy, isn’t a knee-jerk reaction to that faith. Yes, Mormon women wear less revealing clothing – no plunging necklines and short-shorts. But is modesty a bad thing?
Glenn Beck is a Mormon, but so is Harry Reid. Other famous Mormons are or were: Harmon Killebrew, Jack Dempsey, J. W. Marriott, Gladys Knight, the Osmonds, Butch Cassidy, and Eldridge Cleaver. What does that tell you about Mormonism? Absolutely nothing.
Sure, many people find it annoying to have Mormon missionaries knock on their doors. But what kind of moral and religious conviction must it take to devote up to two years of your life in service to a higher calling, whether it be community service or religious proselytizing? Isn’t this the sort of commitment we want to encourage in young people, who are too often accused of being selfish and jaded? Students who have been to Mongolia, Paraguay, and Finland enrich my classes, not diminish them.
At about 13 million members, Mormons are a pretty large cult. So what is so bad about this “cult”? And it is a cult growing at almost exactly the same rate, decade by decade, as the original Christian church in the 1st and 2nd centuries. It makes no sense, but then bigotry doesn’t. Who wouldn’t want to be on those lists? They seem like good things to be, even if you can’t drink coffee or beer, wear more than one earring per ear, grow a beard (frowned upon only if you want to move up the church hierarchy), or show lots of cleavage. You can have as much hot chocolate and ice cream as you want, though, and I have embraced this provision enthusiastically.
When I first moved to Pocatello, I lived in a cul de sac, and seven of my nine neighbors belonged to the LDS Church. Nobody tried to convert me. They invited me to church picnics – no pressure. My next-door neighbor spent nearly two hours one weekday morning (he was late to work) helping me restore my snow blower to life after five years in the humid South. Another helped flush and fix my sprinkler system. A third returned my dogs after they’d escaped. Several just showed up with family members to help me move in. Yet another tossed me the keys to his Cadillac after the transmission in my Suburban disassembled on my driveway. "Bring it back when you don’t need it anymore," he said.
These are not the faces of intolerance and prejudice.
No. Those faces are in the academic mirror.
I was raised as a member of the United Church of Christ – the same denomination as President Obama and the Rev. Jeremiah Wright – and my sister is an ordained minister in the denomination. I am now Episcopalian. An uncle and aunt and several of my first cousins are Mormons; the first was converted while stationed with the Marine Corps in Hawaii.
Just why is it socially acceptable to denigrate and trivialize and insult a class of people as a class of people? They had a name for that sort of behavior and system in the South a few decades back. You may remember it. It was called Jim Crow.
Thomas C. Terry is associate professor of mass communication at Idaho State University.
Around this time last year, I noted with interest -- via that late-night bastion of (t)reason, "The Colbert Report" -- that the color-coded terror index, which has been an ongoing but perhaps absurd part of our recent lives, was being retired.
That index, which charts the level of terror threat, painted a fairly redundant rainbow of fear from green (mild -- having a cup of tea with radical groups who have given up all thoughts of violent protest) to red (extreme -- hunkered down hoping to be somewhere just outside the blast radius, gripping a bootleg Kalashnikov in case Al Qaeda knocks on the door).
The index had a largely hostile audience among many commentators and the American public, because it seemed to do little other than engender, if I might borrow some words from W.B. Yeats, a “terrified, vague” panic. It wasn’t particularly informative or functional. To select green would have been an act of folly, a kind of professional or political hubris no career could possibly survive if anything went wrong. Red, on the other hand, was little more than an admission of blind, hand-wringing panic. That left an elevated, cautious alert, the familiar yellows and oranges of our recent years, as the wise and inevitable response to a pervasive and undisclosed terror threat. And who needs a color chart to encourage that?
I grew up in Britain during the Irish Republican Army terror campaigns of the '70s and '80s, and I was only really made aware of my innate wariness when I moved to the United States in the '90s, and discovered a society largely unaware of the implications of terror in the domestic space.
But that has now largely changed by virtue of experience. The American public is largely schooled now in the anxieties of terror by spectacular instances of it, whether home-grown in Oklahoma City or international in New York, and by the ever-hungry cable news cycle.
And, unfortunately, because of those experiences, it has been forced into becoming an old hand at watchfulness. So, the terror alert, practically redundant, seemed largely an object lesson in the mass-marketing of anxiety: ultimately inducing elevated panic, some have posited, as a form of control.
Stephen Colbert’s thought-provoking suggestion noting the paralyzing effect of fear-mongering was that the retiring color code be updated and changed simply to "Quiet," or "Too Quiet."
But while those terror charts are now a thing of the past, a relic of older ways of thinking, destined to gather dust covertly in some office space, now otherwise empty because of austerity measures, or perhaps to swim peacefully in the quiet back waters of a frustrated think tank, I think there is a way for us to recycle, reuse and restore them — in higher education, and particularly in the liberal arts.
I think there is a place for them in the halls of academe, where we can mount the charts prominently close to office and teaching spaces. Of course, they should be edited for context. While academe has often played out as a context for personal or political terror, it seems more appropriate if the new fear index should be financial (and, paradoxically, thereby intellectual).
Green could read "low or no risk of financial exigency," though of course those days, if they ever existed, are long gone. I read about them, as a doctoral student in England, in David Lodge novels about the academic Promised Land in America, as one might read of the unicorn. Blue, a relaxed but guarded watchfulness, would mean that the conference expenses are still covered, but we might, collegially, forego dessert on the expense report. Elevated yellow tells us that it's time for faculty to tighten their belts, and forget about cost-of-living raises, or bold capital investment. Orange, that dreaded state of high financial exigency, might indicate furloughs, lost tenure lines, or even disappearing programs. Of course anything is possible, plausible or permissible when it hits “severe” red.
I can imagine that there are institutions and administrations all over which would gladly jump at the opportunity to place these prominently on display because of the chilling effect of fear, and the “terrified, vague” paralysis in which it often results. It would be a strong, visceral reminder that faculty should be quiet and grateful that we have our jobs. It would keep us meek while critical inroads are being made into faculty governance, academic standards, tenure, and full-time instruction.
It would stymie protest while traditional and essential parts of the academy like philosophy, history, literature and languages revert to a service function for a notional "liberal arts" core while students otherwise pursue their vocational certification in nursing, or air conditioning repair, or increasingly popular faddish pseudo-subjects like leadership.
As an industry we would put up with the growth of adjunct instruction, the increase in class sizes, the loss of civility in discourse, even the loss of discourse itself, along, of course, with the loss of instruction and instructors, to inanities like third-party computer software — as though something that belongs in the university were little more than checking off the boxes in a defensive driving class. Why, with one of those charts on the wall, we could replace entire language departments with Rosetta Stone, and entire English departments with carefully selected PBS programming. Why, even biology departments could be lost to interactive screenings of "Shark Week," and no one would be willing to say a word. Such is the effect of terror.
I don't mean to downplay the need for austerity as a response to real financial urgency. Just like the terrorist threat, it's out there, but it’s often not everywhere it seems to be. No doubt all of it would be easier to bear, however, if we had those advisory charts on our walls, even as administrative salaries and administrative support staff make steady gains in the midst of a profound state of financial exigency. After all, we want the very best to lead us out of this kind of financial crisis. And we all know that if you want the very best, it’s going to cost you. Except, of course, in the case of faculty, of instruction, of academics, and of education in general.
David Mulry is chair of English and foreign languages at Schreiner University.