It was a fairly typical lunch at an academic conference in the East after the New Hampshire primary in 2008. There was a smattering of endowed professorships and international reputations at the table, perhaps eight academics in all.
Along with the sweet tea and penne pasta came the inevitable skewering of George W. Bush.
"Never has a president experienced such horrible poll approval numbers in the midst of a war," one professor quipped.
"That is, if you overlook Harry Truman," I interjected into an uncomfortable silence.
It was going to be that kind of meal.
Dessert made its appearance and talk turned to the relative merits of the developing college basketball season and presidential candidates. Hillary Clinton and Barack Obama were hotly debated – the state’s primary promised to be a pivotal one. Then it was on to the Republicans, and Mitt Romney’s name popped up.
"I couldn’t vote for a Mormon," one professor said. There was some polite (or perhaps impolite) head-bobbing. "It’s a cult. Very intolerant, and their opinions about women, and, well ... ” and his voice trailed off.
I mentioned I had just been hired at a college in the West with a sizeable student and local population of Mormons – Idaho State University, in Pocatello. I wondered rhetorically whether anyone said the same thing in 1960 about voting for John F. Kennedy because he was Roman Catholic. Or for then-Senator Obama because he is African-American. There was that same uncomfortable silence again. I think they felt sorry for me.
I’ve attended numerous scholarly conferences since that lunch where Mormonism has been discussed, and it is amazing to confront snide and disdainful comments, and even overt prejudice, from intelligent and sophisticated academics. And it seems perfectly acceptable to express this bias. Mormons are abnormal, outside the mainstream; everybody knows that. They don’t drink alcohol or coffee. Their women are suppressed. They don’t like the cross, and their most holy book seems made up. And there’s that multiple-wives thing. At one session involving a discussion of Utah’s history, several dismissive remarks were offered, rather blithely and without any sense of embarrassment. Belittling comments were made about Mormons' abstemiousness, and there was a general negative undercurrent. The LDS Church was referred to as the Mormon Church, something many members object to. They don’t mind being called Mormons, but their church is the Church of Jesus Christ of Latter-day Saints, or LDS Church. At least some of the professors making these remarks knew that.
Yes, Mormons do not embrace the cross as a symbol of Christianity, but that is because they consider it a representation of state-sanctioned execution and intense suffering. I regard it as a symbol of sacrifice on my behalf. Who’s right? Various Christian denominations believe that during communion the wine and wafers actually are transformed into the body and blood of Christ – and over the centuries Christians have been derided as cannibals for it. I was raised to believe that the Eucharist represents the sacrifice of Jesus. These are nothing more than different perspectives and beliefs.
Mormons are excoriated in popular culture (see: "The Simpsons") for the way their church was created by someone critics dismiss as a kind of con man. And the translation of the Book of Mormon was accomplished with a hat. And the golden tablets have been lost. Hmmm. The stone tablets of the Ten Commandments were misplaced, too. And a burning bush talking? Really? It comes down to faith, as it should, not some sort of ignorant bigotry.
Many of the academics consider themselves liberal, socially responsible, and broad-minded individuals, the repository of the best in America. They’re proud of themselves for voting for Barack Obama (a bit too smug, maybe?). They would splutter and bluster and be generally outraged to be considered prejudiced. None would consider saying anything similar about African-Americans, Muslims, Jews, Native Americans... well, you get the idea. But anti-Mormonism is part of the same continuum that contains discrimination against any group. Why, then, is it acceptable to publicly express bias against Mormons?
In 2009, The Daily Beast compiled a list of the 25 safest and 25 most dangerous college campuses in America, based on two years of per capita data from 9,000 campuses with at least 6,000 students. The two states with the highest proportion of Mormons did pretty well in the safest category: #5 was Idaho State University in Pocatello, where I work; #13 was Utah State University in Logan; and #17 was Brigham Young University in Provo, Utah. No Utah or Idaho schools were on the most dangerous list.
And yet, nestled in the midst of all the good publicity, was this comment about BYU: "Joseph Smith’s golden plates would have been safe at Brigham Young." Would The Daily Beast have said, “The tablets of the Ten Commandments would have been safe at Brandeis University,” or “at the University of Notre Dame”? Not very likely. But this sort of flippant and biased comment about Mormons is somehow socially acceptable. Responsible people don’t use "Indian giver" anymore (and we shouldn't). But we Welch on deals and get away Scot-free. I have a sprinkling of Welsh and Scottish blood in me, and I don't appreciate those comments.
So what, exactly, is so awful about being Mormon?
Utah is about 72 percent Mormon, so it's a pretty good representation of Mormonism. Among the 50 states, Utah has the lowest child poverty rate, the lowest teen pregnancy rate, the third-lowest abortion rate, the third-highest high school graduation rate (94 percent), the highest scores on Advanced Placement exams, the fewest births to unwed mothers (and the highest overall birthrate), the lowest cancer rate, the lowest smoking rate, the lowest per capita alcohol use, and, arguably, the most comprehensive and universal state health insurance system in the U.S.
Furthermore, Mormons as a group have the lowest rates of violence and depression among religious groups, and they have the lowest divorce rates of any social-religious group; active church members are seven times less likely to commit suicide. Sixty-five percent of Utah residents have personal computers, the highest penetration rate in the country. Crime in Utah has decreased by 15 to 18 percent over the past 10 years.
Mormon women are more likely to be employed in professional occupations than Catholic or Protestant women (similar to Jewish women) and more likely to graduate from college than Catholic or Protestant women (but less than Jewish women). One survey indicated Mormon women experience more orgasms and are more satisfied with their married lives than non-Mormons.
Plenty of religious groups – from Orthodox Jews to Orthodox Muslims to various mainstream Christian denominations – do not allow women full participation in the life of their church and communities. Yet disparaging Roman Catholics because their church does not ordain women, for instance, isn’t anyone’s knee-jerk reaction to that faith. Yes, Mormon women wear less revealing clothing – no plunging necklines and short-shorts. But is modesty a bad thing?
Glenn Beck is a Mormon, but so is Harry Reid. Other famous Mormons are or were: Harmon Killebrew, Jack Dempsey, J. W. Marriott, Gladys Knight, the Osmonds, Butch Cassidy, and Eldridge Cleaver. What does that tell you about Mormonism? Absolutely nothing.
Sure, many people find it annoying to have Mormon missionaries knock on their doors. But what kind of moral and religious conviction must it take to devote up to two years of your life in service to a higher calling, whether community service or religious proselytizing? Isn’t this the sort of commitment we want to encourage in young people, who are too often accused of being selfish and jaded? Students who have been to Mongolia, Paraguay, and Finland enrich my classes; they don’t diminish them.
At about 13 million members, Mormons are a pretty large cult. So what is so bad about this “cult”? And it is a cult growing at almost exactly the same rate, decade by decade, as the original Christian church in the 1st and 2nd centuries. It makes no sense, but then bigotry doesn’t. Who wouldn’t want to be on those lists? They seem like good things to be, even if you can’t drink coffee or beer, wear more than one earring per ear, grow a beard (frowned upon only if you want to move up the church hierarchy), or show lots of cleavage. You can have as much hot chocolate and ice cream as you want, though, and I have embraced this provision enthusiastically.
When I first moved to Pocatello, I lived in a cul-de-sac, and seven of my nine neighbors belonged to the LDS Church. Nobody tried to convert me. They invited me to church picnics – no pressure. My next-door neighbor spent nearly two hours one weekday morning (he was late to work) helping me restore my snow blower to life after five years in the humid South. Another helped flush and fix my sprinkler system. A third returned my dogs after they’d escaped. A fourth tossed me the keys to his Cadillac after the transmission in my Suburban disassembled itself on my driveway. "Bring it back when you don’t need it anymore," he said. Several more just showed up with family members to help me move in.
These are not the faces of intolerance and prejudice.
No. Those faces are in the academic mirror.
I was raised as a member of the United Church of Christ – the same denomination as President Obama and the Rev. Jeremiah Wright – and my sister is an ordained minister in the denomination. I am now Episcopalian. An uncle and aunt and several of my first cousins are Mormons; the first was converted while stationed with the Marine Corps in Hawaii.
Just why is it socially acceptable to denigrate and trivialize and insult a class of people as a class of people? They had a name for that sort of behavior and system in the South a few decades back. You may remember it. It was called Jim Crow.
Thomas C. Terry is associate professor of mass communication at Idaho State University.
Around this time last year, I noted with interest -- via that late-night bastion of (t)reason, "The Colbert Report" -- that the color-coded terror index, which has been an ongoing but perhaps absurd part of our recent lives, was being retired.
That index, which charts the level of terror threat, painted a fairly redundant rainbow of fear from green (mild -- having a cup of tea with radical groups who have given up all thoughts of violent protest) to red (extreme -- hunkered down hoping to be somewhere just outside the blast radius, gripping a bootleg Kalashnikov in case Al Qaeda knocks on the door).
The index had a largely hostile audience among many commentators and the American public, because it seemed to do little other than engender, if I might borrow some words from W.B. Yeats, a “terrified, vague” panic. It wasn’t particularly informative or functional. To select green would have been an act of folly, a kind of professional or political hubris no career could possibly survive if anything went wrong. Red, on the other hand, was little more than an admission of blind, hand-wringing panic. That left an elevated, cautious alert, the familiar yellows and oranges of our recent years, as the wise and inevitable response to a pervasive and undisclosed terror threat. And who needs a color chart to encourage that?
I grew up in Britain during the Irish Republican Army terror campaigns of the '70s and '80s, and I was only really made aware of my innate wariness when I moved to the United States in the '90s, and discovered a society largely unaware of the implications of terror in the domestic space.
But that has now largely changed by virtue of experience. The American public is now largely schooled in the anxieties of terror by spectacular instances of it, whether home-grown in Oklahoma City or international in New York, and by the ever-hungry cable news cycle.
And, unfortunately, because of those experiences, it has been forced into becoming an old hand at watchfulness. So, the terror alert, practically redundant, seemed largely an object lesson in the mass-marketing of anxiety: ultimately inducing elevated panic, some have posited, as a form of control.
Stephen Colbert’s thought-provoking suggestion, noting the paralyzing effect of fear-mongering, was that the retiring color code simply be replaced by two levels: "Quiet" and "Too Quiet."
But while those terror charts are now a thing of the past, a relic of older ways of thinking, destined to gather dust covertly in some office space, now otherwise empty because of austerity measures, or perhaps to swim peacefully in the quiet backwaters of a frustrated think tank, I think there is a way for us to recycle, reuse and restore them — in higher education, and particularly in the liberal arts.
I think there is a place for them in the halls of academe, where we can mount the charts prominently close to office and teaching spaces. Of course, they should be edited for the setting. While academe has often played out as a context for personal or political terror, it seems more appropriate that the new fear index be financial (and, paradoxically, thereby intellectual).
Green could read "low or no risk of financial exigency," though of course those days, if they ever existed, are long gone. I read about them, as a doctoral student in England, in David Lodge novels about the academic Promised Land in America, as one might read of the unicorn. Blue, a relaxed but guarded watchfulness, would mean that conference expenses are still covered, but we might, collegially, forgo dessert on the expense report. Elevated yellow tells us that it's time for faculty to tighten their belts and forget about cost-of-living raises or bold capital investment. Orange, that dreaded state of high financial exigency, might indicate furloughs, lost tenure lines, or even disappearing programs. Of course, anything is possible, plausible or permissible when it hits “severe” red.
I can imagine that there are institutions and administrations all over that would gladly jump at the opportunity to put these charts prominently on display, because of the chilling effect of fear and the “terrified, vague” paralysis in which it often results. It would be a strong, visceral reminder that faculty should be quiet and grateful to have jobs at all. It would keep us meek while critical inroads are being made into faculty governance, academic standards, tenure, and full-time instruction.
It would stymie protest while traditional and essential parts of the academy like philosophy, history, literature and languages revert to a service function for a notional "liberal arts" core, as students otherwise pursue vocational certification in nursing, or air conditioning repair, or increasingly popular faddish pseudo-subjects like leadership.
As an industry, we would put up with the growth of adjunct instruction, the increase in class sizes, the loss of civility in discourse, even the loss of discourse itself, along, of course, with the loss of instruction and instructors to inanities like third-party computer software — as though something that belongs in the university were little more than checking off the boxes in a defensive driving class. Why, with one of those charts on the wall, we could replace entire language departments with Rosetta Stone, and entire English departments with carefully selected PBS programming. Why, even biology departments could be lost to interactive screenings of "Shark Week," and no one would be willing to say a word. Such is the effect of terror.
I don't mean to downplay the need for austerity as a response to real financial urgency. Just like the terrorist threat, it's out there, but it’s often not everywhere it seems to be. No doubt all of it would be easier to bear, however, if we had those advisory charts on our walls — even the steady gains in both administrative salaries and administrative support staff in the midst of a profound state of financial exigency. After all, we want the very best to lead us out of this kind of financial crisis. And we all know that, except in the case of faculty, of instruction, of academics, and, of course, of education in general, if you want the very best, it’s going to cost you.
David Mulry is chair of English and foreign languages at Schreiner University.
Established orthodoxy holds that the ideal pedagogical method centers on small, discussion-based classes. Such a model enables "active learning" that, coupled with on-the-spot guidance from a skilled faculty member, is much more likely to change deep thought patterns than traditional lecture-based approaches. The emphasis shifts from the assimilation of content (and its regurgitation) to learning how to learn — how to be a better reader, how to think more critically and creatively, how to collaborate with others in the task of learning.
Few would doubt that this model sounds very appealing. Yet the experience of many educators inclines them to believe that it is unrealistic. Students, it seems, are generally too unmotivated to make it work. As a result, discussion falls into the all-too-familiar patterns: a mostly silent classroom tunes out as the same students dominate the conversation, more intent on getting “participation points” than advancing anyone’s understanding. Even worse, students apparently don’t regard the discussion model as an ideal and often actively prefer lectures. After all, why should they have to listen to the free associations of their peers when they’re paying a lot of money to have access to an expert? Thus, teaching evaluations often push us away from current thinking on “best practices.”
My early experience in teaching also led me to believe that the discussion model was overhyped. Yes, it could work in grad school or even in upper-level courses — but first-year students just didn’t know enough. Everything changed, though, when I started teaching at Shimer College, a small liberal arts institution in Chicago with a distinctive discussion-centered pedagogy based on a Great Books curriculum. Within the first few weeks of teaching there, I realized that the central problem with the pedagogical ideal of small, discussion-based classes is that hardly anyone is really doing it. Many pay lip service to it, but administrative pressure to increase class sizes and a lack of buy-in from faculty ensure that the ideal model always remains a supplement to more traditional methods.
What Shimer’s approach showed me is that if you’re going to use a discussion-centric model, it has to be the main event. I think of the skills required to succeed in such a pedagogical model as being like a foreign language: you can’t learn them in a handful of supplementary discussion sections per week. The very best way to learn them, of course, is through immersion. The way this works at Shimer is that every single class, from day one, is a small, discussion-centric class, where class participation accounts for roughly half of a student’s final grade. There is no way to “opt out” of the hard work of discussion: students have to figure out how to learn in this style if they are going to succeed at all.
First-year courses can certainly be difficult, though the amount of progress from the first to the second semester is often remarkable. All the familiar pitfalls of class discussion make an appearance: the vague free-association, the off-topic remarks, the sense of competing monologues that don’t quite come together into a real conversation. The faculty’s primary job in these class sessions isn’t so much to supply content as to help students get over these problems and become productive participants.
The key to cultivating a productive discussion, in my view, is Shimer’s Great Books curriculum. Many associate such curricula with cultural conservatism and a narrow focus on “dead white males,” but that is misleading. For me, the importance of the model stems from three crucial pedagogical advantages. First, it provides a center of reference and authority for the classroom other than the professor — or the students’ own personal opinions. The standard for whether students are on-topic is whether they can support their views from the text, and the standard for whether a remark is helpful is whether it advances our understanding of the text. Second, the emphasis on reading primary source texts means that the texts reward and require discussion. In contrast to a textbook or an introductory secondary source, primary sources don’t come “pre-digested” and must be worked on.
Third, it allows us to get past the dreaded “why are we reading this” syndrome: the model guarantees that the texts under discussion are always widely agreed to be worthy of attention. The exact configuration of core texts, of course, varies from school to school. St. John’s College, for example, one of the leading Great Books institutions in the country and an indispensable point of reference for all such programs, takes a basically chronological approach in its core reading list, with a strong emphasis on classical antiquity and without much concern for disciplinary boundaries — but still with considerable diversity, particularly in the modern period. Shimer’s core curriculum follows what’s known as the Hutchins model, which is divided into the three primary disciplinary areas of humanities, social sciences, and natural sciences, and includes a greater emphasis on more contemporary works. Other colleges use other models, but the shared feature is a concern to choose texts that students will agree that one “should” read — with no requirement that any text be written by someone who is dead, white, male, or any of the above.
The combination of the discussion model with the use of primary texts creates a situation where students are forced to take responsibility for their own education. Instead of getting the material pre-digested in the form of lectures and introductory textbooks, they have to grapple with it themselves. Class sessions then become a chance to actively work on the text with an experienced faculty member and a group of similarly motivated peers. To me, the real turning point in a Shimer education comes when students come to fully understand this and hold themselves and each other accountable for their contributions. Things don’t automatically go smoothly after that point — people are people, after all, and personalities are bound to clash in unpredictable ways — but the older students generally take an active role in working to solve the problems that do arise, rather than tuning out when things don’t go to their liking.
All this leads me to believe that when the ideal model is used in a thorough-going, uncompromising way, it really is ideal. Yet I can already anticipate an objection: this all sounds great, but it would cost too much. In an era when colleges and universities are constantly trying to cut costs through large classes and online education, Shimer’s approach admittedly may seem unrealistic. I believe, however, that it’s not a matter of “cost” in an abstract sense, but rather a matter of priorities. At Shimer, the priority is classroom instruction, and everything else takes a back seat to that. We have no athletic programs, a relatively low number of administrators (with academic administrative responsibilities rotating among current faculty members), and no buildings to maintain (we lease space from the Illinois Institute of Technology). Faculty salaries are lower than average, but aside from a handful of courses (taught by semi-retired faculty members or administrators with academic expertise), all teaching is done by full-time faculty. Overall, Shimer manages to remain faithful to its model while keeping tuition levels comparable to other small liberal arts schools — without having the luxury of a large endowment.
Another possible objection is that the outcome is ideal because Shimer students are ideal — this would never work at a less selective institution. It is true that Shimer students, like the students at basically all small liberal arts colleges, tend to be more privileged by most measures. Even more crucial, in my view, is the fact that Shimer’s student body tends to be very self-selecting: students are very clear about what the college is offering, and they aren’t going to attend if they aren’t interested in our pedagogical model.
I’ve spoken of the lack of faculty buy-in at other institutions, but I think this points to an even more important factor: student buy-in. If students don’t care, if they’re enrolled for utilitarian reasons and have no intrinsic love of learning, they will most likely wind up failing — and dragging the class down with them. Hence it seems to me that less-selective institutions could offer an optional program for interested students, much like those at two of the City Colleges of Chicago (Harold Washington and Wilbur Wright Colleges). Shimer has worked with Harold Washington in particular for many years, and several of their Great Books students have ultimately finished their four-year degrees at Shimer as a result. Many other community colleges around the country have found success with Great Books programs as well.
The more difficult problem, though, is what to do with students who have the motivation but are less academically prepared. Shimer deals with this in part through an innovative scholarship program in which students come to campus for a day to engage in the kinds of discussion and writing we require — and they can earn a full-tuition scholarship on the strength of their performance alone, regardless of their official credentials. However, one could argue that this merely allows us to reach students who really do already have the skills but haven’t signaled those skills in the accepted ways. One might suspect that something similar is going on in community college programs, which often tend to attract the more precocious students.
One potential solution might be to organize the core curriculum, at least in the early stages, explicitly around difficulty or accessibility. This might mean starting with more contemporary works (Toni Morrison rather than Shakespeare, for example) or works with more immediate contemporary relevance (Foucault’s Discipline and Punish rather than Kant’s Critique of Pure Reason). It might also mean focusing on works that appeal with particular urgency to one’s target population. For instance, a Great Books program serving students on the South Side of Chicago might do well to lead off with great works in the African-American tradition, then branch into other intellectual traditions with which those works are in dialogue.
More broadly, a new Great Books program that aims to serve underprepared students should be bold and experimental, ruthlessly cutting works that fail to connect with students and reaching in unexpected directions for those that do. If one needs to start with films and graphic novels in order to get the discussion started, even that shouldn’t be out of bounds if one embraces the view that the point of the Great Books curriculum isn’t solely to represent a particular vision of our cultural heritage, but to cultivate a collaborative learning environment that allows and requires students to take an active role in their own education.
Developing ways to make this type of curriculum more widely available is hugely important as a matter of justice — why shouldn’t everyone have the opportunity to try their hand at the “ideal” pedagogical model? On my more cynical days, I do agree with the view that there are some students who are simply never going to be motivated enough to do this kind of work, who are in college just because their parents are making them, or because they feel like they vaguely “should” be, or because they want to get a good job. Yet I don’t think it’s idealistic or unrealistic to assume that there are students who really do love learning and who are coming to college to pursue that love, at least in part.
In fact, I think we should ask ourselves whether our supposed “realism” about students’ abilities and motivations is foreclosing the possibility for students to really blossom. We should consider the possibility that it is precisely the more passive instructional methods that we “realistically” embrace that in part produce the “reality” (boredom, instrumentalization of learning) that those methods are supposedly responding to. Under different circumstances, perhaps even some of my best Shimer students could have wound up resigning themselves to tuning out and resentfully waiting for the professor to just tell them what’s on the test — and by the same token, I suspect that some of those bored students could be successful in a model like Shimer’s if given the chance.