Humanities

Colleges award tenure

The following individuals have recently been awarded tenure by their colleges and universities:

Marian University

  • Marilyn Bugenhagen, education
  • Greta Kostac, nursing
  • Abbey Rosen, chemistry
  • Matthew Szromba, history

Marquette University

Essay on answering questions about why you want to work at a specific college

Kathryn Hume says you need to focus on the match between you and the specific department and institution, not general issues about why you want a job.

Colleges award tenure

The following individuals have recently been awarded tenure by their colleges and universities:

Alma College

  • Dana Aspinall, English
  • Kathryn Blanchard, religious studies
  • Zhewei Dai, mathematics and computer science
  • Thomas Ealey, business administration
  • Andrew Thall, mathematics and computer science

Lakeland College

Faculty salaries are up 1.9 percent, study finds

Private colleges outpaced publics in size of salary increases, but both lagged inflation, study of four-year institutions finds.

Column on Twitter and scholarly citation

Intellectual Affairs

The Modern Language Association has now issued its official, authoritative, and precisely calibrated guidelines for citing tweets – a matter left unaddressed in the seventh edition of the MLA Handbook (2009). The blogs have been -- you probably see this one coming -- all a-Twitter. The announcement was unexpected, eliciting comments that range from “this is really exciting to me and i don’t know why” to "holy moly i hate the world read a damn book." (Expressions of an old-school humanistic sensibility are all the more poignant sans punctuation.) Somewhere in between, there’s this: "when academia and the internet collide, i am almost always amused."
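
For those who missed the announcement, the gist of the guideline (paraphrased here, not quoted) is to begin with the author's real name and, in parentheses, the Twitter user name; to give the full text of the tweet in quotation marks; and to close with the date, the time, and the medium, "Tweet." A hypothetical works-cited entry for the famous Abbottabad tweet might look something like this:

Athar, Sohaib (ReallyVirtual). "Helicopter hovering above Abbottabad at 1AM (is a rare event)." 1 May 2011, 3:58 p.m. Tweet.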

Yet the real surprise here is that anyone is surprised. The MLA is getting into the game fairly late. The American Psychological Association has had a format for citing both Twitter and Facebook since 2009. Last summer, the American Medical Association announced its citation style after carefully considering "whether Twitter constituted a standard citable source or was more in the realm of ‘personal communications’ (such as e-mail),” finally deciding that tweets are public discourse rather than private expression.

The AMA Style Insider noted that a standard format for Twitter references should “help avoid citations sounding like a cacophony of Angry Birds.”

How long had the possibility of an MLA citation format been under consideration? Was it a response to MLA members demanding a way to cite tweets in their bibliographies, or rather an effort to anticipate future needs? Rosemary Feal, the organization’s executive director, was the obvious person to ask.

"The release of the tweet citation style,” she said by e-mail, “came in response to repeated requests from teachers, students, and scholars (most of them received, perhaps unsurprisingly, over Twitter). We debated the particulars on staff for some weeks. We're certain that the format we've announced is just a first step; user needs will change over time, as will technologies.”

Having exact, authoritatively formulated rules is clearly an urgent, even an anxiety-inducing matter for the MLA’s constituency. “Every time people asked me on Twitter about citing tweets,” Feal said, “I told them MLA style was flexible. Just adapt the format.” And as a matter of fact, the current MLA Handbook does have a format for citing blog entries – which would seem to apply, given that Twitter is a microblog.

“But because people wanted something very specific,” Feal said, “I asked staff to think about it…. Our hope is to remain nimble enough to respond to circumstances as they develop.” In that case, it might be time to start brainstorming how to cite Facebook exchanges, which can certainly be recondite enough, if the right people are involved. The Twitter citation format, at least, will be part of the eighth edition of the MLA Handbook -- though Feal indicated it would take at least another year to finish.

Directing scholarly attention to the incessant flow of 140-character Twitter texts can yield far more substantial results than you might imagine, as explained in this column almost two years ago. Often this involves gathering tweets by the thousands and squeezing them hard, via software, to extract raw data, like so much juice from a vat of grapes. Add the yeast of statistical methodology, and it then ferments into the fine wine of an analogy that’s already gone on far too long.

So let’s try that again. Social scientists have ways of charting trends and finding correlations in tweets en masse. Fair enough. But recent work by Kaitlin L. Costello and Jason Priem points in a different direction: toward Twitter’s role in the more narrowly channeled discussions taking place within scholarly networks.

Costello and Priem, who are graduate students in information and library science at the University of North Carolina at Chapel Hill, have been gathering and analyzing information about academics who tweet. Their findings suggest that Twitter has become a distinct and useful -- if exceedingly concentrated -- mode of serious intellectual exchange.

In one study, they examined the departmental web pages at five universities in the United States and Britain, compiling “a list of all the scholars (defined as full-time faculty, postdocs, and doctoral students) at each one, yielding a sample of 8,826.” Through a process of elimination, they were able to generate a pool of 230 scholars with active Twitter accounts. Out of the initial pool, then, they found one scholar in 40 using Twitter – not a lot, although it’s definitely an underestimation. Some in the pool were removed because Costello and Priem could not establish a link between faculty listing and Twitter profile beyond any doubt. (In the case of people with extremely common names, they didn’t even try.)

The most striking finding is that the scholars who used Twitter were almost indistinguishable from those who didn’t. Status as faculty or nonfaculty made no difference. Natural scientists, social scientists, and humanists were represented among the Twitterati at rates nearly identical to their share of the non-tweeting academic population. Scholars in the formal sciences (math, logic, comp sci, etc.) proved less likely to use Twitter than their colleagues – though only slightly.

A majority of tweets by academics, about 60 percent, were of a non-scholarly nature. A given tweet by a faculty member was about twice as likely to have some scholarly relevance as one by a nonfaculty person. While the share of traffic devoted to strictly scholarly matters is not enormous, its importance shouldn’t be underestimated – especially since a significant portion of it involves the exchange of links to new publications.

In an earlier study (archived here) Costello and Priem conducted interviews with 28 scholars – seven scientists, seven humanists, and 14 social scientists – as well as harvesting more than 46,000 of their tweets. For each subject, they created a set of the 100 most recent tweets containing links that were still active. (A few didn’t reach the 100 mark, but their data was still useful.)

Six percent of the tweets containing hyperlinks fell into the category of what Priem and Costello call “Twitter citations” of peer-reviewed scholarly articles available online. One of their subjects compared linking to a scholarly article via Twitter to citing it in a classroom or seminar setting: “It’s about pointing people in the direction of things they would find interesting, rather than using it as evidence for something.”

At the same time, tweeting plays a role in disseminating new work in particular: 39 percent of the links were to articles less than a week old -- with 15 percent being to things published the same day.

The researchers divided citation tweets into two categories of roughly equal size: direct links to an article, and links to blog entries or other intermediary pages discussing an article (usually with a link to it). Not surprisingly, 56 percent of direct links led to open-access sources. About three-quarters of the indirect links went to material behind a paywall. “As long as intermediary webpages provide even an abstract-level description,” write Costello and Priem, “our participants often viewed them as equivalent.”
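
To make the bookkeeping concrete, here is a minimal sketch, in Python, of the sort of tally involved. The sample records are invented for illustration, and nothing here should be mistaken for Costello and Priem's actual pipeline:

# Minimal sketch (invented data; not Costello and Priem's actual code):
# tally "Twitter citations" by link type and open-access status.
from collections import Counter

# Each record: (link_type, is_open_access)
citations = [
    ("direct", True), ("direct", True), ("direct", False),
    ("intermediary", False), ("intermediary", True), ("intermediary", False),
]

totals = Counter(kind for kind, _ in citations)

for kind in ("direct", "intermediary"):
    open_hits = sum(1 for k, oa in citations if k == kind and oa)
    print(f"{kind} links: {totals[kind] / len(citations):.0%} of citation tweets, "
          f"{open_hits / totals[kind]:.0%} open access")

Scaled up to thousands of harvested tweets, proportions of exactly this kind underlie the 56 percent and three-quarters figures above.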

One scholar told them: “I don’t have time to look at everything. But I trust [the people I follow] and they trust me to contribute to the conversation of what to pay attention to. So yes, Twitter definitely helps filter the literature.” Another said, “It’s like I have a stream of lit review going.”

At this level, Twitter, or rather its users, create a quasi-public arena for the distribution of scholarship – and, to some degree, even for its evaluation. Costello and Priem suggest that harvesting and analyzing these citations could yield “faster, broader, and more nuanced metrics of scholarly communication to supplement traditional citation analysis,” as well as strengthening “real-time article recommendation engines.”

At the MLA convention in January 2011, Amanda French gave a talk that summed up, in its title, a major implication of Priem and Costello’s work: “Your Twitter Followers and Facebook Friends Won’t Read Your Peer-Reviewed Article If They Have to Pay For It, and Neither Will Strangers.” This is true. And its obvious corollary – that open access and scholarly tweeting can magnify an article’s impact considerably – is demonstrated by Melissa Terras, the co-director of the Centre for Digital Humanities at University College London.

On October 16, she made one of her papers available through the UCL online repository. Two people downloaded it. She tweeted and blogged about it on a Friday, whereupon it was downloaded 140 times in short order; she re-tweeted the link on Monday, with the same effect. “I have no idea what happened on the 24th October,” she writes. “Someone must have linked to it? Posted it on a blog? Then there were a further 80 downloads. Then the traditional long tail, then it all goes quiet.”

In all, more than 800 people added the article to their to-read collections in a couple of months – which, for a two-year-old paper called "Digital Curiosities: Resource Creation Via Amateur Digitisation," from the journal Literary and Linguistic Computing, is not bad at all.

That may be another reason why citation formats for Twitter are necessary. One day, and it might be soon, an intellectual historian narrating the development of a theory or argument may have to discuss someone’s extremely influential tweet. Stranger things have happened.

 

Essay on the role of the dictionary

Sometimes I get a little fancy in the final comment of a student paper. Usually my comments are pretty direct: two or three things I like about the paper, two or three things I think need revision, and two or three remarks about style or correctness. But once in a while, out of boredom or inspiration, I grasp for a simile or a metaphor. Recently I found myself writing, "Successfully rebutting counter-arguments is not unlike slaying a hydra.”

I started with great confidence, but suddenly I wasn’t so sure I knew what a hydra is: a multiheaded creature? Yes.  But how many heads?  And can I use the word generically or do I have to capitalize it?  Would “slaying the Hydra” be the correct expression?

Since I have no Internet connectivity at home, never have, and don’t miss it, I grabbed my Webster’s Seventh New Collegiate Dictionary from 1965 — the kind of dictionary you can get for free at the dump or from a curbside box of discarded books — and looked up hydra. On my way to hydra, however, I got hung up on horse, startled by a picture of a horse busily covered with numbers. I knew a horse has a face, a forehead, a mouth. A nose, ears, nostrils, a neck.  A mane, hooves, a tail.

Pressed for more parts, I might have guessed that a horse had a lower jaw, a forelock (which I would have described as a tuft of hair between the ears), cheeks, ribs, a breast, haunches, buttocks, knees, a belly.

I don’t think I would have guessed flank, loin, thighs, and shoulders, words I associate with other animals, humans, or cuts of meat.  I know I wouldn’t have guessed forearm or elbow.

What I’d thought of as an animal with a head, a mane, a tail, hooves, and a body has 36 separate parts, it seems, all enumerated in a simple design on page 401 of my dictionary.  Had I not forgotten the precise definition of a hydra, I might never have learned that a horse also has a poll, withers, a croup, a gaskin, a stifle, fetlocks, coronets, pasterns, and cannons.  (The withers are the ridge between a horse’s shoulder bones.)

Hoof is defined and illustrated on the page opposite the horse, an alphabetical coincidence.  That picture too caught my eye, now that I was in an equine frame of mind.  For the moment, I wanted to learn everything I could about the horse. The unshod hoof, it turns out, has a wall with four parts — the toe, the sidewalls, quarters, and buttresses — a white line, bars, a sole, and a frog, behind which lie the bulbs.

Eventually I returned to my original search. A Hydra with a capital H is a nine-headed monster of Greek mythology whose power lies in its regenerative abilities: if one head is cut off, two will grow in its place unless the wound is cauterized. With a lower case h, the word stands for a multifarious evil that cannot be overcome by a single effort. After all this dictionary work, I’m not sure hydra is the word I want.

I've been thinking about dictionaries lately. The writing center at Smith College, where I work, is transitioning from paper schedules to an online appointment system, and yesterday we spent part of the morning moving furniture around trying to create room for a new computer station dedicated to scheduling. One of my younger colleagues suggested getting rid of the dictionary stand, which, he said, "nobody uses." I bristled. It’s a beautiful thing, the dictionary, an oversize third edition of the American Heritage Dictionary, just a hair over 2,000 pages. For more than a dozen years it’s resided in a cozy nook on a well-lit lectern below a framed poster publicizing the 1994 Annual Katharine Asher Engel Lecture by Murray Kiteley, then Sophia Smith Professor of Philosophy. The poster was chosen as much for its elegance as for the lecture’s title: "Parts of Speech, Parts of the World: A Match Made in Heaven? Or Just Athens?"

For years I had an office across from the dictionary and never used it myself, preferring the handiness of my taped-up 1958 American College Dictionary by Random House. The American Heritage is too massive. It takes me too long to find a word, and I get easily distracted by illustrations and unusual words. I continue to find my college dictionary completely adequate for my purposes. I’ve never needed a word that I couldn’t find in it.

Another colleague within earshot spoke up for the American Heritage, claiming he used it once in a while. "Maybe," I thought. More likely, he didn’t want to contemplate the loss of the big dictionary while he still mourned the loss of the blue paper schedules. The dictionary stayed: words, that’s what a writing center is about, and the dictionary is where they live.

I cannot remember the last time I saw one of my students using a paper dictionary, much less carrying one around, not even an international student. Have today’s students ever instinctively pulled out a paper dictionary and used it to look up a word or check its spelling? Is a paper dictionary as quaint as a typewriter? Have things changed that much? I wonder. Is it partly my fault? It’s been many years, after all, since I’ve listed "a college dictionary" among the required texts for my writing course.

I doubt my students use dictionaries much, of whatever kind. You have to care about words to reach for the dictionary, and I don’t think they care very much about words. At their age, I probably didn’t either, though I think I did care more about right and wrong. I was embarrassed when I used the wrong word or misspelled a word. I still remember the embarrassment of spelling sophisticated with an f in a college paper, something a modern spell checker doesn’t allow. But it does allow "discreet categories" for "discrete categories," another unforgettably embarrassing error — this one in graduate school!

My students appear cheerfully to accept whatever the spell checker suggests, or whatever word sounds like the one they want, especially if they’re in roughly the same semantic domain.  They are positively proud to confess that they’re bad spellers — who among them isn’t? — and really don’t seem to care much that they have used the wrong word.  Words don’t appear to be things you choose anymore. They’re things that pop up: in autocorrect, in spell checkers, in synonym menus. They are not things you ponder over; they are things you click, or, worse, things your laptop clicks for you.

When I meet with a student about her paper, we always work with a paper copy. Even so, more often than not I still have to remind her to take a pencil so she can annotate her draft as we discuss it. Toward the end of our meetings, we talk about word choice and the exchange often goes like this:

"Is this the word you want?"

"I think so."

"I think here you might have meant to say blah."

"Oh, yeah, that’s right" and out comes the pencil — scratch this, scribble that, lest it affect her final grade. No consideration, no embarrassment. I used to pull out the dictionary "to inculcate good habits," but no more. In the presence of today’s students, pulling out a dictionary feels as remote as pulling out a typewriter or playing a record.

Sometimes the situation is not so clear-cut. The student might, for example, write a word like security in a context where it makes a bit of sense, but after some gentle prodding and, yes, a few pointed suggestions, she might decide that what she really means is privacy. Out comes the pencil again. Scratch "security," scribble "privacy." What she really means is safety, though, I think, but I let it go. If I push too hard, she’ll stop thinking I'm being helpful and begin to think I have a problem: "What a nitpicker! The man’s obsessed with words!" I imagine her complaining to her friends. "But it matters! It matters!" goes the imaginary dialogue. "What precisely were the opponents of the ERA arguing, that it would violate security, invade privacy, or threaten safety?"

I have used the online Webster's on occasion, of course, and recognize the advantages of online dictionaries: They can be kept up-to-date more easily, they can give us access to more words than a standard portable dictionary, they can be accessed anywhere at any time, they take up no shelf space, etc. I'm not prejudiced against online reference tools. In fact, unlike many of my colleagues, I'm a great fan of online encyclopedias and a lover of Wikipedia. Online dictionaries leave me cold, though. They should fill me with awe the way Wikipedia sometimes does, but they don't. I marvel at the invention of the dictionary every time I look up a word in my paper copy; at the brilliant evolutionary step of such a book; at the effort of generations of scholars, professionals and lay people that led to such a comprehensive compendium of words; at how much information — and not just word meanings — it puts at my fingertips; at how much I still have to learn; and at how much my education could still be enhanced if I read my college dictionary cover to cover.

I think of The Autobiography of Malcolm X, in which the author makes a powerful statement about the dictionary as a pedagogical tool. Frustrated with his inarticulateness in writing while in prison and his inability to take charge of a conversation like his fellow inmate Bimbi, Malcolm X came to the conclusion that what he needed was "to get hold of a dictionary — to study, to learn some words." The experience was a revelation: "I’d never realized so many words existed!" He started at the beginning and read on, learning not just words but also history — about people, places, and events. "Actually the dictionary is like a miniature encyclopedia," he noted. The dictionary was the start of his "homemade education."

Online, all I get is a quick definition of the word I want, and I’m done. On paper I get the definition plus something akin to a small education along the way. The experience is not unlike that of slaying the Hydra: for every word I look up, I see two others whose meaning I don’t know. If I were Hercules I could put an end to the battle once and for all, but I’m not, and glad I’m not. The battle is far too delicious. But how to convince my students?

Julio Alves is the director of the Jacobson Center for Writing, Teaching and Learning at Smith College.

Essay on how to give a job talk

Remember your audience, kill the jargon and keep everything understandable, writes Kathryn Hume.

Essay on similarities between starting as a parent and a professor

Afshan Jafar considers the similarities between starting a family and an academic career.

Essay on ethical questions raised by letters of recommendation

Trysh Travis considers when it may not be in students' interests to write them letters of recommendation.

Essay on need for evangelical scholars to reclaim Christian thought from fundamentalism

This spring semester, California’s Biola University, among the nation’s largest evangelical institutions, opens the doors of its ambitious new Center for Christian Thought. Resembling institutions such as the Institute for Advanced Study in Princeton, Biola’s center seeks to bring a mix of senior and postdoctoral fellows to campus to collaborate with internal fellows and faculty.

The center is unusual in operating from a distinctly Christian vantage point.  The mission statement is forthright: “The Center offers scholars from a variety of Christian perspectives a unique opportunity to work collaboratively on a selected theme.... Ultimately, the collaborative work will result in scholarly and popular-level materials, providing the broader culture with thoughtful Christian perspectives on current events, ethical concerns, and social trends.”

If the idea of Christian perspectives raises your eyebrows, it might be time to brush up on Augustine, Aquinas, Dante, Pascal, Kierkegaard, Dostoyevsky, Karl Barth, Martin Luther King, Edith Stein, Reinhold Niebuhr, and many others.  Consider, too, the recent scholarship of historians such as Mark Noll, Philip Jenkins, and the Pulitzer Prize winner Edward Larson; political theorists such as Jean Bethke Elshtain and Oliver O’Donovan; scientists such as Sir John Polkinghorne, Francis Collins, and physics Nobel laureate William Phillips; and philosophers such as Charles Taylor, Nicholas Wolterstorff and Alvin Plantinga.

Wolterstorff of Yale and Plantinga of Notre Dame, in fact, joined Biola recently for the inauguration of the Center, conducting a seminar with fellows focused on the Center’s first theme, “Christian Scholarship in the 21st Century: Prospects and Perils.”

Biola’s center is the latest chapter in a comeback of the “evangelical mind.”  While serious scholarship by self-professed evangelical Christians did not disappear entirely in the 20th century, it went into eclipse in the postwar period.  These decades, especially 1960-1980, saw the high-water mark for Western secularism when, contrary to subsequent evidence of religion’s persistence, Time Magazine in 1966 asked on its cover “Is God Dead?”  Social scientists in The New York Times confidently predicted in 1968 that “by the 21st century religious believers are likely to be small sects, huddled together to resist a worldwide secular culture.” 

But of course a funny thing has happened on the way to the 21st century: God and religion came back, and institutions such as Biola are capitalizing on the rediscovery of homo religiosus, both as an object of inquiry and, more relevant for the case at hand, as an inquiring subject.

The eclipse of Christian thought in the 20th century did not derive entirely from the inattention of secularists.  It can also be attributed to evangelicals themselves, insofar as many individuals and institutions clung to some of the more problematic tenets of “Fundamentalism” (originally a term of honor), which had defined itself against “Modernism” in American Protestantism’s epic internecine conflict that played out in the early 20th century, culminating in the Scopes “monkey” trial in 1925. 

At stake was the interpretation of the Bible. Liberal Protestants, “Modernists,” were attracted to both Darwin’s theory of evolution and historical criticism of the Bible, wafting across the Atlantic, primarily from German universities.  “Fundamentalists,” on the other hand, opposed these currents, convinced that they represented a mortal threat to what had recently become known as the Bible’s “inerrancy.”  Founded in 1908, Biola was squarely in the Fundamentalist camp.  (Its first dean, R. A. Torrey, in fact, was a major contributor to The Fundamentals [1910-15], the multivolume “statement” of Protestant Fundamentalism, published at Biola, then called the Bible Institute of Los Angeles.)

Stung by ridicule after the Scopes trial, Fundamentalists retreated to the sidelines of American culture.  There they nurtured a parallel universe of publishing houses, magazines, journals, radio stations, and, not least, colleges and universities to combat the threat of secularism from without and the threat of theological modernism from within.  One might see this as little more than the predictable, age-old flight of obscurantism from enlightenment.  But Fundamentalists were not without good reasons to consider their retreat as necessary to protect Christian supernaturalism and the authority of the Bible from the acids of modernity that they believed were corroding the pulpit and pew of fellow believers. 

Fundamentalists carried into exile many core tenets of Christian orthodoxy -- the Trinity, the Incarnation, the Atonement -- shared by Catholic and Orthodox Christians as well.  But they also carried dubious novelties, such as newfangled teachings on biblical inerrancy and speculations about the End Times.  What is more, they became pointedly hostile toward American culture and disengaged from serious intellectual pursuits, convinced that Christianity was almost exclusively about “the world to come,” with only negligible concern for the here-and-now.

All of this has begun to change in the past quarter century: evangelical Christians have been shedding their “fundamentalist baggage” and reclaiming a place within deeper traditions of Christian learning and at the table of American cultural life.  Signs abound of this recent shift, clearly in evidence by the mid-1990s.  In 1994 Mark Noll (formerly of Wheaton College in Illinois, now holding an endowed chair at Notre Dame) published The Scandal of the Evangelical Mind, calling evangelicals to repent of past anti-intellectualism and honor the Creator of their minds with first-order inquiry and creative expression.  The book became a manifesto of sorts for younger evangelicals attracted to the life of the mind.  Nineteen ninety-four also witnessed the publication of George Marsden’s The Soul of the American University: From Protestant Establishment to Established Nonbelief, analyzing the secularization of mainline Protestant universities and offering a blueprint for revitalized “Christian scholarship.”  

In 1995 the journal Books & Culture was launched; it has become a leading organ of evangelical thought.  Significant funding initiatives of the Pew Charitable Trusts and the Lilly Endowment — such as the Lilly Fellows Program at Valparaiso University — also empowered a new generation of engaged Christian scholars, including evangelicals.  These developments, together with the influence of scholars like Wolterstorff and Plantinga and the emergence of evangelical Christians into key places of academic leadership — such as the presidencies of Nathan Hatch at Wake Forest and Ken Starr at Baylor — put a new face on evangelicalism.  As such, it bears little resemblance to your grandmother’s backwoods open-tent revival anymore, but represents, to quote the title of a much-regarded book by D. Michael Lindsay, president of Gordon College, Faith in the Halls of Power: How Evangelicals Joined the American Elite.

Periphery movements seeking the legitimacy of the center crave the approbation of others.  This has been true of the evangelical intellectual resurgence (sometimes to the point of obsequiousness).  That approbation has not been slow in coming.  In 2000, the movement received a boost from Alan Wolfe’s cover story in The Atlantic Monthly, “The Opening of the Evangelical Mind,” in which he argued that evangelicals, long the wayward stepchildren of serious Christian thought, had begun at last to exhibit some intellectual heft.  Catholics, too, have taken notice.  Writing in Commonweal, the historian James Turner of Notre Dame described contemporary evangelical intellectual life as “something to be reckoned with.”  And the impact has begun to be felt in the academy at large, as C. John Sommerville indicates in his book The Decline of the Secular University.

The Unwelcome Ghost of Fundamentalism

Is the launch of Biola’s Center for Christian Thought a victory lap for American evangelical intellectual life or at least another level attained on the purgatorial ascent toward intellectual respectability?  The answer is as complicated as the question is timely.

It should not go unacknowledged, however, that the desire for respectability is fraught with dangers from the standpoint of Christian spirituality.  In John Bunyan’s Pilgrim’s Progress, one of the more dangerous tempters encountered is Mr. Worldly Wiseman, who seeks to lure the protagonist, “Christian,” off the path toward the Celestial City, not by sin or heresy, but by compromising accommodations to moral duty, legality, and the approval of “the world.”  C. S. Lewis argues a similar point in his essay “The Inner Ring”; nothing will corrupt a good man as incrementally, imperceptibly, and thoroughly as when he is mastered by the desire to sit at the table of the wealthy, the influential, the respected.  Dante’s Inferno is populated by the educated and well-heeled. 

But beyond the problem of Mr. Worldly Wiseman is the problem of Biola itself.  The problem of Biola, however, is not the problem of Biola alone; it is shared by a number of the more than 115 evangelical schools in the Council for Christian Colleges and Universities (CCCU), the largest umbrella network of evangelical institutions of higher learning.  The problem is, quite simply, lingering attachment to some of the more dubious certainties and habits derived from Fundamentalism and hardened by the Fundamentalist-Modernist controversies of the 20th century. 

This presents two acute problems for the emerging evangelical mind.  First, in a well-intentioned effort to avoid “scientism” — the belief that all knowledge claims must conform to standards of evidence found in the “hard sciences” — it perpetuates skepticism about science itself.  Second, lingering fundamentalist accents put these institutions in a deficient and compromised position vis-à-vis more venerable and enduring resources of fides quaerens intellectum, faith seeking understanding — traditions going back to the seminaries of the Reformation era, the universities and monasteries of the Middle Ages, and the earliest formulations of Christian teachings in the creeds and councils of the early church.

This compromised position might be illuminated by examining Biola’s Doctrinal Statement.  While such statements should not be presumed to capture the actual range of belief on a given campus, they are crucial for understanding a school’s identity and history and how it wants to be understood by its constituents.  And since faculty at many evangelical colleges, Biola among them, are required to express agreement with doctrinal statements, the statements serve a gatekeeping function, even as they sometimes provoke dilemmas of conscience over the scope of possible interpretation.

Biola’s statement expresses time-honored Christian doctrines — Creation, the Trinity, the divinity of Christ, and so on.  But it also contains some dubious innovations, pertaining to the Bible, especially in regard to teachings on eschatology or the End Times.  Few topics in the history of Christianity have been subject to more unhinged conjecture than this one, and America has recently witnessed a much-publicized forecast of Doomsday on May 21, 2011 (later unsuccessfully revised to October 21) by the end-times guru Harold Camping. 

Wise theologians encourage great caution in interpreting the opaque Scriptural passages that speak of an apocalypse.  The Biola statement, however, requires a definitive stance in favor of a spectacular end-times scenario brought to life in Tim LaHaye’s bestselling Left Behind novels.  Based on a theological scheme known as pre-millennial dispensationalist eschatology, this position holds that prior to the beginning of God’s Eternal Kingdom at the end of time, there will be a thousand-year reign of Christ on earth.  The nation of Israel will play a central role in bringing the blessings of salvation to all nations during the millennium in fulfillment of biblical prophecy.  What is more, a “rapture” of the sort predicted to occur on May 21/October 21 will take place, inaugurating the millennial kingdom.

While not without antecedents, modern dispensationalist theology of this sort largely derives from the teachings of one man: John Nelson Darby (1800-1882), an Irish minister who traveled to North America and led a small denomination known as the Plymouth Brethren (or Darbyites).  For reasons that still confound historians, Darby’s influence on conservative American Protestantism in the late 19th and the 20th centuries has been immense.  We largely have him to thank for the rapture fearmongering as expressed in books such as Hal Lindsey’s The Late Great Planet Earth (among the bestselling books on any topic in the 1970s) and the Left Behind books, with sales in excess of 60 million, and the spin-off movies.  Such apocalypticism owes much to Darby’s interpretation of the biblical books of Daniel, Ezekiel, and Revelation, and of a single, cryptic passage in the New Testament, which speaks of believers being “caught up in the clouds” to meet the Lord in the sky (I Thessalonians 4:17).

Biola, too, insists that its faculty affirm that “before … [the] millennial events, believers will be caught up to meet the Lord in the air.”  To piece all this together: the same institution that has unveiled this ambitious Center for Christian Thought shares a theological legacy with the folks who gave us Left Behind.

But the situation gets even stickier because this highly particularistic eschatology is often thought to be of a piece with biblical inerrancy, which is another problematic topic.  The idea of Scripture as being the authoritative, inspired word of God has enduring sanction in the Christian tradition, one embraced, mutatis mutandis, by Church fathers, Scholastic theologians, and Protestant reformers alike.  But this central affirmation took a questionable turn as a result of the Fundamentalist-Modernist controversies, with a view toward blocking any reconciliation of Darwinian evolution with the Genesis account of creation.   Accordingly, the first paragraph in Biola’s Doctrinal Statement reads: 

The Bible, consisting of all the books of the Old and New Testaments, is the Word of God, a supernaturally given revelation from God Himself, concerning Himself, His being, nature, character, will and purposes; and concerning man, his nature, need and duty and destiny.  The Scriptures of the Old and New Testaments are without error or misstatement in their moral and spiritual teaching and record of historical facts. They are without error or defect of any kind (emphases added).

Doubtless with sincere intentions, Biola sought to build a firewall against those who presumed too much latitude in interpreting the creation story of human origins.  To further reduce wiggle room, a subsequent “Explanatory Note” warns against deficient understandings of human origins: “Inadequate origin models hold that (a) God never directly intervened in creating nature and/or (b) humans share a common physical ancestry with earlier life forms.”

But the latter prohibition raises profound questions in light of recent work on human and other genomes.  Common ancestry today is, quite simply, as well-established in biology as the motion of the earth about the sun is in astronomy.  To attempt to exclude faculty who might hold this view is tantamount to closing one's eyes in the face of an encyclopedia of genetic information.  To be sure, philosophical naturalism or rejection of belief in the creational dignity of human beings does not necessarily follow from common ancestry, as thinkers such as Alvin Plantinga, Francis Collins, and Pope Benedict XVI have argued with great profundity; but the categorical denial of common ancestry puts Biola fundamentally at odds with the entire direction of modern biology.

But, again, Biola is not an isolated case. Some CCCU colleges go still further, mandating belief in a “Young Earth” view, a literal six-day creation.  The mission statement of Master’s College in California, for example, states: “We teach that the Word of God is... absolutely inerrant in the original documents, infallible, and God-breathed.  We teach the literal, grammatical-historical interpretation of Scripture which affirms the belief that the opening chapters of Genesis present creation in six literal days (Genesis 1:31; Exodus 31:17).”  Or, as Cedarville University in Ohio puts it: “We believe in the literal 6-day account of creation.”

The wording of faith statements on biblical inerrancy sometimes stresses that the Bible is the “only” source of theological and ethical authority.  (By contrast, most 16th-century Protestant reformers saw it as the “highest” authority.)  While designed to fend off Modernist Protestantism, which often took its cues from science and history, such language has, historically, succored evangelicalism’s longstanding opposition to Roman Catholicism -- which looks to its own magisterium for authority in interpreting Scripture.  Such inerrancy statements function to keep Catholics off the faculty at a number of evangelical institutions.

Several years back, a cause célèbre unfolded at Wheaton College in Illinois, arguably evangelicalism’s flagship institution, when a philosophy professor, Joshua Hochschild, converted to Catholicism.  Appealing to Vatican II’s statement on the Bible, Dei Verbum, Hochschild indicated that he could still sign Wheaton’s statement of faith in good conscience.  That was not enough for Wheaton’s then-president Duane Litfin, who, willy-nilly finding himself the authoritative interpreter of the Catholic magisterium, gave Hochschild a year of grace before asking him to seek employment elsewhere.

Cases like this are hot topics on some evangelical campuses, because Catholics have emerged as evangelicals’ most reliable partners on a host of moral and theological beliefs.  Witness, for example, the fervent evangelical support of (Catholic) Rick Santorum in the current Republican primary.  In the academy, Catholic writers and thinkers such as G. K. Chesterton, J. R. R. Tolkien, Thomas Merton, Flannery O’Connor, and John Paul II, and many others are widely trumpeted. 

So students increasingly find themselves scratching their heads when, upon finishing a term paper on, say, Mother Teresa’s charity, they discover that an invisible but very real “Catholics Need Not Apply” sign hangs over the door at Human Resources.  In an age of deepening Catholic-evangelical ecumenism, this might prove especially problematic for the evangelical intellectual revival, because, as D. Michael Lindsay argues, Catholic scholarship has been a “boon” and a “model” for evangelicals, who “now draw on a vast array of source material that is rooted in the Catholic tradition.”

But there is yet a thornier problem with statements of faith at many evangelical colleges: the priority given to declarations on the Bible and its inerrancy by placing them first, before other theological affirmations.  Here again, culpability rests with a pinched biblicism left over from Fundamentalism’s fiery struggle against Modernism.  But guarding against liberalism has had the unintended and unhappy consequence today of fostering a broader disengagement, separating many evangelical colleges, not just from liberal Protestantism, but from deeper and more enduring traditions of Christianity. 

Going back to the Nicene Creed of 325, Christian creeds have generally begun with a statement about the nature of God, not about the medium through which knowledge of Him is obtained.  “We believe in one God, the Father Almighty,” begins the Nicene Creed, setting the template.  In the 20th century, many evangelical colleges departed from this venerable tradition by beginning with a statement about the medium, often as an expedient to identify “insiders” and “outsiders” in controversies over the Bible.  Statements about the Bible thus often function less at a theological level than as a social mechanism for “maintaining safe identity boundaries,” as the Notre Dame sociologist Christian Smith observes in his book, The Bible Made Impossible: Why Biblicism is Not a Truly Evangelical Reading of Scripture.

The Challenge of the Future

In recent years, much media attention has been devoted to the passing from the scene of a generation of older populist, firebrand evangelical leaders, such as Jerry Falwell, Pat Robertson, and James Dobson.  Far less attention has been devoted to an arguably more consequential sphere of influence for American evangelicalism: the retirement of leaders at key evangelical colleges and universities and an incoming new generation far less shaped by Modernist-Fundamentalist debates of yesterday.  These leaders often trenchantly perceive the tensions and problems outlined in this essay. 

But they find themselves in a classic Catch-22.  The future lies with continuing to exorcize the ghost of fundamentalism -- championing endeavors such as the Center for Christian Thought at Biola, but providing them with a more nourishing institutional theological environment.  Less Dispensationalism and biblicism, as one scholar has quipped, and more C. S. Lewis, Dietrich Bonhoeffer, and Martin Luther King.  The theological distortions of the recent past, however, weigh heavily on the present.  “Fundamentalist intellectual habits,” writes Mark Noll, “have been more resilient than fundamentalism itself.”  

What is more, old-guard defenders of the status quo, convinced that the residue of fundamentalism is simply “what the Bible plainly teaches,” are not in short supply among donors, board members, and vocal alumni.  They would likely perceive changes such as admitting Catholic faculty, constructively engaging evolution, or modifying statements of faith away from simplistic biblicism as greasing the slippery slope toward perdition.

To be fair, the old guard’s worries are not entirely unfounded: imprudently pursued reforms would put some evangelical colleges at risk, setting them on the hackneyed path of becoming yet another liberal arts college estranged from its founding religious mission.  If these schools are to maintain a distinctive mission, then judicious hiring practices and faith statements are not beside the point, not only to ensure a clear mission but — and one can argue this on liberal grounds — to foster a rich institutional diversity in American higher education.  But affirming the significance of a religiously distinctive identity can co-exist with the worry that some of the current lines have been drawn in self-defeating places.

The antidote to imprudence, of course, is not inaction, but prudence, one of the four cardinal virtues in the classical and Christian intellectual tradition.  Indeed, prudence should not be mistaken for caution or timorousness.  Rather, in the thought of Thomas Aquinas, it means knowing and pursuing the good in the most realistic, thoughtful way possible.  In the current climate of evangelical higher education, this also requires the virtue of courage; leaders must find ways to educate their colleges' constituents and not simply avoid offending them.  They must balance concern about donor pocketbooks and faithfulness to an institution’s particular heritage with a still deeper faithfulness to the Christian faith itself and its profounder intellectual traditions.  In pursuing reforms, they must convince critics that they are not dishonoring a school’s legacy, but pruning it of spurious accretions for more durable growth in the future.

As is the case with most worthwhile pursuits, the opportunities to err abound, while the path to success is fraught with difficulties.  But Christians, of all people, should be accustomed to seeking the narrow way.  And if those in the secular academy would welcome institutions more likely to produce the next Bonhoeffer or Martin Luther King, instead of the next Falwell or Tim LaHaye, they, too, will wish evangelical colleges much success and Godspeed.

Thomas Albert Howard is the Stephen Phillips Chair of History at Gordon College, in Massachusetts, and author of God and the Atlantic: America, Europe, and the Religious Divide (Oxford, 2011), among other works. Karl W. Giberson runs a science and religion writing workshop at Gordon College and is author, with Randall Stephens, of The Anointed: Evangelical Truth in a Scientific Age (Belknap/Harvard University Press).
