Review of Carl H. Klaus, A Self Made of Words: Crafting a Distinctive Persona in Nonfiction Writing

Intellectual Affairs

Not long ago, this column took up the perennial issue of academic prose and how it gets that way. On hand, fortunately, was Michael Billig’s Learn to Write Badly, a smart and shrewd volume that avoids mere complaint or satirical overkill.

Bad scholarly writing is, after all, something like Chevy Chase’s movie career. People think that making fun of it is like shooting fish in a barrel. But it’s even easier than shooting fish in a barrel: to borrow Todd Barry’s assessment of a comedic colleague, “It’s as easy as looking at fish in a barrel. It’s as easy as being somewhere near a barrel.” Besides, the mockery has gone on for at least 500 years (it began with Rabelais, if not before), so it’s not as if there are many new jokes on the subject.

But Billig did make an original and telling point in his critique of pure unreadability – one I neglected to emphasize in that earlier column. It has come into clearer view since then thanks to a new book by Carl H. Klaus called A Self Made of Words: Crafting a Distinctive Persona in Nonfiction Writing (University of Iowa Press).

Klaus is professor emeritus of English at the University of Iowa and founder, there, of the Nonfiction Writing Program. He is also a practitioner and critic of the genre of the personal essay, and A Self Made of Words seems largely addressed to the students, formal or otherwise, who want to learn the craft. Scholarly discourse rarely assumes the guise of the personal essay, of course. But Klaus’s insights and advice are not restricted to that literary form, and his book should have a tonic effect on anyone who wants his or her writing to do more than paint gray on gray.

To put it another way, A Self Made of Words doesn't stress writing in the personal voice, but rather the persona that always operates in writing, of whatever variety, whether formal or informal, autobiographical or otherwise.

Klaus wrote an earlier book called The Made-Up Self: Impersonation in the Personal Essay (Iowa), which I have not had a chance to read, but I assume it goes into the original use of the word persona, meaning, in Latin, a mask, of the stylized kind ancient actors wore on stage to project a character. The author of even the flattest and most objective or empirically minded paper creates or displays a persona while writing: one that is self-effacing and indistinct, yes, but that manifests its authority precisely through that self-effacement and the absence of first- and second-person address.

Impersonality, in other words, implies a persona. So does the introspective voice and intimate tone of a memoirist, with countless shades of formality and casualness, of candor and disguise, possible in between. The persona is not something that stands behind or apart from the written work, though it may seem to do so. The raw material of the persona is language itself -- not just the vocabulary or syntax an author uses, but the differences in intonation that come from using contractions or avoiding them, from the mixture of concrete and abstract terms, and from the balance of long and short words.

Klaus devotes most of the new book to how those elements, among others, combine to create effective writing -- which is, in his words, “the result of a complex interaction between our private intentions and the public circumstances of our communication.” It is not a style guide but a course of instruction on the options available to the writer who might otherwise be unable to craft a persona fit to purpose.

Which, alas, is often the case. Michael Billig did not discuss the academic author’s persona in his book on how to write badly and influence tenure committees – at least, not as such. But it is implicit in his argument about how apprentice scholars orient themselves within the peculiar, restricted language-worlds their elders have created while fighting to establish their claims to disciplinary authority.

In effect, they learn how to write by wearing the personae they’ve been given. And there’s nothing wrong with that, in itself; the experience can be instructive. But the pressure to publish (and in quantity!) makes it more economical to rely on a prefabricated writerly persona, stamped out in plastic on an assembly line, rather than to shape one, as Klaus encourages the reader to do.

Academic Minute: Israeli and Palestinian Identity

In today’s Academic Minute, Atalia Omer of the University of Notre Dame discusses the role of cultural and religious identity in the ongoing Israeli-Palestinian conflict. Learn more about the Academic Minute here.

 


Essay on the impact of adaptive and competency-based learning on traditional-age students

Several decades ago -- long before the level of technological sophistication we experience today -- I was part of a movement begun by the late Julian Stanley, a psychology professor, and the Johns Hopkins University Center for Talented Youth (CTY) to save academically talented youth from boredom in the schools. The most controversial instrument for rescuing them was a pedagogical practice called, rather prosaically, "Diagnostic Testing Followed by Prescriptive Instruction" or, in shorthand, “DT>PI.” It was applied principally to the pre-collegiate mathematics curriculum and relied on just a few key assumptions and practices (a minimal code sketch follows the list):

1. Students already know something about a subject before they formally study it.

2. Test students before a course begins and then just instruct them on what they don’t know.

3. Test students again when you as the instructor and they as learners believe they have competency in a subject.

4. Move immediately to the next level of instruction.
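
For readers who want the cycle stated mechanically, here is a minimal sketch in Python. It is purely illustrative: every name in it (the Student class, diagnostic_test, the sample topics) is hypothetical, reflecting no actual CTY software, only the four steps above.

    from dataclasses import dataclass, field

    @dataclass
    class Student:
        mastered: set = field(default_factory=set)

        def learn(self, topic: str) -> None:
            # Stand-in for prescriptive instruction on a single gap.
            self.mastered.add(topic)

    def diagnostic_test(student: Student, topics: list[str]) -> list[str]:
        # Steps 1-2: assume prior knowledge, test first, return only the gaps.
        return [t for t in topics if t not in student.mastered]

    def dt_pi(student: Student, curriculum: list[list[str]]) -> None:
        for level in curriculum:
            for gap in diagnostic_test(student, level):
                student.learn(gap)  # Step 2: instruct only on what is unknown.
            # Step 3: re-test; proceed only when competency is demonstrated.
            assert not diagnostic_test(student, level)
            # Step 4: the next loop iteration moves immediately to the next level.

    # A student who already knows fractions is never re-taught them.
    s = Student(mastered={"fractions"})
    dt_pi(s, [["fractions", "decimals"], ["pre-algebra"]])
    print(sorted(s.mastered))  # ['decimals', 'fractions', 'pre-algebra']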

The DT>PI model was placed in a more generous context through the adaptation of the Optimal Match, a notion developed by Professor Hal Robertson of the University of Washington. Simply stated, the pace and level of instruction should optimally match an individual student’s assessed abilities — with the caveat that those assessing that talent would always try to stretch a student beyond his or her comfort zone. The Optimal Match could theoretically apply to all students at any level of education.

When I used to speak publicly in a wide variety of settings — at colleges and universities, community colleges, schools, education association meetings, parent gatherings — about what I thought to be the commonsensical notions of DT>PI and the Optimal Match, the reactions were pronounced and fiercely negative. My colleagues and I were accused of presenting educators with the dissolution of the structured classroom as we knew it then; of forcing students unjustifiably to proceed without sufficient instructional guidance; of destroying the communal, cooperative imperative of an American education; and of producing social misfits, because students would finish academic coursework ahead of the schedule established (rather arbitrarily, I might add) by educational professionals for all students of one age at one time. Parents often joined with educators to decry such imagined alienation and damage to a child’s personality.

And then there was a change in 2013.

There are now two closely related pedagogies -- adaptive learning and competency-based learning -- that are embraced by a growing number in higher education as a viable component of educational reform. The Bill and Melinda Gates Foundation is awarding grants to 18 institutions to experiment with 10 different adaptive learning platforms, and President Obama has expressed support for these innovations and urged easing of regulations to make that possible.

In general, adaptive learning uses data-driven information to design coursework that permits students to proceed at the pace and level appropriate to them. And competency-based learning frees students from “seat time,” allowing them to progress flexibly as they demonstrate mastery of academic content.

These definitions, when combined, delineate precisely the key components of DT>PI and those of numerous other experiments in self-paced learning over the last few decades. But now, while the naysayers are still out there, an increasing number of for-profits and nonprofits are turning to adaptive and competency-based learning as a component of the next stage of reform in American education.

Why now? Something must have changed for society to accept self-paced, individualized learning when only decades ago it was roundly rejected on pedagogical, ethical and psychological grounds. Those concerns are clearly not as inviolate as they once were. Answering this question might well provide education reformers with insight into what is now possible — even expected — from students on the learning platforms of the future.

There are at least three reasons why self-paced learning might be more popular now than it was only a few years ago: technological advances, financial exigency and a new self-profile of the learner.

Advances in technology that rely on data mining and data analytics -- predicting an individual’s future learning behavior from the analysis of thousands of earlier learners -- now permit student learning to be tracked, directed, customized, evaluated and advised at instantaneous speeds. What in previous decades seemed an impossible task for a teacher or professor to manage in a single course -- diverse learning points among students -- is now at least technically feasible.
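
To make the predictive idea concrete, here is a toy sketch in Python. It is hypothetical throughout (the records are invented, and this is not any vendor's actual algorithm): it recommends a student's likely next topic from the records of earlier learners whose histories contain the student's own.

    from collections import Counter

    earlier_learners = [  # invented mastery records, for illustration only
        {"fractions", "decimals", "pre-algebra"},
        {"fractions", "decimals"},
        {"fractions"},
    ]

    def predict_next(current, records):
        # Keep only earlier learners who mastered everything this student has.
        similar = [r for r in records if current <= r]
        # Count what those similar learners went on to master beyond that.
        counts = Counter(topic for r in similar for topic in r - current)
        return counts.most_common(1)[0][0] if counts else None

    print(predict_next({"fractions"}, earlier_learners))  # 'decimals'

Real platforms work at vastly larger scale and with far richer models, but the shape of the inference -- match a learner to similar predecessors, then steer by their statistics -- is the same.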

Many institutions are intent on finding new strategies that will reduce their cost of providing an education. Adaptive and competency-based learning are thought to be such "disruptive" opportunities, although how the accompanying data-driven, all-knowing, anticipating, high-touch technologies will dramatically reduce both cost and price (tuition) remains elusive.

And, lastly, students have perhaps finally realized the expectation of the self-esteem movement that has dominated instruction in our nation’s schools for several decades. Students might well now believe that they are the center of all activity — including education — and that they are both the sole focus and the drivers of learning. All instructional effort exists to fulfill their desires.

This "power shift" makes learners, individually — not teachers or professors -- aggregators of knowledge by and for themselves. Any approach to education that places them at the center of learning activity accommodates their perspective on education. Adaptive and competency-based learning accomplish this masterfully. Self-paced, individually adjusted instruction, enhanced by “big data” technologies that guide student progress “lockstep” in a course and beyond, eliminates distracting elements to the individual control of knowledge. Primary among those distractions for students are faculty with their pesky, seemingly inefficient and irrelevant questions.

And thus, in 2013, what was not acceptable several decades ago is now thought a solution to the crisis in American education. A combination of new technologies, financial emergency and a shift in who is at the center and in control of learning has caused this to occur.

But all is not settled. The changing circumstances introduce concerns that did not exist decades ago, when students were not the arbiters of their own learning, when self-paced instruction was thought a solution not for all students in American education but only for the academically talented, and when big data did not exist to mine and anticipate every move in student learning.

A defining element of DT>PI was that students must not just study the next logical step in a course; they must, through the exhortations of a teacher or professor, attempt to go beyond what was thought statistically possible — they must stretch themselves intellectually at every point. Professor Stanley used to quote constantly the poet Robert Browning’s line that one’s reach must always exceed one’s grasp.

Questions remain whether, in the absence of a live instructor’s exhortation, students who are not necessarily academically acute and motivated will extend their reach or settle for the statistically generated achievement delivered by an electronic adviser. (I am referring here to traditional-age undergraduates, not working adults, who are propelled by substantial motivational factors.) Such an absence of exhortation could be extremely damaging to the majority of American students, who, without personal mentorship, often do not attempt to achieve at the levels of which they are capable.

And one traces, in those who are enthralled with "big data" and "data analytics" as cures for the maladies of American education, a disturbing belief: that students will achieve, through data-enhanced technologies, the perfectibility of education — perhaps of life itself — by eliminating all resistance, frustration, indecision, trial and error, chance and expenditure of time. For example, Gary King, a Harvard University social scientist and university professor, is quoted in "Laptop U," a May 20, 2013, New Yorker article, as saying, “With enough data over a long period, you could crunch inputs and probabilities and tell students, with a high degree of accuracy, exactly which choices and turns to make to get where they wanted to go in life.”

And yet there is growing commentary that it is precisely the absence of frustration, resistance and associated imperfections in the so-called “Me Generation” and its aftermath that is compromising contemporary students’ learning and preparation for life. When educators blithely accept students’ assertion of self-determination without legitimate maturing experiences (which will include failure and self-doubt), and when they arrange learning electronically so that students make no wrong decisions, they grant students little ability to deal with the inevitable disappointment and frustration of life.

Students are educated without gaining resilience, and that is hardly an education of which a nation can be proud, or by which it can be made secure, regardless of the utopian promises of the big data enthusiasts. All this reminds me of a call I received decades ago from an entrepreneur who wanted me to comment on his idea of developing a school basketball court whose hoop would move electronically with the ball so that no student would ever miss a shot and thus, in his words, "suffer humiliation."

So while I am delighted that self-paced education, in the form of adaptive and competency-based learning, is finally a more generally discussed component of reform in American education, I urge those advancing it to think long and hard about some of the humanly damaging consequences of learning platforms so perfected by technology that students are offered a Faustian bargain: the comfort of resistance-free, frustration-free learning in exchange for the ultimate loss of the resilience needed for a satisfying life after schooling.


 

William G. Durden is president emeritus and professor of liberal arts at Dickinson College, and operating partner at Sterling Partners, a private equity company.


Academic Minute: Climate and Brain Function

In today’s Academic Minute, Timothy Roth of Franklin and Marshall College explores the link between local climate and brain capacity within wide-ranging species. Learn more about the Academic Minute here.

 


Colleges award tenure

The following individuals have recently been awarded tenure by their colleges and universities:

American University

Essay on the real death of the humanities

In all those years I was pursuing a Ph.D. in religious studies, the question of what my profession really stood for rarely came up in conversation with fellow academics, save for occasional moments when the position of the humanities in higher education came under criticism in public discourse. When such moments passed, it was again simply assumed that anyone entering a doctoral program in the humanities knowingly signed on to a traditional career of specialized research and teaching.

But the closer I got to receiving that doctorate, the less certain I became that this was a meaningful goal. I was surrounded by undergraduates who were rich, well-meaning, and largely apathetic to what I learned and taught. I saw my teachers and peers struggle against the tide of general indifference aimed at our discipline and succumb to unhappiness or cynicism. It was heartbreaking.

Fearing that I no longer knew why I studied religion or the humanities at large, I left sunny California for a teaching job at the Asian University for Women, in Chittagong, Bangladesh. My new students came from 12 different countries, and many of them had been brought up in deeply religious households, representing nearly all traditions practiced throughout Asia. They, however, knew about religion only what they had heard from priests, monks, or imams, and did not understand what it meant to study religion from an academic point of view. And that so many of them came from disadvantaged backgrounds convinced me that this position would give me a sense of purpose.

I arrived in Bangladesh prepared to teach an introductory course on the history of Asian religions. But what was meant to be a straightforward comparison of religious traditions around the region quickly slipped from my control and morphed into a terrible mess. I remember an early lesson: When I suggested during a class on religious pilgrimage that a visit to a Muslim saint’s shrine had the potential to constitute worship, it incited a near-riot.

Several Muslim students immediately protested that I was suggesting heresy, citing a Quranic injunction that only Allah should be revered. What I had intended was to point out that a similar tension existed in Buddhism over circumambulation of a stupa — an earthen mound containing the relics of an eminent religious figure — since that act could be seen as both remembrance of the deceased’s worthy deeds and veneration of the person. But instead of provoking a thoughtful discussion, my idea of comparative religious studies seemed only to strike students as blasphemous.

Even more memorable, and comical in hindsight, was being urged by the same Muslim students in my class to choose one version of Islam from among all its sectarian and national variations and declare it the best. Whereas Palestinians pointed to the "bad Arabic" used in the signage of one local site as evidence of Islam’s degeneration in South Asia, a Pakistani presented Afghans as misguided believers because — she claimed — they had probably never read the entire Quran. While Bangladeshis counseled me to ignore Pakistanis from the minority Ismaili sect, who claim that God is accessible through all religions, Bangladeshis themselves were ridiculed by other students for not knowing whether they were Sunni or Shi’a, the two main branches of Islam. In the midst of all this, my call to accept these various manifestations of Islam as intriguing theological propositions went unheeded.

With my early enthusiasm and amusement depleted, I was ready to declare neutral instruction of religion in Bangladesh impossible. But over the course of the semester I could discern one positive effect of our classroom exercise: students’ increasing skepticism toward received wisdom. In becoming comfortable with challenging my explanations and debating competing religious ideas, students came to perceive any view of religion as more an argument than an indisputable fact. They no longer accepted a truth claim at face value but analyzed its underlying logic in order to evaluate the merit of the argument. They expressed confidence in the notion that a religion could be understood in multiple ways. And all the more remarkable was their implicit decision over time to position themselves as rational thinkers and to define their religions for themselves.

An illustrative encounter took place at the shrine of the city’s most prominent Muslim saint. I, being a man, was the only one among our group to be allowed into the space. My students, the keeper of the door said, could be "impure" — menstruating — and were forbidden to enter. Instead of backing down as the local custom expected, the students ganged up on the sole guard and began a lengthy exposition on the meaning of female impurity in Islam. First they argued that a woman was impure only when she was menstruating and not at other times; they then invoked Allah as the sole witness to their cyclical impurity, a fact the guard could not be privy to and thus should not be able to use against them; and finally they made the case that if other Muslim countries left it up to individual women to decide whether to visit a mosque, it was not up to a Bangladeshi guard to create a different rule concerning entry. Besieged by a half-dozen self-styled female theologians of Islam, the man cowered, and withdrew his ban.

I was incredibly, indescribably proud of them.

Equally poignant was coming face to face with a student who asked me to interpret the will of Allah. Emanating the kind of glow only the truly faithful seem to possess, she sat herself down in my office, fixed the hijab around her round alabaster face, and quietly but measuredly confessed her crime: She had taken to praying at a Hindu temple because most local mosques did not have space for women, and she was both puzzled and elated that even in a non-Islamic space she could still sense the same divine presence she had been familiar with all her life as Allah. She asked for my guidance in resolving her crisis of faith. If other Muslims knew about her routine excursions to a Hindu temple, she would be branded an apostate, but did I think that her instinct was right, and that perhaps it was possible for Allah to communicate his existence through a temple belonging to another religion?

In the privacy of my office, I felt honored by her question. I had lectured on that very topic just before this meeting, arguing that sacred space was not the monopoly of any one religion, but could be seen as a construct contingent upon the presence of several key characteristics. This simple idea, which scholars often take for granted, had struck her as a novel but convincing explanation for her visceral experience of the Islamic divine inside a Hindu holy space. Though she had come asking for my approval of her newly found conviction, it was clear that she did not need anyone’s blessing to claim redemption. Humanistic learning had already provided her with a framework under which her religious experience could be made meaningful and righteous, regardless of what others might say.

And thanks to her and other students, I could at last define my own discipline with confidence I had until then lacked: The humanities is not just about disseminating facts or teaching interpretive skills or making a living; it is about taking a very public stance that above the specifics of widely divergent human ideas exist more important, universally applicable ideals of truth and freedom. In acknowledging this I was supremely grateful for the rare privilege I enjoyed as a teacher, having heard friends and colleagues elsewhere bemoan the difficulty of finding a meaningful career as humanists in a world constantly questioning the value of our discipline. I was humbled to be able to see, by moving to Bangladesh, that humanistic learning was not as dispensable as many charge.

But before I could fully savor the discovery that what I did actually mattered, my faith in the humanities was again put to a test when a major scandal befell my institution. I knew that as a member of this community I had to critique what was happening after all my posturing before students about the importance of seeking truth. If I remained silent, it would amount to a betrayal of my students and a discredit to my recent conclusion that humanistic endeavor is meant to make us not only better thinkers, but also more empowered and virtuous human beings.

So it was all the more crushing to be told to say nothing by the people in my very profession, whose purpose I thought I had finally ascertained. In private chats my friends and mentors in academe saw only the urgent need for me to extricate myself for the sake of my career, but had little to say about how to address the situation. Several of my colleagues on the faculty, though wonderful as individuals, demurred from taking a stance for fear of being targeted by the administration for retribution or losing the professional and financial benefits they enjoyed. And the worst blow, more so than the scandal itself, was consulting the one man I respected more than anybody else, a brilliant tenured scholar who chairs his own department at a research university in North America, and receiving this one-liner:

"My advice would be to leave it alone."

It was simultaneously flummoxing and devastating to hear a humanist say that when called to think about the real-life implications of our discipline, we should resort to inaction. And soon it enraged me that the same people who decry the dismantling of traditional academe under market pressure and changing attitudes toward higher education could be so indifferent, thereby silently but surely contributing to the collapse of humanists’ already tenuous legitimacy as public intellectuals.

While my kind did nothing of consequence, it was the students — the same students whom I had once dismissed as incapable of intellectual growth — who tried to speak up at the risk of jeopardizing the only educational opportunity they had. They approached the governing boards, the administration, and the faculty to hold an official dialogue. They considered staging a street protest. And finally, they gave up and succumbed to cynicism about higher education and the world, seeing many of their professors do nothing to live by the principles taught in class, and recognizing the humanities as exquisitely crafted words utterly devoid of substance.

As my feelings about my discipline shifted from profound grief to ecstatic revelation to acute disappointment, I was able to recall a sentiment expressed by one of my professors, who himself might not remember it after all these years. Once upon a time we sat sipping espresso on a verdant lawn not far from the main library, and he mused that he never understood why young people no longer seemed to feel outrage at the sight of injustice. He is a product of a generation that once rampaged across campuses and braved oppression by the Man. On first hearing his indictment, I was embarrassed to have failed the moral standard established by the older generation of scholars like him. But now I see that it is not just young people but much of our discipline, both young and old, that at present suffers from moral inertia. With only a few exceptions, the humanists I know do not consider the enactment of virtue to be their primary professional objective, whether because of the more important business of knowledge production or the material exigencies of life. And I can only conclude, with no small amount of sadness, that most humanists are not, nor do they care to be, exemplary human beings.

Maybe I should move on, as did a friend and former academic who believes that the only people we can trust to stand on principle are "holy men, artists, poets, and hobos," because yes, it is true that humanists should not be confused with saints. But the humanities will always appear irrelevant as long as its practitioners refrain from demonstrating a tangible link between what they preach and how they behave. In light of the current academic penchant for blaming others for undoing the humanities, it must be said that humanists as a collective should look at themselves first, and feel shame that there is so much they can say — need to say — about the world, but that they say so little at their own expense.

After a year and a half in Bangladesh, I do not doubt any longer that the humanities matters, but now I know that the discipline’s raison d’être dies at the hands of those humanists who do not deserve their name.

Se-Woong Koo earned his Ph.D. in religious studies from Stanford University in 2011. He currently serves as Rice Family Foundation Visiting Fellow and Lecturer at Yale University.

Essay on how to be a good mentor for academic jobs

Get a Job!

Cheryl E. Ball considers how faculty in Ph.D. programs need to be engaged in helping their grad students launch careers.


UCLA Releases Report on Grievances of Minority Faculty

Many minority faculty members at the University of California at Los Angeles feel that they encounter bias and insensitivity regularly, and that the university is not necessarily committed to resolving their concerns, says a report released by the university last week. The report was prepared by Carlos Moreno, a former justice of the California Supreme Court, who was assisted by lawyers so that minority faculty members could discuss their concerns without fear of hurting their careers. The report says: "We found widespread concern among faculty members that the racial climate at UCLA had deteriorated over time, and that the university’s policies and procedures are inadequate to respond to reports of incidents of bias and discrimination. Our investigation found that the relevant university policies were vague, the remedial procedures difficult to access, and from a practical standpoint, essentially nonexistent."

Gene D. Block, chancellor at UCLA, announced in response to the report the creation of a new position, a full-time discrimination officer, and he pledged further policies to make UCLA welcoming for all professors. "Our campus can and must do a better job of responding to faculty reports of racial and ethnic bias and discrimination and take steps to prevent such incidents from ever occurring," said Block in an e-mail message to the campus. "It is one thing to talk about our commitment to diversity and creating a welcoming campus; it is quite another to live up to those ideals. Rhetoric is no substitute for action. We must set an example for our students. We cannot tolerate bias, in any form, at UCLA. I sincerely regret any occasions in the past in which we have fallen short of our responsibility."


Study analyzes impact of academic 'redshirting' on earning a Ph.D.

Does starting children in kindergarten when they are older increase odds that they will earn a doctorate?

Educause attendees urged to focus on higher education's needs in copyright debates

Higher education leaders urged to be more engaged -- and less reactive -- on making sure the law reflects "fair use" needs of students and faculty members.
