Teaching and Learning

Museum Will Offer M.A. for Science Teachers

The American Museum of Natural History, in New York City, is starting a 15-month master of arts program in teaching to train earth science teachers, The New York Times reported. Tuition will be free and students will receive $30,000 stipends and health insurance.


Report Explores Social Science Role in Medical Education

Behavioral and social sciences play a key role in health issues and need to play a key role in the medical school curriculum, according to a report released Thursday by the Association of American Medical Colleges. The report notes that behaviors and the social determinants of health -- such as smoking, diet, exercise, and socioeconomic status -- account for more than 50 percent of premature disease and death in the United States.

 


Essay: How free speech and offensive art can exist on college campuses

A fairly typical art school event: Students submit two- or three-dimensional artwork to fulfill a class assignment, or for a school- or department-wide exhibition. Perhaps less typical: At a West Coast independent art college some years back, with no advance warning, a student killed a chicken in class as his project. As one might guess, this was not a class in animal slaughtering, and the school did its best not to criminalize the student’s action (“Why make a big deal out of it?” the school’s then-president said), but some faculty members did give the student a talking-to.

Tensions over the nature of classroom content can occur in many disciplines, and professors in science, business or humanities courses likewise get little if any training in how to handle difficult situations. But the problem is arguably greatest of all in the fine arts, where a culture of épater la bourgeoisie – shock the middle class, afflict the comfortable – has existed for 150 or so years. Art is supposed to get people to see the world in new and different ways, but what if that awakening is rude or employs violence or obscenity or blasphemy or something else that may cause offense? It may be assumed that art schools and universities are bastions of free speech and experimentation but, Lord, not in my classroom!

Art instructors tend to plan for the typical, preparing lessons and critiques, but they sometimes get the atypical, because art students occasionally look to shock/provoke/offend/transgress. Few of these instructors receive any pedagogical training, and the little they do learn concerns classroom management, organizing a syllabus, grading students and leading discussions. This essay aims to explore the issues that can arise and to suggest ways that institutions and instructors can be better prepared for what can unfold in art classrooms.

In the thick student and faculty handbook at the Maryland Institute College of Art, for example, there are pages devoted to limiting certain types of art speech – graffiti art on public property is “vandalism,” animals must be treated “in a humane manner when used in/as art work,” no setting off fireworks, displaying or using weapons, possession or use of illegal drugs or alcohol, no exposing others to “blood, urine, feces, chemicals or other hazardous materials” – and the prohibitions were recently expanded to include the more nebulous “works that involve physical/emotional stress (potential or real) to the artist and/or audience.”

“You think you’ve covered all the bases and then someone comes up with something new,” said Ray Allen, the college’s provost. One student’s art project was to attach a commentary on sexual abuse by priests to a nearby church door, which led to another addition to the handbook, prohibiting the placement of “artwork” “on Corpus Christi Church or church property.”

The exposure to “blood, urine, feces…” section somewhat applies to the actions of the photography student who invited several men into the school so that she could take pictures of them at the moment of climax, but that prohibition covers only what Allen called the “ejaculants.” The fact that the student brought strange men onto school property was a separate matter. And where, of course, do you add a provision against a student locking himself in a box filled with snow (when he was finally pried out of the box, the student was unconscious and suffering from hypothermia)?

Walking a thin line between encouraging free artistic expression and respecting what Ron Jones, president of the Memphis College of Art, calls “the rights of others not to be exposed to what they do not accept” is a learned skill, and what one learns may apply only to a particular college, because each may have more or less tolerance of students with a desire to generate outrage. Artists, like so many others, are First Amendment absolutists when it comes to themselves, but they may question their moral footing in the context of a classroom, a college with a diverse student body or a publicly supported university where some state legislator may use a challenging art exhibit as the basis for a campaign to reduce governmental funding.

At times, for example, a student art project may involve, or suggest, violence, such as the performance piece staged at the University of California at Los Angeles in 2005 in which an MFA student pulled out of a bag what appeared to be a handgun, loaded it with a single bullet, spun the cylinder, aimed the pistol at his head and pulled the trigger. (The gun failed to discharge, and the student received a talking-to.) At other times, the issue is sexual content, such as the photograph of a male nude by a Savannah College of Art & Design student, which was removed by college administrators from the school’s Open Studio Exhibition in late 2010 because the image was “unacceptable” for a “family event.”

Politics in art also may make people uncomfortable, such as the pair of portraits of former President George W. Bush and Vice President Dick Cheney framed with actual American flags, exhibited at the student gallery by a graduate student at California’s Laguna College of Art & Design in 2010. “A staff member took offense and complained to the president and development director, who were initially opposed to showing this work,” said Perin Mahler, chair of the college’s Master of Fine Arts program. (After a considerable amount of debate, the portraits were permitted to stay in the show.)

Religion is no easier a subject, as the Brigham Young University alumnus Jon McNaughton found when his painting “One Nation Under God,” depicting Christ holding the U.S. Constitution and standing among the nation’s founding fathers, was removed from an exhibition space at the BYU bookstore in the summer of 2010. (One faculty member complained that McNaughton’s artwork should not be displayed unless an alternative liberal painting was also hung, and one of the university’s vice presidents declared himself “uncomfortable” with the presence of “One Nation Under God.”)

There are no rules of the road to help art instructors and college administrators in this realm. History professors (I hope) would know that it would be reprehensible and illegal if a student in a Revolutionary War class brought in a musket and began firing it, but art faculty seem immobilized by the term “freedom of expression.” Maintaining standards and order is not reactionary ("the critics hated the impressionists, too!") but helps students learn larger lessons of propriety.

Throughout a long career, most college studio art instructors will have students who look to test limits of taste and propriety, and some faculty will have these students sooner than others. Some tricks of the trade of teaching are learned on the job, but instructors need a firm idea from day one not only of how to educate and guide their students as they explore their ideas and materials, “but also to understand that there is a responsibility that goes along with that freedom,” said Kevin Conlon, vice president of academic affairs at the Columbus College of Art and Design. “We have a cultural value at this school that respects tolerance and diversity, and artwork that borders on hate speech requires us as faculty to help students understand the context of what they’re doing.”

He recalled one student who produced a painting that was based on photographs from pornography magazines, “which I knew was going to make many of the women in the class uncomfortable. I sat down with this student and asked him, ‘Why are you doing this? What do you think the effect of these images will be on other people?’ He really hadn’t thought much about it and had nothing to say. I told him, ‘If you can come up with a reason for what you’re doing, we can go forward.’ ”

"Going forward” is a pretty vague concept, but there are ways that potentially offensive student (or faculty or non-faculty, for that matter) artwork may be exhibited in good conscience. There is usually more than one gallery space on campuses in which pieces may be displayed, some more open to the public than others, and school hallways also may be the site of temporary exhibits. Some exhibitions have advisory signs that warn prospective visitors of challenging content, giving them the choice – and making them party to the decision – not to see something. Finally, potentially offensive artwork may be edited out for reasons of space rather than content. Censorship (if we are allowed to use that word) may take place along a continuum.

One would assume that even the most novice instructor understands that cruelty to animals or humans should not be permitted and that use of bodily fluids or hazardous products creates safety issues that need to be checked, but that’s not a given. Schools look to hire young instructors because they are expected to form stronger connections with students, to whom they are closer in age than older, more experienced teachers are. “Very often, younger faculty pride themselves on getting their students excited about an art project, and they lose what you would think would be common sense,” Allen said. That photography student at the Maryland Institute College of Art received approval from her young faculty adviser both for bringing the men onto campus and for their production of semen (that instructor was later reprimanded by the school’s administration on both counts), and the instructor of the student who locked himself in a box of snow and was eventually pulled out unconscious “didn’t have the sense to call 911,” he noted. (Another talking-to.)

What to do about an artwork that is likely to produce strong reactions in those who experience it is a toss-up, which is why schools resort to the less accessible galleries or warning signs. When in doubt, according to William Barrett, executive director of the Association of Independent Colleges of Art and Design, instructors should bring problems and situations to their department chairs in order to receive guidance and support, which may be necessary in the event that a student’s artwork becomes a matter of significant controversy. Talking-tos are O.K., but they tend to take place after the fact, whereas instructors need to be more alert to what might happen and be ready.

Solutions?

Student artwork that may seem in poor taste or just disgusting has often elicited solutions that are even less appetizing. Many school administrators and faculty will try to talk students out of exhibiting works that are likely to engender controversy. However, in the eyes of a 19- or 22-year-old, this meeting is unlikely to be seen as a value-free discussion; students will see it as a directive to make some change or do something different, coming from the people who give them grades and on whom they may rely for recommendations. Faculty may be worried about their own job security, and administrators may be fearful of criticism from trustees or groups in the community or the press, and their part of the conversation is apt to show that stress.

Buffeted by the calls for almost limitless free speech and the potential black eye that negative publicity over controversial artwork may create, schools and universities tend to establish few rules about what is unacceptable, but many are also reluctant to fully support their students and faculty in the event of complaints over the content of the art. Younger faculty especially may worry about promotions and tenure if their classroom work results in controversy that requires administrators to defend artwork that strikes the public as insulting to one group or another or as offensive to common standards of good taste.

Some schools are offering “artist as citizen” courses that examine the role art plays in the larger society. Artists must learn to be responsible to the community and culture they live in – so goes the thinking. Again, something is very troubling about this development: Were the purpose of this type of class to broaden the student’s intellectual outlook, there would be no complaint; however, half of the curriculum for art students already consists of liberal arts courses, which should provide that broadening experience. If schools want to offer an elective on art controversies in history or 20th-century art controversies or even art controversies of the past decade, that would be perfectly valid. But when the course moves from an analysis of art in the social milieu to how artists are to behave and think about their audience and be sensitive to group members of that audience (which I think these courses are really about), then the educational component is left behind and the political correctness element enters in. In fact, it becomes a course in political correctness.

Then, there is the question of whether some groups are more acceptable to attack or parody than others. One art instructor at a state university proudly spoke to me of her efforts defending to school administrators a female student whose artwork included a painting of the Virgin Mary using a crucifix as a sex toy. I asked her if she would have made as strong a case if the student were male and the imagery was arguably misogynistic or neo-Nazi. “Absolutely not!” she said. Political correctness meets comparative victimhood.

Other solutions are a toss-up. Making artwork less accessible by exhibiting it in a less public, harder-to-find space skirts the boundaries of censorship, and parental advisory signs about the content of works at the front of an exhibit remove the surprise element from art.

Publicly supported institutions, such as universities, and particularly those located in rural and traditionally conservative areas of the country, are more likely (though not always) to be the focus of controversy than private and more urban colleges and art schools. Private schools are answerable only to their trustees and immediate community, whereas public institutions additionally may be condemned by citizens’ groups and legislators for spending taxpayer money on blasphemy, homoeroticism, pornography, racism or something else to which they object. Obscenity laws, for which "community standards" establish a legal basis of judgment, have not been applied to schools, and it certainly is not clear who or what the community is: other students, the entire campus, the entire campus plus the surrounding community? For a state-supported institution, the community may be the entire state, plus out-of-state students (and parents), alumni, businesses, foundations and government agencies that provide the operating budget of the school.

Better Solutions?

Students are generally young and generally inexperienced about the world, and it may make sense not to put their work up for display so often. That is not censorship but a choice grounded in pedagogical theory: Student work should be seen as part of their artistic development, a process and not a product to be exhibited and defended. The effort to get students to rework their pieces and rethink their ideas would be less fraught with anxiety if exhibitions were not part of the issue.

The student’s world is often a circumscribed, cloistered one, existing almost completely within the confines of the school, and the intellectual parameters are defined by teachers and fellow students. The work that is created tends to reflect the culture of the school, because students have a very limited sense of what actually is exhibited and sold in art galleries. It is a good thing for students to be “out of the market” and in an educational setting where they may develop artistic skills, ideas and a sense of process, but they should not become out of touch with how the real world works. Were art students more out in the world – directed there through internships, externships, mentoring relationships with full-time artists, employment in the art world and visits to galleries and museums – they might quickly recognize that simply being provocative carries no weight in the arena that they look to enter.

Daniel Grant is the author of several books published by Allworth Press, including The Business of Being an Artist and The Fine Artist's Career Guide. He has taught at Lyme Academy College of Fine Arts and has a blog on the arts page of the Huffington Post.
 
 

Essay urges reforms for doctoral education in humanities

Not all doctorate recipients will become faculty members, but all future faculty will come out of graduate programs. Do these programs serve the needs of graduate students well?

In light of the level of educational debt carried by humanities doctoral recipients, twice that of their peers in the sciences or engineering; in light of the lengthy time to degree in the humanities, reaching more than nine years; and in light of the dearth of opportunities on the job market, the system needs to be changed significantly. I want to begin to sketch out an agenda for reform.

The major problem on all of our minds is the job market, the lack of sufficient tenure-track openings for recent doctorate recipients. One response I have heard is the call to reduce the flow of new applicants for jobs by limiting access to advanced study in the humanities. If we prevent some students from pursuing graduate study — so the argument goes — we will protect the job market for others. I disagree.  

Let us not lose sight of the fact that the number of new Ph.D.s has already declined significantly, down about 10 percent from a recent peak in the 1990s. Because that drop hardly matches the 32 percent decline in job listings since 2007-08, the problem is not too many scholars: it is too few tenure-track positions. I fear that any call to reduce doctoral programs will end up limiting accessibility and diversity, while playing into the hands of budget-cutters. U.S. education needs more teaching in our fields, not less, and therefore more teaching positions, the real shovel-ready jobs.

Instead of asking that you lock your doors behind the last class of admitted students, I appeal to those of you involved in the structure of doctoral programs to consider how to keep them open by making them more affordable and therefore more accessible. Can we redesign graduate student learning in the face of our changed circumstances?

Reform has to go to the core structures of our programs. Let me share two pertinent experiences at Stanford.

Thanks to a seed grant from the Teagle Foundation, I was able to experiment with a program for collaborative faculty-graduate student teaching. In our umbrella grouping of the language departments, we set up small teams — one faculty member and two graduate students from each language — to develop and deliver undergraduate courses, against the backdrop of a common reading group on current scholarship on student learning and other issues in higher education. The graduate students developed their profiles as teachers of undergraduate liberal arts. Teaching experience is only going to grow more significant as a criterion in hiring, and we should, in our departments, explore how to transform our programs to prepare students better as future humanities teachers of undergraduates. I encourage all departments to experiment with new modalities of collaborative graduate student-faculty teaching arrangements that are precisely not traditional TA arrangements.

Support from Stanford's Center for Teaching and Learning has led to an ad hoc project on "Assessing Graduate Education," a twice-a-quarter discussion group to which all faculty and graduate students have been invited. German studies graduate student Stacy Hartman organized an excellent survey of best practices, which has become the center of a vigorous discussion. My point now is not to dwell on the particular issues — teaching opportunities, examination sequencing, quality of advising, professionalization opportunities, etc. — but to showcase the potential in every department of a structured public discussion forum on the character of doctoral training. I advise all doctoral programs to initiate similar discussions, not limited to members of departmental standing committees but open to all faculty and graduate students. What works in our programs; what could be better?  

At nine years (according to the Survey of Earned Doctorates), time to degree in our fields is excessive. We should try to cut that in half. I call on all departments with doctoral programs to scrutinize the hurdles in the prescribed trajectories: are there unnecessary impediments to student progress? Is the sequencing of examinations still useful for students?

Accelerating progress to completion will, moreover, depend on better curriculum planning and course articulation, as former MLA President Gerald Graff emphasized in his convention address three years ago. We should plan course offerings with reference to student learning needs. Curricular and extracurricular professionalization opportunities could take into account the multiple career tracks that doctorate recipients in fact pursue — this means the real diversity of hiring institutions, the working conditions of faculty at different kinds of institutions, non-teaching careers in the academy as well as non-academic positions. Can we prepare students better for all of these outcomes? Finally, we have to reinvent the conclusion of doctoral study. As last year's president, Sidonie Smith, reminds us, the dissertation, as a proto-book, need not remain the exclusive model for the capstone project. This piece is crucial to the reform agenda.
 

Russell A. Berman is professor of comparative literature and German studies at Stanford University. This essay is an excerpt from his presidential address at the 2012 meeting of the Modern Language Association.

Essay on new approach to defend the value of the humanities

"When the going gets tough, the tough take accounting." With those succinct words in a June 2010 op ed, New York Times columnist David Brooks summed up the conventional wisdom on the current crisis of the humanities. In an age when a higher education is increasingly about moving quickly through a curriculum streamlined to prepare students for a job, the humanities have no practical utility. As Brooks observes, "when the job market worsens, many students figure they can’t indulge in an English or a history major," a fact that explains why the "humanities now play bit roles when prospective students take their college tours. The labs are more glamorous than the libraries."

Pushed into a corner by these dismaying developments, defenders of the humanities -- both traditionalists and revisionists -- have lately been pushing back. Traditionalists argue that emphasizing professional skills would betray the humanities' responsibility to honor the great monuments of culture for their own sake. Revisionists, on the other hand, argue that emphasizing the practical skills of analysis and communication that the humanities develop would represent a sellout, making the humanities complicit with dominant social values and ideologies. But though these rival factions agree on little else, both end up concluding that the humanities should resist our culture's increasing fixation on a practical, utilitarian education. Both complain that the purpose of higher education has been reduced to credentialing students for the marketplace.

Martha Nussbaum, for example, while stressing that the humanities foster critical thinking and the ability to sympathetically imagine the predicament of others, insists such skills are, as the title of her 2010 book puts it, "not for profit." In doing so she draws a stark line between the worlds of the humanities and the 21st-century workplace. Likewise, Geoffrey Galt Harpham, in The Humanities and the Dream of America, laments the increasing focus on professional skills in the humanities at the expense of reading great books. Stanley Fish takes an even more extreme position, insisting that the humanities "don’t do anything, if by 'do' is meant bring about effects in the world. And if they don’t bring about effects in the world they cannot be justified except in relation to the pleasure they give to those who enjoy them. To the question 'of what use are the humanities?', the only honest answer is none whatsoever." Worse still, Frank Donoghue, in The Last Professors: The Corporate University and the Fate of the Humanities, argues that the humanities will simply disappear in the new corporate, vocation-centered university.

Ironically, these pessimistic assessments are appearing at the very moment when many employers outside academe are recognizing the practical value of humanities training. Fish simply dismisses the argument that "the humanities contribute to the economic health of the state — by producing more well-rounded workers or attracting corporations or delivering some other attenuated benefit — because nobody really buys that argument." But this would come as news to the many heads of philanthropic foundations and nonprofits, and the corporate CEOs, who have lately been extolling the professional value of workplace skills grounded in the humanities.

We would be the last to argue that traditional ways of valuing the humanities are not important, that studying philosophy, literature, and the fine arts does not have a value in and of itself apart from the skills it teaches. We also recognize that the interests of the corporate world and the marketplace often clash with the values of the humanities. What is needed for the humanities in our view is neither an uncritical surrender to the market nor a disdainful refusal to be sullied by it, but what we might call a critical vocationalism, an attitude that is receptive to taking advantage of opportunities in the private and public sectors that enable humanities graduates to apply their training in meaningful and satisfying ways. We believe such opportunities do exist.

To be sure, such optimism must be tempered in today’s bleak economy, where hardly any form of education is a sure ticket to a job and where many in the private sector may still look with indifference or even disdain on a humanities degree.  But as David Brooks himself went on to point out in his op-ed: "Studying the humanities improves your ability to read and write. No matter what you do in life, you will have a huge advantage if you can read a paragraph and discern its meaning (a rarer talent than you might suppose). You will have enormous power if you are the person in the office who can write a clear and concise memo."

Brooks’ view is echoed by Edward B. Rust Jr., chairman and CEO of State Farm Insurance Companies, who observes that "at State Farm, our employment exam does not test applicants on their knowledge of finance or the insurance business, but it does require them to demonstrate critical thinking skills" and "the ability to read for information, to communicate and write effectively, and to have an understanding of global integration." And then there is Google, which more than any other company has sung the praises of humanities students and intends to recruit many of them. "We are going through a period of unbelievable growth," reports Google’s Marissa Mayer, "and will be hiring about 6,000 people this year — and probably 4,000-5,000 from the humanities or liberal arts."

This evidence of the professional utility of humanities skills belies Donoghue’s apparent assumption (in The Last Professors) that "the corporate world’s hostility" toward humanistic education remains as intense today as it was a century ago, when industrialists like Andrew Carnegie dismissed such an education as "literally, worthless." Donoghue ignores changes in the global economy, the culture, and the humanities themselves since Carnegie’s day that have given many corporate leaders a dramatically more favorable view of the humanities’ usefulness. Associate Dean Scott Sprenger of Brigham Young University, who oversees a humanities program we will discuss in a moment, quotes the dean of the Rotman School of Management in Toronto, who observes a "tectonic shift for business school leaders," who are now aware that "learning to think critically — how to imaginatively frame questions and consider multiple perspectives — has historically been associated with a liberal arts education, not a business school curriculum."

All of these commentators are right, and the skills they call attention to only begin to identify the range of useful professional competencies with which a humanities education equips 21st-century students. In addition to learning to read carefully and to write concisely, humanities students are trained in fields like rhetoric and composition, literary criticism and critical theory, philosophy, history, and theology to analyze and make arguments in imaginative ways, to confront ambiguity, and to reflect skeptically about received truths, skills that are increasingly sought after in upper-management positions in today’s information-based economy. Even more important for operating as global citizens in a transnational marketplace, studying the literary, philosophical, historical, and theological texts of diverse cultures teaches humanities students to put themselves in the shoes of people who see and experience the world in ways very different from their own accustomed perspectives. Are some corporations still looking for employees who will be well-behaved, compliantly bureaucratized cogs in the wheel? Of course they are. But increasingly, many others are looking for employees who are willing to think outside the box and challenge orthodoxy.

It is true that humanities study, unlike technical training in, say, carpentry or bookkeeping, prepares students not for any specific occupation, but for an unpredictable variety of occupations. But as many before us have rightly pointed out, in an unpredictable marketplace this kind of versatility is actually an advantage. As Associate Dean Sprenger notes, "the usefulness of the humanities" paradoxically "derives precisely from their detachment from any immediate or particular utility. Experts tell us that the industry-specific knowledge of a typical vocational education is exhausted within a few years," if not "by the time students enter the workforce." It is no accident, he observes, "that a large percentage of people running Fortune 500 companies (one study says up to 40 percent) are liberal arts graduates; they advance more rapidly into mid- and senior-level management positions, and their earning power tends to rise more significantly than people with only technical training."

If there is a crisis in the humanities, then, it stems less from their inherent lack of practical utility than from our humanistic disdain for such utility, which too often prevents us from taking advantage of the vocational opportunities presented to us. This lofty disdain for the market has thwarted the success of the few programs that have recognized that humanities graduates have much to offer the worlds of business, technology, arts agencies, and philanthropic foundations.

The most promising of these was a program in the 1990s developed by the Woodrow Wilson National Fellowship Foundation under the leadership of its then-director, Robert Weisbuch. First called "Unleashing the Humanities" and later "The Humanities at Work," the program, according to Weisbuch in an e-mail correspondence with the authors, "had persuaded 40 nonprofits and for-profits to reserve meaningful positions for Ph.D. graduates in the humanities and had placed a large number in well-paying and interesting positions — at places ranging from Verizon to AT Kearney to The Wall Street Journal to the National Parks Service." Unfortunately, Weisbuch reports, only a few humanities graduate programs "enlisted their alumni and the influential corporations and others in their areas of influence to revolutionize the possibilities for employment of humanities doctoral graduates," while most faculty members "continued to expect their graduate students to look for jobs much like their own and to consider any other outcome a failure."

Today, however, some humanities programs that emphasize useful professional applications are prospering. One of these is a new undergraduate major at Brigham Young University called "Humanities +," with the "+" referring to the value-added vocational component gained by students who elect the program. According to an e-mail to the authors from Associate Dean Sprenger, BYU hired a career services specialist tasked with "revolutionizing our humanities advising office along the lines of the Humanities + vision, and the program has developed ties with the university’s colleges of business and management" — a virtually unheard-of step for a humanities program. The program’s students are urged "to minor in a practical subject, professionalize their language skills, and internationalize their profile by doing an overseas internship." The objective, Sprenger says, "is that career thinking and strategizing become second nature to students," while faculty members "see it as in their interest to help students find 'alternative' careers, and are reassured that they can rely on our advising office to be informed and to do the training."

Another notable program that sees its mission as "bringing humanities into the world" beyond academe and that works closely with its university’s office of career placement is the Master of Arts Program in the Humanities (MAPH) at the University of Chicago, which Gerald helped design and direct in the 1990s. According to a recent article by program associate A.J. Aronstein in Tableau, a University of Chicago house journal, one recent MAPH graduate got a job as finance director in Florida for Barack Obama’s 2008 campaign, later served as chief of staff at the International Trade Association, and now works as a political consultant in Washington. Other MAPH graduates have gone on to internships and subsequent positions as museum curators, technical writers, journalists and other media workers, marketing specialists, and policy analysts with investment firms.

The false assumption in both anti-utilitarian defenses of the humanities and pessimistic predictions of their extinction is that we have to choose between a credentialing and a humanizing view of higher education, between vocational utility and high-minded study as an end in itself. This either/or way of thinking about the humanities — either they exist solely for their own sake or they have no justification at all — is a trap that leaves humanists unable to argue for the value of their work in terms of the practical skills it teaches, an argument that inevitably has to be made in the changing marketplace of higher education. In fact, we would argue there is no defense of the humanities that is not ultimately based on the useful skills it teaches.

Evidence is plentiful that stressing the range of expertise humanities graduates have makes intellectual and economic sense. Take, for example, Damon Horowitz, director of engineering at Google. He insisted recently in an article in The Chronicle of Higher Education entitled "From Technologist to Philosopher: Why You Should Quit Your Technology Job and Get a Ph.D. in the Humanities," that "if you are worried about your career ... getting a humanities Ph.D. is not only not a danger to your employability, it is quite the opposite. I believe there is no surer path to leaping dramatically forward in your career than to earn a Ph.D. in the humanities." "You go into the humanities to pursue your intellectual passion," he explains, "and it just so happens, as a byproduct, that you emerge as a desired commodity for industry."

Horowitz, a leading figure in artificial intelligence and the head of a number of tech startups, ought to know. He took a break from his lucrative career to enroll in Stanford’s Ph.D. program in philosophy because he figured out that in order to do his job in technology well he needed to immerse himself in the humanities. "I realized that, while I had set out in AI to build a better thinker, all I had really done was to create a bunch of clever toys." Horowitz came to realize that the questions he was "asking were philosophical questions — about the nature of thought, the structure of language, the grounds of meaning." Returning to the humanities, Horowitz took time out from the world of artificial intelligence to study "radically different approaches to exploring thought and language," such as philosophy, rhetoric, hermeneutics and literary theory. As he studied intelligence from these perspectives he "realized just how limited my technologist view of thought and language was. I learned how the quantifiable, individualistic, ahistorical — that is, computational — view I had of cognition failed to account for whole expanses of cognitive experience (including, say, most of Shakespeare)."

The concrete value of the humanities education Horowitz celebrates is especially well epitomized in the new field of the digital humanities. The emergence of this field calls attention to how old 20th-century divisions between science and the humanities are breaking down, and it gives those of us committed to defending the practical value of the humanities a tremendous opportunity. The digital humanities represent the cutting-edge intersection of the humanities and computer science, a merging of skills and points of view from two formerly very different fields that is leading to a host of exciting innovations – and to opportunities for students who want to enter fields related to everything from writing computer programs to text encoding and text editing, electronic publishing, interface design, and archive construction. Students in the digital humanities are trained to deal with concrete issues related to intellectual property and privacy, and with questions related to public access and methods of text preservation.

Graduates of the digital humanities programs that are now developing all over the country will be first in line for such positions. For example, Paul’s university now has a Digital Humanities M.A. with two converging tracks, one designed for students with a background in computer science and one for students with a background in the humanities. The program website notes that it offers "theoretical, critical, social, and ethical contexts for thinking about the making of new knowledge through digital humanities research and applications, from issues of intellectual property and privacy, to questions of public access and methods of preservation." When we are asked about the practical value of a humanities education, we need to add the digital humanities to the list.

We believe it is time to stop the ritualized lamentation over the crisis in the humanities and get on with the task of making them relevant in the 21st century.  Such lamentation only reveals the inability of many humanists to break free of a 19th-century vision of education that sees the humanities as an escape from the world of business and science. As Cathy Davidson has forcefully argued in her new book, Now You See It, this outmoded way of thinking about the humanities as a realm of high-minded cultivation and pleasure in which students contemplate the meaning of life is a relic of the industrial revolution with its crude dualism of lofty spiritual art vs. mechanized smoking factories, a way of thinking that will serve students poorly in meeting the challenges of the 21st century.

Though we have argued in defense of the practical and vocational utility of a humanities education, our argument should in no way be construed as undercutting the aspirations of those in the humanities who seek an academic career. Indeed, on this score we need to redouble our efforts to increase public and private funding for higher education and to support unionizing efforts by faculty members and adjuncts. But even as we fight these battles to expand the academic job market we would be foolish to turn our backs on alternative forms of employment for humanities graduates when they are out there. In this spirit, we applaud both Modern Language Association President Russell Berman and American Historical Association President Anthony Grafton, who, along with the executive director of the AHA, James Grossman, have recently urged their organizations to acknowledge that advanced training in the humanities can lead to a variety of careers beyond academia and have suggested how graduate programs can be adapted for these kinds of careers.

For ultimately, to take advantage of the vocational potential of humanities study as we propose is not to sell out to the corporate world, but to bring the critical perspective of the humanities into that world. It is a perspective that is sorely needed, especially in corporate and financial sectors that have lately been notoriously challenged in the ethics department, to say the least. Humanities graduates are trained to consider the ethical dimensions of experience, linking the humanities with the sciences as well as with business and looking at both these realms from diverse perspectives. To those who worry that what we urge would blunt the humanities' critical power, we would reply that it would actually figure to increase that power, for power after all is the ability to act in the world.

Paul Jay is professor of English at Loyola University Chicago and the author, most recently, of Global Matters: The Transnational Turn in Literary Studies. Gerald Graff is professor of English and education at the University of Illinois at Chicago and a past president of the Modern Language Association.

 

Canadian Universities Explore 'Block Plans'

The "block plan," in which students take one course at a time for a few weeks, rather than four or five courses over a semester, is attracting interest in Canada, The Globe and Mail reported. Quest University has adopted the system and three institutions -- Acadia University,  Algoma University and the University of Northern British Columbia -- have started to explore the use of block schedules. Among the small number of institutions in the United States that use the system are Colorado, Cornell (Iowa) and Tusculum Colleges.


Montgomery College follows remedial math revolution

A new Montgomery College math lab takes the increasingly popular "emporium" approach to remedial math, in which professors change their role to boost student success.

Essay on whether writing instructors need to assess themselves

Are we holding ourselves to the same rigorous standards we apply to our students? Are we practicing enough of what we preach?

The recent Framework for Success in Postsecondary Writing, developed by the Council of Writing Program Administrators, the National Council of Teachers of English, and the National Writing Project, posits eight "habits of mind and experiences that are critical for college success": curiosity, openness, engagement, creativity, persistence, responsibility, flexibility, and metacognition.

In a recent exchange on the WPA listserv (subject heading “Measuring the Habits of Mind”) several scholars in writing studies have debated the slippery question of whether these habits of mind can or should be measured or assessed. Most respondents replied with horror at the idea of such motivational terms being put under the scrutiny and micropolicing of assessment. In a passionate reply, one respondent wrote, "If we're going to assess anything, maybe we should start by looking at the conditions in which students are supposed to learn. A student can bring all the curiosity and creativity in the world into a classroom, but it won't help much if what she encounters there is an uninspired, poorly designed course taught by an ill-informed, unreflective dolt who dislikes students as much as the job of teaching (or just spends every hour lecturing 'facts' to students in the manner of Gradgrind)."

In reply to this and other posts, another respondent brought up the fact that a bibliography of selected research accompanies the framework. This teacher-scholar suggests both the importance of and the difficulty inherent in trying to assess (let alone "measure") sociological and psychological habits of mind: "I am sure that it is an odd and willful gesture of our profession, which deals with human beings, to toss so radically out a century of effort by psychologists, psychiatrists, and psychometricians to measure habits of mind — an effort still going strong, though not without plenty of caution, doubt, and resistance within those professions. Indeed, both yearning and qualms have attended the measurement of habits of mind in psychology from the beginning."

While both respondents argue important points to consider in relation to student performances in learning to write and writing-to-learn, the first one above also suggests an important consideration for writing teachers in relation to the eight habits of mind: these habits should apply just as much to instructors as they do to students. If we ask students to exercise curiosity, then it is only fair to ask: Are we curious as instructors? How do we express that curiosity? The same goes for openness, engagement, creativity, and all the other terms. It would make little sense, one might argue, to preach to students that they should be exercising (or showcasing or practicing or honing) their engagement and creativity if they are subjected to a teacher who drones out boring and uninspiring lesson plans in the classroom.

Unfortunately, I do not have any magical answers to this dilemma. And I certainly do not have the type of psychometric knowledge our more social-scientifically minded colleagues possess. But I do feel the issue is a crucial one for us to consider. The best I can offer fellow teachers of writing is, let’s continue to practice metacognition in our theory and practice. Continue to read books like John Bean’s second edition of Engaging Ideas for tips, pointers, and expert guidance in ways to design inspiring and motivational writing curriculum. Continue to reflect on what students say about us in our course evaluations, and act on revising our teaching performances (and the habits of mind and action that undergird those self-reflections) if we don’t always like what they say. Perhaps readers of this article can offer further suggestions.

There’s a line from one of my favorite films, "Blade Runner," that applies to this situation. Deckard (Harrison Ford) gives a test to Rachael (Sean Young) to see (assess, measure) if she is a replicant (android) or a human being. Later, while visiting Deckard at his home, Rachael asks, “You know that Voight-Kampff test of yours? Did you ever take that test yourself?” Deckard does not reply.

I believe Rachael asks a crucial question that we as teachers should be asking ourselves at least every so often (if not every day). When students ask us the same question, almost always implicitly, I hope we can learn how to offer a reply that is as human as possible.
 
 

Steven J. Corbett is assistant professor of English and director of the composition program at Southern Connecticut State University.

 

MIT Expands 'Open' Courses, Adds Completion Certificates

The Massachusetts Institute of Technology -- which pioneered the idea of making course materials free online -- today announced a major expansion of the idea, with the creation of MITx, which will provide for interaction among students, assessment and the awarding of certificates of completion to students who have no connection to MIT.

MIT is also starting a major initiative -- led by Provost L. Rafael Reif -- to study online teaching and learning.

The first course through MITx is expected this spring. While the institute will not charge for the courses, it will charge what it calls "a modest fee" for the assessment that would lead to a credential. The credential will be awarded by MITx and will not constitute MIT credit. The institute also plans to continue MIT OpenCourseWare, the program through which it makes course materials available online.

An FAQ from MIT offers more details on the new program.

While MIT has been widely praised for OpenCourseWare, much of the attention in the last year from the "open" educational movement has shifted to programs like the Khan Academy (through which direct instruction is provided, if not yet assessment) and an initiative at Stanford University that makes courses available -- courses for which some German universities are providing academic credit. The new initiative would appear to provide some of the features that have been lacking in OpenCourseWare: the kind of instruction Khan offers, and the kind of certification some are creating for the Stanford courses.

 

 

University of Denver professor will teach course that offended students

In a case that critics say shows a harassment policy out of control, a U. of Denver professor plans to teach the same course that got him suspended.

