Curriculum

Change Through Debate

A variety of scholars have weighed in on the current debate about American political civility, noting brutal fights on the floor of Congress in the 19th century, nasty mud-slinging of U.S. presidential campaigns throughout history, and other less than impressive aspects of our cultural past. And of course, they are correct that incivility is nothing new. What makes incivility seem omnipresent is the communication environment of our day: the pressure on our 24/7 journalists to fill airtime, new venues for citizens to state their opinions -- thoughtful or lunatic -- online, and a culture that encourages unabashed self-expression.

Who thought we would see the day when CNN news anchors would read incoming “Tweets” from viewers to us in serial fashion, opening an international information channel to faceless, opinionated people with no qualification for broadcasting except time on their hands?

It was difficult not to be appalled by the excesses of campaign rally crowds during the 2008 presidential election, the displays at some health care town hall meetings this past summer, and Congressman Joe Wilson’s outburst ("You lie!"). Students of American political history can easily put these events in context, because incivility has manifested itself in different ways in different eras. But that scholarly response seems a very unsatisfying reaction to the ill-mannered eruptions, name-calling, and sheer meanness that we now find regularly on television and our favorite internet sites. The incivility is still worrisome, even if historically predictable, and we look for a way to cope with it.

The scholarly literature on trends in civility is mixed in its conclusions, with some arguing for a bumpy or near-linear increase of incivility in both the United States and Western Europe, others arguing that we are actually more polite in public now than ever, and still others – myself included – positing that civility and incivility are both timeless strategic rhetorical weapons. Some people are better than others at using these tools to achieve their goals, but a macro-historical argument about collective civility is probably a stretch and difficult to demonstrate empirically, to say the least.

The “incivility as strategy” approach fits our current circumstances, particularly the health care reform debate, fairly well. The political right now draws on Saul Alinsky’s mid-century tactics on behalf of the poor in Chicago for instruction on town meeting behavior, and the political left tries to come up with brutally effective broadcast advertisements, guided by the insurance industry’s “Harry and Louise” spots that undermined the Clintons in the 1990s. Civility and incivility are weapons, as are facts, logic, demonstrating, teaching, striking, and all the other means of persuasion one finds in the arsenal of public expression.

But perhaps the essential issue is that incivility is just more interesting than is measured, calm discussion. Incivility is intriguing, almost always. It can be downright exciting, as when blows are exchanged at a town meeting, and replayed like a train wreck on YouTube by millions of viewers. And who is not fascinated by citizens (apparently on the same side of the issues) marching with pictures of the president portrayed as both Lenin and Hitler? It is bizarre, and hard to look away from.

As President Obama put it on a recent broadcast of 60 Minutes: "I will also say that in the era of 24-hour cable news cycles that the loudest, shrillest voices get the most attention. And so, one of the things I'm trying to figure out is, how can we make sure that civility is interesting. And, you know, hopefully, I will be a good model for the fact that, you know, you don't have to yell and holler to make your point, and to be passionate about your position."

Obama might, over the longer term, fight incivility in part by maintaining his own preternatural calm throughout incessant appearances on television. But my sense is that exciting nasty discourse needs to be matched by something that gets the blood boiling just as well, or incivility will indeed triumph in any given situation.

Soaring rhetoric from President Reagan in the past, Obama today, and others with their talents in the future may be passionate, but as rhetoric soars, it does not always argue. Great oratory gets steamed up when it expresses hopes and beliefs (e.g., Americans cannot always support other citizens financially, or health care is an inalienable “right”), not when it argues for, say, the "public option" or insurance cooperatives. So, the trick is to find mechanisms for public policy discussion that are exciting, passionate, creative, and thoughtful all at the same time.

From the ancient philosophers onward, a variety of academics across disciplines have tackled the questions of rhetoric, persuasion, political debate, and civility, and as a result, we can offer a tremendous amount of theorizing and empirical research on these topics. But that complex material simply will not penetrate or guide contemporary American public discourse any time soon. And pointing to our campuses as models -- underscoring the ways we debate and argue with respect for each other every day (or nearly every day), in classrooms, faculty meetings, symposiums, and beyond – doesn’t go very far either. It’s hard to explain unless you have lived it: Imploring political leaders or fellow citizens to look to universities as exemplars of "cultures of argument" will not work because it is too experiential in nature.

However, colleges and universities do offer practical ideas and tools to American lawmakers, journalists, and interest group leaders that are far more helpful and productive. There is the wonderful work by Gerald Graff and others on teaching argument and conflict, demanding that our students know how to make an argument in class, in papers, and as they go about their lives. As the years pass, these scholars have made a difference, and my bet is that their impact will be even greater as a younger generation of faculty learns how to incorporate argument into teaching, no matter the discipline or class size.

But even more accessible than these pedagogical paradigms and tools is formal debate itself, from policy debate modeled by national championship college and university teams, to Lincoln-Douglas-style debate, and a variety of other formats that have emerged across nations. While I was only a high school debater myself, and I'm now far outside both the high school and collegiate debate “circuits,” it is clear to me that if we can train our students – not only our student leaders and teams – in debate, and make it a stronger presence on campuses, we might build a more constructive public discourse with generational change. Anyone can debate – learn to make an argument, marshal evidence, rebut – with some instruction and practice. And these skills, once gained, can be translated into the sorts of forums our students will eventually find themselves in: workplace meetings, the PTA, community organizations, and in some cases, city halls and legislatures. We do not need to train a generation of lawyers, but we do need to train a generation of students who can simulate what attorneys and great debaters do as a matter of course.

There are many people, organizations and institutions that teach debate either for the classroom or for regional or national competitions, in the United States, abroad, and online. But the basic elements are the same across formats: argument, evidence, forced reciprocity and dialogue, equal time, and mandatory listening. These are precisely the elements missing from much of the contemporary debate about health care reform, and I predict they will be absent as well from the worrisome debates coming next, immigration policy reform in particular. These aspects of communication are the very building blocks for civility, and at this point at least, we have a deficit of them.

Those of us who study political communication used to hope – and perhaps many scholars still do – that the best American journalists would educate the public on the quickly evolving policy issues before us, leading reasoned debate through newspapers and television programs. Some journalists give it an honest try, when they hold jobs that allow it. And we can locate a few lone heroes among the Sunday morning talking heads, if we wade through all the worthless talk of presidential popularity polls, embarrassing gaffes, and who is spinning whom. But with the financial struggles and disappearance of so many news organizations, it is difficult for any journalist – no matter how talented – to get our attention.

They compete, for better or worse, with bloggers and Twitterers, and wise information “gate-keepers” are leaving us with every passing year. It may be up to academic leaders to take on unexpected and much greater responsibility in shaping citizens, not just in our conventional ways of teaching liberal arts or specialized disciplinary knowledge. Of course we shape citizens already, but we must also figure out how to train our students for the rough and tumble they will find after they leave our contemplative campuses. It’s a jungle out there in the world of American political discourse, and our students will need to give it all some logical structure, and simultaneously invent new forms of civility for their generation.

Many colleges and universities teach public speaking at present, and some have made introductory courses mandatory in core curriculums or as part of major requirements in fields like Communications. Why not, similarly, consider formal debate training as a mandatory – or at least greatly encouraged – aspect of a college curriculum? To my mind, it should at least be a consideration for all educators watching our national political debate in the fall of 2009. We can shut off CNN in disgust and sit in awe of some truly horrendous town meetings. But we can help things somewhat by teaching our students both how to argue and why it is exciting to do so. College and university faculty can enhance the long-term health of political communication by focusing on the development of argumentation, in whatever form fits their courses, disciplines, institutions, and communities.

Along these lines, Model United Nations is another excellent tool for teaching students how to argue respectfully and take positions they would not normally take. These programs demand more of students in a course than debate might, but as with teaching debate (in person or online), there is extensive support for instructors available for free on the Web. As with debate, the general structure of Model U.N. can be altered to fit a particular curricular goal or theme. For example, in teaching the Middle East conflicts and issues, the National Council on U.S.-Arab Relations supports a network called "Model Arab League" at both the high school and college levels. And of course, more ambitious faculty can try to fashion entirely new stakeholder-based deliberation programs, using the general rules of more established activities like Model U.N.

Our students will not – no matter how compelling and well-trained – be able to demand that their local school board follow the tight structure and rules of policy debate or of Congress (on a good day). That is an absurdity. But they will have an ideal-typical model for what logical, evidence-based debate should look like, and will inevitably bring some elements of it to whatever table they find themselves at. I have found in so many groups and organizations that people are generally starved for rules about how to conduct their discussions – a rationalized (in Weber’s sense) approach that might bring fairness, civility, and progress. The point is that we need to give students exemplars, somehow, so they can lead others toward structures for talking, listening, and constructive exchange, based on mutual respect and decency. And they might even bring civility to the internet, developing new ways to harness free communication in the service of democratic talk.

The truth is that while Americans pioneered a kind of democracy, we have never been particularly good at debate -- not during Alexis de Tocqueville’s era, and not today. We certainly don’t seem to have the patience for it. There have been some intriguing presidential campaign exchanges here and there, memorable moments in congressional hearings, and of course many moving orators in mainstream politics and outside of it. But we will never see the sort of civil, thoughtful, inventive debate that enables good public policy making until we teach the young adults in our midst how to pursue it themselves.

Author/s: 
Susan Herbst
Author's email: 
info@insidehighered.com

Susan Herbst is chief academic officer for the University System of Georgia and professor of public policy at Georgia Institute of Technology.

Road Too Little Traveled

Early on, as the financial markets spiraled down and unemployment surged, some commentators argued that the national environment would provide the impetus to effect serious change in higher education. After all, they reasoned, campus stakeholders understood the seriousness of the events around them as massive layoffs were occurring, 403(b) funds were being reduced to 203(b)s and it was universally understood that no job on campus was safe, potentially even faculty jobs.

As a variety of troubling conditions became almost simultaneously woven together, it appeared as though a sea change for institutions was inevitable -- a perfect storm for change was developing over higher education. The economic downturn and associated collateral damage created urgency for all stakeholders to come together in a more politically civilized environment to invoke major shifts in how the academy operates as an organization and as a learning community.

However, generally absent from cost containment and revenue sustainability decisions are cost reallocation decisions regarding the relevance and viability of the academic portfolio. The extent to which institutions explore the financial performance, market demand and mission impact of academic offerings (e.g., programs, concentrations, courses, sections) across the portfolio is largely unknown. It is unclear whether institutions have a structured process and access to the data and reporting mechanisms needed to inform program review, and, subsequently, whether they have the capacity to make decisions to retire or eliminate programs.

Given the significant resources allocated to academic programs, the time many programs have been in existence, and the changing marketplace and challenging economic conditions, a rigorous, objective review is a reasonable and necessary part of an institution’s due diligence. However, these decisions may be the most challenging of all.

Even in the face of unprecedented financial challenge, are the traditions, political forces, mission arguments and ideological posturing within the academy trumping the ability to restructure the academic portfolio, and the decision making and resource allocation structures that currently exist? Or, alternatively, is the eye of the storm of such magnitude that this level of macro change will be deferred until stimulus funding evaporates and there is a public moratorium on tuition and fee increases?

Perhaps for some regions, major restructuring will occur only when the reality of large declines in the high school pipeline make their way into annual operating budgets, and community colleges begin cannibalizing enrollments from neighboring four-year institutions.

A Case Illustration

Consider a view of the national academic program portfolio. In 2007, higher education produced 2,189,315 degrees in total across 1,079 fields of study. The distribution of degree conferrals across fields of study varies greatly, ranging from 0 to 218,212. Despite the volume of degrees conferred annually, focused on an extensive variety of fields of study, it is a reasonable assumption that not all of these programs possess either the recent historic evidence or market opportunity to support their continuation.

For illustration purposes, review the set of program viability metrics below. These are real data points from an academic program currently offered by an accredited institution. Enrollments have shown no growth in recent years, degrees conferred nationally in the field have declined by 20.5 percent, projected employment of graduates in this field within the State is relatively static through 2014, and the regional competitive landscape is saturated with similar programs, as seen in the table below:

Program Landscape Determination Analysis

Has enrollment for this specific program grown at the institution? No. Enrollment for the program has witnessed zero growth from 2004 to 2007, with 17 degrees conferred during each of those years.

Nationally, have conferrals in this or similar degrees grown? No. From 2002 to 2007, bachelor’s degrees conferred nationally in this field declined from 468 to 372, a 20.5 percent decrease.

Regionally, are relevant occupations for graduates of this degree expected to increase? No. Employment of graduates in this State is low and growth is expected to remain static; specifically, employment is expected to increase only minimally, from 99 positions in 2004 to 122 in 2014.

Nationally, are relevant occupations for graduates of this degree expected to increase? No. Employment prospects for this field will remain relatively static, at a 3.7 percent growth rate from 2006 to 2016 (or 1,000 jobs dispersed nationally), with no expected annual average job openings due to growth and net replacements.

Is there a strong market opportunity for this degree program? No. There are 12 regional competitors offering a similar bachelor’s degree.

Institutional leaders can use this type of analysis to make difficult, but evidence-based, decisions. There are, of course, other variables that should be considered in this context. For example, is the program directly aligned with the institution’s mission and strategic plan, and/or does it support the goals of a liberal arts education? Ultimately, though, any decision to maintain the program should be based on a review of a more comprehensive set of program metrics, including projected market demand.
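To make the mechanics of such a screen concrete, here is a minimal sketch of how the five questions above might be encoded for a first-pass scan of a program portfolio. Everything in it is illustrative: the metric names, the thresholds, and the decision rule are assumptions made for the example, not any institution's actual rubric.

```python
# Hypothetical program-viability screen based on the five questions in the
# table above. All field names and thresholds are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class ProgramMetrics:
    enrollment_growth_pct: float          # institutional growth over the review window
    national_conferral_growth_pct: float  # change in degrees conferred nationally
    new_regional_positions: int           # projected additional regional jobs
    national_job_growth_pct: float        # projected national employment growth
    regional_competitors: int             # similar programs offered in the region

def viability_flags(m: ProgramMetrics) -> dict:
    """Answer each screening question with a yes/no flag."""
    return {
        "institutional enrollment growing": m.enrollment_growth_pct > 0,
        "national conferrals growing": m.national_conferral_growth_pct > 0,
        "regional occupations expanding": m.new_regional_positions >= 100,  # assumed threshold
        "national occupations expanding": m.national_job_growth_pct >= 5.0, # assumed threshold
        "strong market opportunity": m.regional_competitors < 6,            # assumed threshold
    }

# The program described in the table: flat enrollment, a 20.5% national
# decline in conferrals, 23 new regional positions projected over a decade,
# 3.7% national job growth, and 12 regional competitors.
program = ProgramMetrics(
    enrollment_growth_pct=0.0,
    national_conferral_growth_pct=-20.5,
    new_regional_positions=23,
    national_job_growth_pct=3.7,
    regional_competitors=12,
)

flags = viability_flags(program)
for question, answer in flags.items():
    print(f"{question}: {'Yes' if answer else 'No'}")
print("nominate for comprehensive review:", not any(flags.values()))
```

In practice, each input would be drawn from sources such as IPEDS conferral data, state and national employment projections, and a competitor scan, and a screen of this kind would only nominate programs for the fuller mission-and-quality review described above; it would not by itself justify retiring a program.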

Adopting a Portfolio Review Process

An academic portfolio review process differs from the traditional internal review process. The internal review often focuses on such academic program elements as student achievement and learning outcomes, course scheduling, strengths of faculty, course/adviser workload and resource utilization. The review of the academic portfolio is focused on sustainability, market relevance, and viability of programs moving forward.

The results of a regular and systematic academic program viability review can help institutions creatively address a number of key challenges. As institutions identify emerging program growth areas, many have a severely restricted capacity to add new programs -- new programs that make sense in the context of emerging/evolving fields, occupations and sectors such as sustainability, energy and the health sciences. However, absent grant awards and major gifts from donors, these and other necessary new programs will not have access to the significant capital needed to launch and sustain them over time.

Beyond new program development, there are also competing needs for resources to improve student retention and success: advising and mentoring, faculty enrichment, assessment, and focused student support. The academic resource pool should be dynamic and fluid. Programs that may be missed but are no longer necessary or relevant (based on market demand, financial performance, competitive landscape, quality, etc.) should have their resources repurposed for emerging needs or opportunities. The tradition of adding programs without changing the base is simply no longer feasible.

So, to what extent are institutions engaged in a systematic and regular evaluation of their academic program portfolios? Consider the following set of questions as an entry point to such a process:

1. If a program has neither the demand (marginal or declining enrollments) nor the market for its graduates, what other factors or rationale is used to support the program’s continuance?

2. To what extent are academic offerings directly aligned with your institution’s vision, mission and strategic priorities? If a program is not financially viable but is clearly aligned with the mission of the institution, can the institution afford to have that program subsidized by other financially viable programs?

3. What impact does the competitive landscape for a program have on the institution’s capacity to successfully recruit students, retain faculty and sustain resources to make the program viable in the long term?

4. Do the characteristics of the program lend themselves to an alternative delivery mode such as online learning?

5. If analysis suggests that a program is not financially viable, is without a market and is not mission critical, how could its instructional, program and physical space resources be re-tasked to address emerging needs or other mission-specific needs of the institution?

There is no question that this is a challenging area to address. There can be strong arguments to maintain programs even if those programs are not supported by present or future market demand or are merely financially neutral. It may be that they are “untouchable” due to core values and a commitment to a broad-based education. But it seems implausible to think this can be the case for all academic programs.

Creating a program viability assessment culture that objectively organizes the metrics for market demand, financial performance, mission impact and program quality appears to be a necessary part of institutional due diligence, especially in these economic times.

Author/s: 
Tim Mann
Author's email: 
newsroom@insidehighered.com

Tim Mann is director and senior analyst for Eduventures' Academic Leadership Learning Collaborative.

Define 'Frill' and Use it in a Sentence

Your “frill” is not my “frill.” My frill, in fact, is an essential component of the work I do, which is an equally essential aspect of our institution’s mission. Maybe you say the same about yours.

And therein lies the heart of the difficulty in discussing what has recently become a phrase bandied about in the world of higher education. “No-frills education” has been touted by the Pennsylvania State Board of Education, the president of Southern New Hampshire University in recent attention-getting interviews, and pundits commenting on the out-of-control costs of college. If we can just strip the college experience down to its most basic form, the argument goes, we can restore sanity to the price structure and access to those who need it.

But the first challenge comes when we begin to discuss, and decide on, what constitutes a “frill.” Unfortunately, the contentious and fractured nature of higher education, long a hotbed of competing priorities, makes that a difficult conversation.

Shopping for a college education is not like buying a new car, and building an effective institution to provide that education is not like building one. If one of us goes into a car dealership with a plan to buy the most stripped-down vehicle on the lot, and we stick to that plan, we have a pretty good idea of what we will drive away owning: a car without many of the nifty features now available. No GPS, no satellite radio. We will have a smaller engine, which we understand will leave our simple little car a bit underpowered on the highway.

But we know too that we will have a car equipped with the basic safety features required by law -- seatbelts and airbags -- and that it will have the components necessary to drive off the lot: four wheels supporting a frame, powered by an engine.

But what is it about a college education that is truly essential? And how do we arrive at that conclusion? We can start with the curriculum, but if there is an institution out there that has not suffered through lengthy debates about the components of that curriculum, neither of us knows where it is. The only thing constant about the “essential” components of a curriculum has been the regular change each institution imposes on it.

Foreign languages, for example, have been a mainstay of a liberal arts education. But as demand has lessened and resources have dwindled, a number of institutions have reduced or eliminated this requirement. Skill in writing has long been one hallmark of a college education, but at many large research institutions, students can graduate having written fewer than a dozen substantive papers, many of those having been graded and returned with few comments and corrections. Colleges and universities have added, and then removed, requirements for courses addressing diversity, gender issues, global concerns.

What was essential in one decade is seen as frivolous in another. At the furthest extreme is an institution as esteemed as Brown University, which has no required courses among its thousands of offerings.

Is academic support a “frill”? If one agrees that writing is indeed an essential component, then is a writing center that provides intensive tutoring in this skill also an essential component? That’s a fairly easy argument to make. And yet, in a time of budget cuts, we have seen writing centers forced to reduce their hours and staff. At what point does this essential component become so limited that an institution’s mission is threatened?

To return to the car-buying analogy, we know that tastes and needs have an impact on standard equipment in a car, and that over time, we adjust our expectations of that equipment upward. One would be hard-pressed, for example, to find a car without a radio today. It doesn’t mean the radio hasn’t added to the cost of the car, just that we are in agreement that we will accept the cost as part of the price of the car.

But easy acceptance has never been part of academic culture. We can, and do, argue over everything from the lack of vegetarian options in the dining halls to class schedules, from the awarding of tenure to a less-than-stellar instructor to the political correctness of a mascot. Debate is, one could argue, an essential component of our mission (though we have to admit there are days when we wish it were a frill that we might be willing to do away with). The risk for our institutions is not in the content of this debate, but in the oft-reflexive assumptions we bring to the debate, which can then degenerate into a harsh and morale-sapping exchange between groups of colleagues.

“No-frills education” discussions have their common fodder: gleaming recreation centers, posh residence halls with concierge desks, heavily-funded student activities events, athletics and all its attendant costs. These are among the items that proponents of “no-frills” education seek to eliminate. The “no-frills” education offered by Southern New Hampshire University, for example, is a commuter-based approach to garnering credits; many classes are taught by the same faculty who teach at the university’s “heavily frilled” other campus. But are those students getting the same education as their peers down the road? Perhaps they don’t need a recreation center, but is there any doubt that students learn valuable skills from activities outside the classroom?

Over the past 20 years, service learning as a component of the curriculum has become increasingly common as faculty and students alike, supported by data, acknowledge the deep level of learning that takes place when students must put their classroom skills to good use in the community. What about learning to develop a budget for an organization, motivating volunteers, evaluating the success of an effort? And practically speaking, how does a no-frills education impact a student’s relationship with the institution? Will these students be loyal alums 10 or 20 years after graduation?

It’s equally critical that we remember that very few frills are either/or propositions. Most exist on a continuum of cost and usefulness. Perhaps a climbing wall (a “frill” often cited as an example of an unnecessary expenditure) isn’t a good use of campus dollars. But is a fitness center with basic cardio equipment that gives students, as well as faculty and staff, a convenient way to relieve stress and stay healthy in that same category? Similarly, a residence hall with a spectacular view of Boston’s skyline, such as the luxury accommodations recently opened by Boston University, can hardly be discussed in the same conversation as the standard double-room, shared-bath residence halls still operating on most campuses.

These debates about “amenities” versus “necessities,” about what our students need versus what they want, rage on, as they should. It is our responsibility as the keepers of our institution’s educational integrity to own these debates and decisions. If we abdicate our responsibility to do this, someone else, like a state legislator or policy maker or a popular magazine that makes a bundle on its “rankings” issue, will step in.

Who should get to decide that a particular outside-the-classroom activity is a frill? Living on campus is a “frill” in the minds of some higher education policy makers, and certainly the community college system in America has shown for a century that students can receive a good education without experiencing dorm life. But who would argue that learning to live with others isn’t a valuable skill? It’s certainly one we hope our neighbors have learned before they move into the townhouse next door.

Is residence life essential? No. Is it a frill? No. Is it somewhere in the middle? Most likely. So who on any given campus is best positioned to determine whether it stays or goes as part of a move toward “no-frills” education?

An athletics program is similarly difficult to gauge. At one of our institutions, a small, professionally focused college, athletics was eliminated without much of a fight, and the college hasn’t missed a step.

At the other of our institutions, a small, selective liberal arts college, a quarter of the students participate in an intercollegiate sport. The budget to support these efforts, while modest compared to larger schools, is not insubstantial at a time when every dollar is scrutinized. There are on this campus, as we’re sure there are on every campus, those who would characterize athletics as a “frill.”

But if we eliminated the entire program, or even a few sports, enrollment would suffer greatly as those student-athletes sought other opportunities to continue their athletic pursuits, and we would have a hard time keeping our doors open for the rest of our students. It’s also worth pointing out that on this campus, as is the case on many small college campuses, our athletes are retained at a higher rate, and receive less financial aid, than the student body in general.

Some of the “no-frill” efforts being proposed are closely aligned with a view of higher education that is more vocational in nature, more targeted at providing students with skills essential to building an effective and pliable work force to rebuild the American, and global, economy. Setting aside the enormous question of whether this should be the true purpose of a college education, we nonetheless need to consider the role of career services in this equation.

Does a “no-frills” institution help its students find jobs after graduation? Perhaps, but how? Does it help students identify possible internships with employers? That would be a good idea. Does it invite recruiters to campus to interview students? That makes sense. Does there need to be an employee whose responsibility it is to arrange these internships and visits? That is helpful. Should someone work to prepare these students for these interviews? Review their resumes? Help them determine which recruiters might be of interest to them? Offer a workshop on interviewing skills? Those services make sense if the institution is truly committed to helping students move successfully into the workforce. So now perhaps this institution needs a career services office to provide these opportunities, replete with staff, a small resource library, and some career-oriented software installed on office computers.

Frills? Yes, no, and somewhere in between, depending on the vantage point from which you approach the matter.

The point of these examples is not to lead us down a path of endless debate about residence halls, athletics, career services, student activities, or any of the “frills” that proponents of “no-frills” would like to eliminate. It’s to point out that we have, at this point, no agreed-upon framework with which to discuss and define “essential” versus “frill.”

Will these “no-frills” campuses take a pass on academic support services? How about orientation or a campus conduct system? Will faculty at these no-frills institutions be any more comfortable dealing with students in serious academic or emotional distress than our faculty colleagues are now, most of whom appear grateful to have a counseling center (which some might consider a “frill”) to refer these students? Will students with learning and physical disabilities still be able to get the assistance they need, or will anything beyond the bare minimum required by the federal government be considered a “frill” and cast aside along with the climbing wall, spring concert, turf field and whatever else is the frill-of-the-day as portrayed in the media?

We can’t, and won’t, answer yes or no to these, though we each have our opinions. We just want to propose that each institution should own its discussion about these matters. Casting aspersions on the work of others, on the contributions of that work to students and to an institution’s core mission, is not productive. What is productive is an ongoing, civil conversation about those students and that core mission, and an effort to first build a framework for that conversation that educates each of us in the work of one another.

Every institution must have its own conversation, and no two institutions will reach identical conclusions. One institution’s frill is another institution’s essential service: ours to decide, and ours to defend. Leaving the definition of “frill” to others puts us at grave risk of losing control over our very purpose. We must look inward for the anchor points of this conversation. Who are our students, and what do we owe them? What do they need from us (rather than want from us) to ensure they have the best chance of succeeding at whatever it is we have crafted as our institution’s goals? And then we must measure what we offer against those goals, rather than against the college down the road that is awash in apparent frills (which, perhaps, they don’t define that way, and that is, of course, their prerogative).

What each one of us believes is essential may not be what another believes is essential, but we do share, at our best, a deep commitment to this work of educating college students, and we each deserve a voice in the conversation.

Author/s: 
Lee Burdette Williams and Elizabeth A. Beaulieu
Author's email: 
info@insidehighered.com

Lee Burdette Williams is vice president and dean of students at Wheaton College, in Massachusetts, and Elizabeth A. Beaulieu is dean of the core division at Champlain College.

No More Fancy Fonts

It’s difficult to believe now, but not so long ago, I looked forward to making up syllabuses.

Once the grand meal of the course had been structured and I’d chosen an exciting title, the syllabus design was my dessert. I took the word “design” quite literally, having fun with frames and borders, trying out different fonts, fiddling with margins.

Then, after printing out the final document, I’d sit at my kitchen table and add images saved for the purpose from old magazines, vintage catalogs, pulp advertising, obscure books, and other ephemera. Fat cherubs blowing their trumpets would announce Thanksgiving break; a skull and crossbones marked the spot of the final exam. My masterpiece was a course on the work of Edgar Allan Poe, whose syllabus was a gothic folly with a graveyard on the front page and cadaver worms crawling up the margins.

Over time, my syllabuses grew less creative. I still gave my courses what I hoped were enticing titles, and I’d usually add an image to the front page, but nothing more. In part, I was afraid my quirky designs might make the course seem less serious; I also had far less free time than I used to. But mostly, it was the number of disclaimers, caveats and addenda at the end of the syllabus that made my designs seem out of place. All these extra paragraphs made the syllabus seem less personal, and more institutional -- but then, I realized, perhaps it was time I grew up and began to toe the party line.

Those were the good old days. Now, at a different institution, I teach in a low-residency program whose courses are taught, in part, online. The institutional syllabus template is pre-provided: Times New Roman, 12-point font, 1-inch margins -- and don’t forget the “inspirational quote” at the top of the page.

The Course Description is followed by the list of Course Objectives, Learning Outcomes, Curriculum and Reading Assignments, Required Reading, Assessment Criteria and so on, all the way down to the Institute’s Plagiarism Policy and Equal Opportunity Provisions. Colleagues tell me it’s the same almost everywhere now; the syllabus is now composed mainly of long, dry passages of legalese.

I no longer design my own course titles -- or, if I do, they need to be the kind of thing that looks appropriate on a transcript, which means “Comparative Approaches to the Gothic Novel,” not “Monks, Murder and Mayhem!” There’s an extra plague in online teaching, however, in that -- at least, at the institution where I’m currently employed -- all course materials, including weekly presentations, must be submitted months in advance.

This, I’m told, is not only to ensure that books are ordered and copyrights cleared, but also for the various documents to pass along the line of administrative staff whose job includes vetting them in order to be sure no rules have been violated, then uploading them in the appropriate format. Moreover, a syllabus, we are constantly reminded, is a binding legal document; once submitted, it must be followed to the letter. Omissions or inclusions would be legitimate grounds for student complaint.

Gone, then, are the days when I could bring my class an article from that morning’s New York Times. Now, when I stumble on a story, book or film that would fit perfectly with the course I’m currently teaching, I feel depressed, not excited. I can mention it, sure, but I can’t “use” it in the class. Nor can I reorient the course in mid-stream once I get to know the students; I can’t change a core text, for example, if I find they’ve all read it before; I can’t change the materials to meet student interests or help with difficulties, as I once did without a second thought.

This is especially perplexing in online teaching, where it’s so easy to link to a video, film clip, or audio lecture. We have an institution-wide rule that such materials may not be used unless accompanied by a written transcript for the hearing impaired. When I object that there are no hearing impaired students in my small class of six, I am told that no, there are currently no students who have disclosed such an impairment. The transcripts are needed in case any of them should do so -- in which case, they would be immediately entitled to transcripts for all audio-visual material previously used in the course. Sadly, those who pay the price for this assiduous care of phantom students are the six real students in the course.

In brief, what used to be a treat is now an irksome chore.

Instead of designing a syllabus, I’m filling out a template, whose primary reader is not the student, not even the phantom potential-hearing-impaired student, but the administrators and examiners who’ll be scanning it for potential deviations from standard policy.

Sitting at my kitchen table with scissors and glue, I always felt as though the syllabus -- and, by implication, the course -- was something that came from within me, something I had literally produced, at home, with pleasure and joy.

Now, by the time the course is finally “taught” months after the template has been submitted, it feels like a stillbirth from a mechanical mother.

Author/s: 
Mikita Brottman
Author's email: 
doug.lederman@insidehighered.com

Mikita Brottman is chair of the humanities program at Pacifica Graduate Institute.

Third Way in Liberal Education

At a recent gathering of junior faculty, convened by the Teagle Foundation to discuss the future of liberal education, a remarkable fact appeared so clearly that it went unremarked. Discussions about the value and purpose of higher education had lost the acrimonious and partisan tone that defined the culture wars of the '80s and '90s. To be sure, those present (myself included) were no doubt fairly homogeneous in our political and academic backgrounds. And we were a self-selecting group, as all had expressed interest in the value of liberal education, even if we did not agree on (or even know for sure) what exactly it was. It was nonetheless an encouraging sign – prepared, in part, by reasoned criticisms of the academy from the likes of Derek Bok – that liberal education no longer appeared as a minefield of partisanship, but rather as the site of constructive and rational debate.

One reason, I suspect, for this development may be that some of the institutions most committed to liberal education have transformed the way in which it is taught. At Chicago, Harvard, and Stanford, for instance, freshmen are still required to take a version of a "core curriculum." But unlike Columbia’s venerable core, these newer versions all allow students to make their own choices from a selection of classes. At Chicago, students compose a three-course meal from offerings in the humanities, civilization studies, and the arts. Stanford’s “Introduction to the Humanities” (IHUM) program presents students with a slightly leaner diet: they choose from a collection of starters chosen to “demonstrate the [...] productive intellectual tensions generated by different approaches,” before tucking into a two-quarter entrée that “promote[s] depth of study in a single department or discipline.” Finally, Harvard just introduced last fall a "Program in General Education" that is more buffet style: students select courses from eight different groups, roughly half of which satisfy humanities requirements.

While in no way revolutionary, these curricular developments, I argue here, may justly be regarded as harbingers of a third way in liberal education. This new way bypasses the old battleground of the culture wars — the canon — by recognizing the privileged place that certain works and events occupy in past and present societies, without dictating which of these must absolutely pass before every student’s eyes. As opposed to the more common "general education requirements," moreover, the courses in this model also provide students with an intellectual meta-narrative, that is, a synoptic perspective linking different periods, cultures, and even (ideally) disciplines. Finally, this model can offer scholars, administrators and policy makers a new language with which to define the goals and ideals of liberal education, and to help define criteria for their evaluation.

The language currently employed to discuss liberal education has itself proven remarkably apt for avoiding partisan flare-ups. Who can object to a pedagogical program designed to improve thinking, moral reasoning, and civic awareness? Glaringly absent from such skills-oriented definitions is, of course, curricular content. While this strategy of omission has conciliatory advantages, it also carries risks: Discussions about liberal education can end up sounding terribly formalist, as though students were destined to perform ghostly mental operations in a vacuum (“practice citizenship!”). The very idea of liberal education can suffer from such excessive formalism, since, emptied of content, it risks becoming little more than a talking point or sales pitch.

This approach also ignores a penetrating criticism, made with particular (if somewhat hysterical) emphasis by Allan Bloom in The Closing of the American Mind. In the absence of any overarching curricular structure, students can easily end up losing themselves in a labyrinth of unrelated courses. These courses may individually belong to disciplines traditionally associated with liberal education, and may each, in their own way, contribute to the development of important and worthy skills. But they may also leave puzzled students wondering how, say, their knowledge of Russian history relates to their classes on French literature. Of course, there are not always clear bridges between disparate subjects. And finding your way from one point to another can itself be an intrinsic part of education. At the same time, teaching students how to integrate knowledge from different fields is a valuable skill, one which we would be rather perverse to withhold from them, particularly when it is requested.

Beneath the geographical metaphors proliferating in the above paragraph lurks, of course, the familiar fault line of curricular content. But this is precisely where the reforms of core curriculum courses at the universities listed above can provide a less contentious framework for discussion. Indeed, the dominant feature of these courses is that they combine requirement and choice; students are obliged to choose from a selection of courses. This means that a) there is a degree of personal tailoring: for instance, hardcore “techies” at Stanford can take a course on the history of science and technology; and b) the emphasis is shifted from a debate over which exact texts every student should read – inevitably a source of heated disagreement – to a debate over which different sets of texts (or historical events, or works of art, etc.) form a coherent and meaningful syllabus.

The advantages of this system are numerous, but I would like to emphasize two ways in which it offers a valuable framework for liberal education. First, in addition to the benefits gained from studying individual texts or topics, these courses provide students with an overarching narrative. It is not necessarily a teleological or master-narrative, nor need it even be a story of progress with a happy end. But it is a narrative that allows students to perceive how events or ideas transform over a considerable stretch of time and space. The IHUM course that my department offers, for example, takes the students from the Mesopotamia of Gilgamesh to the Caribbean of Maryse Condé’s Crossing the Mangrove. Our syllabus is primarily literary, but the lectures draw heavily on each text’s historical, religious, cultural, and philosophical context. In this way, such narratives also illustrate how frontiers between humanistic disciplines are not closed borders, but can be freely crossed.

Ironically, the narratives transmitted in these classes are ultimately destined to fade away, or at least be significantly transformed, over the course of a student’s education and life. Their purpose is primarily structural: to borrow a hallowed metaphor, they allow students to attach the ideas they will later acquire onto different, yet connected branches of a single tree of learning. But this metaphor is somewhat misleading, since narratives are far less rigid than wooden frames. Subsequent coursework will complicate or contradict episodes of the story students began with; and at the end of their college education, they will ideally have written their own narrative with the knowledge they have gained. But even if the initial story they were told disappears in the process, it will have served its purpose, and taught the students a valuable lesson along the way – namely, that to be persuasive citizens and scholars, we need to know how to tell a compelling narrative. The ability to piece disparate facts and ideas into a coherent whole is a critical part of liberal education. We are always putting Humpty Dumpty together again.

Second, an important criterion for composing the syllabus of these courses is that their contents be sufficiently authoritative. Here we again brush up against the touchy subject of the canon, which cannot be completely avoided, even if the model under discussion does not advocate including specific books at all costs. But the inclusion of “authoritative” works or events – and I choose this word deliberately – does strike me as a necessary part of liberal education. This is not because some works contain The Truth and others only pale reflections of it. This argument of Bloom’s, and of his predecessor at the University of Chicago, Robert Maynard Hutchins, is more likely to puzzle than to offend today (how do you teach Homer as "the truth"?). But as John Guillory pointed out in Cultural Capital, certain texts simply have (or had) greater authority in our societies: not to engage with at least some of them leaves students at a social disadvantage.

I would also argue that understanding these authoritative texts is key for achieving what Montaigne identified as the ultimate goal of education – the ability to challenge existing authorities, an ability we would today call critical thinking. If students are to challenge authorities, they must begin by knowing who those authorities are and what they argued. Only in this fashion can the students acquire both a better understanding of how and why our societies came to be the way they are, and the ability to counter authoritative accounts in a knowledgeable and evidence-based manner.

It is to be hoped that liberal education will always remain a fertile topic of discussion, and the model that the universities discussed here have adopted – with a number of differences, to be sure, which I did not address – is certainly not the only solution. Indeed, I hope that other colleges will experiment with different models, so that our collection of experience continues to grow. But the promise of the current model is that it does offer a way past the opposing camps of the canon wars, and in this regard, may come to be regarded as a third way in liberal education.

Author/s: 
Dan Edelstein
Author's email: 
info@insidehighered.com

Dan Edelstein is assistant professor of French at Stanford University.

A Program Is Not a Plan

One of the main thrusts of what has come to be called "the undergraduate student success movement" is misguided. Yes, we did mean to use the term "misguided." A strong word and a strong assertion, but we have equally strong evidence. Simply stated, higher education institutions in the United States focus heavily on student success programs, but rarely do they have a comprehensive plan to guide those programs. In the absence of a plan, redundancies and gaps occur, and retention stagnates. In short, a program or programs do not a successful plan make.

Of course, making this assertion means that John Gardner, one of this essay’s authors and a key architect in the national student success movement, has to admit that over the years he may not have given the best advice to all people at all times. For about three decades, Gardner has gone around the country telling college educators that their institutions need to adopt or adapt one form of student success program or another. Drawing from his experiences, the recommended program was often a first-year seminar -- a contemporary staple in the American college curriculum that dates back to the 1880s. And, in fact, research does correlate participation in first-year seminars with positive differences in student retention and graduation rates.

At the same time that Gardner was advocating for first-year seminars in particular, he was also advocating for a broader philosophical approach to the first year. He coined the term, “the first-year experience,” and meant it to encompass a total campus approach to the first year, not a single program. Upon reflection, it seems that speaking about one program extensively while at the same time advocating for a collective approach may have fostered a bit of confusion. And today the “first-year experience” can mean anything from a single course to a full-fledged coordinated effort to improve the first year. But it was the single course that gained the most national and international interest.

Gardner himself ran University 101, a first-year seminar at the University of South Carolina, for 25 years, and then helped replicate this course type at many other institutions. Colleges and universities often adopted first-year seminars because they increased retention rates, and thus increased tuition revenue. Educators were hunting for the silver bullet -- the “program” that would bring about miraculous student-saving and money-making results. This search for the ideal program also became subsumed under the language of “best practices.” The idea was very simple: there are best practices out there, they can be identified and replicated with minimal thought given to context, and these best practices should yield the same results everywhere. But retention improvements that resulted from one-shot programs have generally been short-lived and, taken together, have failed to move the national retention statistics in a positive direction.

Fast-forward several decades, and this search has been intensified. A plethora of organizations and consultants now exist to feed the hunger for specific programmatic solutions to the retention problem. Clearly it is time for a change.

Beginning in 2003, with support from several foundations, the Gardner Institute for Excellence in Undergraduate Education launched a process called Foundations of Excellence in the First College Year -- a self-study and planning process designed to help campuses move beyond “programs” and “best practices” to the development of a comprehensive, intentional plan for the first year. Participants in the Foundations of Excellence process are encouraged to answer a fundamental educational question: What does our college or university need to do to provide an excellent beginning experience for all students relative to our unique mission, location, and student characteristics? To answer that question, an institution first needs to assess how it is currently performing vis-à-vis standards of excellence for the first college year. The process provides nine such standards. Finally, once the plan has been created, institutions must implement it.

But implementing a plan is more easily said than done. Our own research on the effectiveness of the institute’s work with 197 institutional participants has found that the two most significant variables that interfere with executing a plan are a change of senior leadership with its resulting destabilizing effects, and the impact of unforeseen budget cuts.

We have also learned from successes. Over 95 percent of the campuses with which we have worked report implementing action plans. An independent analysis of Foundations of Excellence found that campuses that implemented the plans to a self-reported “high degree” recorded significant first-to-second year retention rate increases -- an aggregate 5.62 percentage points or 8.2 percent higher over four years as reported by IPEDS. Institutions that did not implement their FoE action plans experienced a 1.4 percentage point decrease in retention -- in other words, if you don’t implement the plan you have, you seem to get attrition. To plan is not enough. The executed plans included a combination of changes in institutional policies, a renewed focus on pedagogy in first-year courses, and particular programs -- yes, programs -- that were intentionally selected to address the unique needs of the institution and its students. For example, institutions connected their learning community offerings with their evolving core curriculums to maximize the success of both efforts; orientation programs were expanded to include and serve previously underserved and/or completely unserved populations such as low-income and transfer students; and oversight offices and/or committees were created to intentionally connect previously disparate pieces so that learning opportunities were not left to chance.
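A brief aside on the arithmetic of those two figures, since percentage points and percent are easily conflated: a gain of 5.62 percentage points that equals an 8.2 percent relative increase implies a baseline retention rate of roughly 68.5 percent. The sketch below simply back-solves for that baseline; the baseline is inferred from the two reported numbers, not reported in the analysis itself.

```python
# Inferred, not reported: the baseline first-to-second-year retention rate
# implied by the figures above (5.62 percentage points == 8.2 percent).
point_gain = 5.62        # reported gain, in percentage points
relative_gain = 0.082    # the same gain, as a fraction of the baseline

baseline = point_gain / relative_gain   # ~68.5 percent
improved = baseline + point_gain        # ~74.2 percent
print(f"implied baseline retention: {baseline:.1f}%")
print(f"implied improved retention: {improved:.1f}%")
```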

In conclusion, our experience has taught us that while programs are necessary, they are not sufficient unless they are conceived and carried out as parts of a whole. We believe institutions need to undertake a thorough planning process focused on excellence in the first year. Appropriate programs and best practices can then emerge organically, or be modified, executed, assessed, and refined in context.

Institutions cannot fulfill their potential for improving student success without a comprehensive vision for excellence in the first year. The future of our students is too important to leave to chance. We hope you and your institution will become more intentional and deliberate in your commitment to first-year excellence; in the process, you will be acting locally while contributing nationally to the change and growth that our students and country require.

John N. Gardner and Andrew K. Koch

John N. Gardner is president of the John N. Gardner Institute for Excellence in Undergraduate Education and distinguished professor emeritus and senior fellow at the National Resource Center for the First-Year Experience and Students in Transition, at the University of South Carolina.

Andrew K. Koch is vice president for new strategies, development, and policy initiatives of the John N. Gardner Institute for Excellence in Undergraduate Education.

Lost in the Middle

At institutions of higher education, most attention is paid to the beginning and end of undergraduate studies. Curriculum committees debate the nature and number of requirements that students must fulfill, mostly in their freshman year; and departments spend a great deal of time evaluating the content and structure of majors, which tend to occupy students in their junior and senior years. No one gives much thought to what students do in the middle, when they're generally encouraged to explore whatever topics they wish.

The principal philosophy that governs this middle period of a student's education is, of course, the elective system. The right of all students to take a class on the subject of their choosing is a hallmark and admirable feature of the American university. It is often through such chance encounters with less common subjects that scholarly passions are born and majors are chosen. No one studies linguistics or anthropology in high school.

But because the elective system is so fundamental to higher education, and because the major is under departmental control, we rarely step back and ask whether this combination of general education requirements, electives, and specialization actually meets the objectives of a liberal education. Of course, the answer to this question depends largely on how one defines liberal education. For the sake of argument, let’s take the definition offered in the 2009 Modern Language Association Report to the Teagle Foundation on the Undergraduate Major in Language and Literature. This report identified the acquisition of broad, cross-disciplinary, and transhistorical "literacy" as a central component of liberal education (scientific literacy would be another component, but that’s a different story). In other words, students should be sufficiently well versed in an array of humanistic fields, canons, methodologies, and periods to engage with sources (and pursue further research, if they wish) in a large number of areas. To be sure, we expect much more from liberal education than this single aim; this is simply a minimalist definition.

Given this definition, it seems fair to say that we place blind faith in the academic virtues of our current system. We simply assume that somewhere along the way, between fulfilling their general education and major requirements, students will pick up enough knowledge about other fields to meet the demands of a liberal education.

It is easy to understand why we place such faith in this system, since there is no obvious, acceptable alternative. Institutions such as St. John’s College, whose curricula are set in stone, will only ever cater to a tiny minority of students; even Columbia University’s two-year core curriculum is highly exceptional. As Louis Menand recently noted in The Marketplace of Ideas, it is virtually impossible to imagine introducing a curriculum such as Columbia’s core today; such highly regimented courses could only evolve under particular historical circumstances. The vast majority of students today desire a greater say about the content of their education. And we must honor this desire, if only because students who do not buy into their educational program are unlikely to be good learners.

There are other ways, however, to think about the middle part of undergraduate education, particularly in the humanities. Let us focus momentarily on students who major in the humanities. Whether students choose to major in English, religious studies, anthropology, or history, there are in fact no structures in place to encourage or enable them to acquire a solid foundation in other disciplines, cultures, literatures, and historical periods. The student writing her honors thesis on Alexander Pope often does not know who Pope Alexander VI was.

Moving now to all undergraduates, I would push this argument even further. Why is it that the vast majority of humanities courses are taught as if we were training students to professionalize in a given field (say, French), when only a tiny fraction of these students – non-majors and majors alike – are actually going to pursue a graduate degree in the field? Whether a student is majoring in engineering and taking a French class out of a love for French literature, or whether she’s a French major and is required to take a French class, chances are that she is not going to become a professor of French. And yet our humanities majors, and our undergraduate curricula more broadly, are designed to produce budding experts in fairly narrow fields. This design is understandable in fields such as economics or engineering, where students often do go on to take jobs in which they need specific skills and knowledge. But why is it so in the humanities?

To be sure, specialization, even at the undergraduate level, has its virtues: engaging with material at a higher level of expertise allows students to hone their research skills and to produce more consequential bodies of work (such as an honors thesis). Still, I would ask whether our primary objective, as humanities professors, should be training students as though they will all go on to become scholars, or whether it shouldn’t be something else – such as offering all undergraduate students a broader and less discipline-focused foundation for their future lives.

This issue seems particularly pressing today, as the humanities have gone from facing an existential crisis to literally fighting for their existence. If smaller departments (such as those just axed at the State University of New York at Albany) continue to justify their academic purpose chiefly in terms of the number of majors they attract, they will perennially fear (and often face) the chopping block. Reframing that justification would admittedly require a shift in perspective on the part of the administrative powers-that-be as well. But if humanists made a stronger case that the chief purpose of a liberal education is not disciplinary specialization but broad historical and cultural literacy, then universities simply could not make do without Greek epics, French classical theater, German philosophy, or Russian novels.

What would a curriculum reconfigured along these lines look like? One option would be for humanities departments to join forces to offer genuinely interdisciplinary core courses on major topics of interest. An art historian could team up with a literature professor and religious studies scholar to teach a course on the Renaissance; a historian, political theorist, and Spanish professor could offer a course on the discovery of the New World; or a philosopher, psychologist, and musicologist could lead a course on Modernism. These courses, which would need to be vetted by appropriate faculty committees, would stem from faculty interest, and could vary over time.

This curricular structure presents a number of advantages over the existing one. First, because these courses would be team-taught rather than placed under the auspices of a single department, they would not have a narrow disciplinary focus, but would open up key events or questions to a variety of approaches. (This is currently the structure adopted at Stanford for the fall Introduction to the Humanities courses.) At the same time, professors could underscore the methodological differences between their disciplines, thereby providing students with a roadmap of how knowledge is divided among the various academic departments (and where to look for classes in the future).

Second, because these courses would cover broad topics, they would collectively constitute an overarching panorama of the humanities. A disjointed panorama, to be sure, yet that might be a virtue, since it would avoid the problems associated with establishing a grand récit. If this panorama resembles an exploded version of an ideal, inaccessible core curriculum (“These fragments I have shored against my ruins”?), the resemblance is ultimately misleading. Since the various pieces of the series would constantly be changing, it is not a palliative for a "Great Books" curriculum in an age that has turned against such courses, but rather the product of a different pedagogical philosophy. Rather than valuing certain specific texts more than others, this philosophy places value on breadth of knowledge and on the ability to synthesize very different forms and genres of information, from plays and paintings to maps and graphs.

The truly thorny issue that every curricular reform faces is that of requirements. If we build a new program, will anyone come if they’re not obliged to? One option would be to require students to take, say, two or three such courses at some point during their studies. This arrangement grants students a degree of choice and a good deal of scheduling flexibility. Other incentives could encourage students to take more than the bare minimum: completing additional courses could lead to some sort of certification, or could form part of an honors program.

Since a central objective of a liberal education is to ensure breadth of knowledge, it follows, to my mind at least, that a significant humanities requirement is needed. In cases where this is impossible for pragmatic or philosophical reasons, I would argue that it is still important to provide students with a curricular structure that would allow them to achieve the goals of a liberal education on their own. This is particularly true for non-humanities majors, who often do not venture into humanities classrooms, not necessarily due to a lack of interest, but because of the highly specialized focus of most courses. They also may simply not know where to look: our courses are not listed in a central place, but buried behind individual department nomenclatures.

Our academic divisions may make sense for research purposes, but are often at odds with our pedagogical goals. The MLA Report to the Teagle Foundation identified four “constitutional elements” that it considered key to liberal education – "a coherent program of study, collaborative teamwork among faculty members, interdepartmental cooperative teaching, and the adoption of outcome measurements" – yet the first three of these four elements cannot be achieved at the departmental level alone. To fulfill the promise of liberal education, we must ensure that students can build “coherent programs of study” that cut across disciplines.

Finally, perhaps we should have more confidence in the wares we’re vending. Wide-ranging courses that combine powerful texts, vivid iconic material, controversial ideas, and dramatic historical episodes with insightful analysis should not fail to exhilarate students. Of course, good professors, catchy titles, and intriguing perspectives are also needed to invigorate the study of our disciplines; a dry "introduction to X" approach will never be sufficient to meet the goals of a liberal education. But there is also a real thirst for this kind of knowledge, and not only among students in the humanities. Who knows? Maybe if we build it, they will come.

Dan Edelstein

Dan Edelstein is assistant professor of French at Stanford University.
