I grew up in the era of remarkable college presidents, individuals who were seen as public intellectuals. These leaders -- Derek Bok, Kingman Brewster Jr., A. Bartlett Giamatti, the Reverend Theodore Hesburgh -- spoke out on issues that extended far beyond their campuses. As they saw it, contributing to the larger public conversation on critical issues of their time was part and parcel of their role both as college/university presidents and in the years thereafter. Voices like this are disappearing, a point made all the more relevant and poignant with the passing of Father Hesburgh last week at age 97. Today’s educational leaders are vacating the bully lectern -- even on issues related to their own campuses.
With the increased craziness in current events, the void in presidential voice has become increasingly obvious. But the need for it could not be greater. Think: terrorism, young people turning to lives of violence here and abroad, Ebola, cheating in a professional sport, gridlock in government, lack of trust in police, and the beheadings of journalists and relief workers, to name but a few of the issues before us. Where are the voices of presidents of institutions of higher learning who can provide some moral grounding or an intellectual compass?
What accounts for the silence? There is obviously no single reason. One powerful argument is that speaking out on national and international issues is not the role of college and university leaders in the 21st century. The job, instead, is to run a campus as an effective business, keeping our myriad constituencies (like shareholders) happy.
Speaking out can alienate faculty or students or parents or trustees or community members. It can impair revenue generation. We need to mediate these differing perspectives, regularly smoothing feathers and finding balance among irreconcilable positions. For public institutions, we need to please politicians if we want institutional funding, if we want a workable board, if we want state grants for students.
Adding to all this is the impact of social media, which has transformed the consequences of our speaking out: our words get truncated into short sound bites, and our positions take on a life of their own, with little opportunity to clarify, rectify or inform. And even when needed corrections are made, they are hardly noticed.
I get it. It is easier and safer to be silent. The job of a college/university president is hard enough without speaking up and out. We know that even when we speak out on issues related to our institutions, which some presidents are doing, we risk being subjected to considerable criticism (often nasty and mean-spirited) and even termination. And the heat is rising: legislation was just introduced in Kansas that seeks to bar professors (and one assumes presidents) from using the titles they hold at public institutions in any op-eds they write -- quite the silencing device.
Yet, as educational leaders turn inward, we are simultaneously teaching our students the value of multiple perspectives, the importance of rigorous but civil debate, the interrelationship of the disciplines that cannot and should not be cabined into silos in real life. We are encouraging them to deal with new people and new ideas, and encouraging experimentation and innovation and risk taking. We want our students to engage actively in the local community, literally feeling and understanding the value of serving others. We want them to see their obligations to the larger world -- voting, sorting through vast quantities of data in search for truth, among other things. With a degree, we preach, comes responsibility. We argue that problem solving and critical thinking are what we teach across the disciplines, educating the thoughtful leaders of tomorrow. We pay homage to Jefferson’s notion that our democracy depends on an educated populace.
But it’s ironic. As presidents and in our lives thereafter, we are being disingenuous. We are doing one thing and teaching another. We are not acting as role models for our students -- from the top down. What we ask of our students should be the minimum of that which we ask of ourselves. We challenge our students to become their best selves. This means that as presidents and leaders, we have to speak up and out on the critical issues of our day. We may not have some unique lock on wisdom, but we certainly do not have less insight than others who voice their views.
When I was a college president, I had a piece of art by Rachel Kerwin outside my door. Amid a swirl of black and gray and white, the word “SPEAK” appears dead center in capital letters. I always said this was to remind students, faculty and staff to share openly what was on their minds when they came into my office, something that is rarely easy. It also served another purpose: reminding me to speak out, no matter how hard or risky that is. It still does.
Karen Gross is the former president of Southern Vermont College.
“The time has come to make education through the 14th grade available in the same way that high school education is now available. This means tuition-free education should be available in public institutions to all youth for the traditional freshman and sophomore years or for the traditional two-year junior college course.”
Although it may sound similar, this statement was not uttered by President Obama. It was, in fact, a declaration made by the United States’ first national commission on higher education, the Truman Commission, in 1947.
Now, more than a half century later, President Obama has given new life to the Truman Commission’s vision with his plan to make “two years of college... as free and universal in America as high school is today.” The proposal, modeled in part on programs in Tennessee and Chicago, promises to use federal and state dollars to eliminate the costs associated with tuition and fees at community colleges for students who enroll at least part-time and maintain a 2.5 grade point average. Though not as far-reaching as the Truman Commission’s plan, the Obama proposal aims to provide a debt-free route to a college education for all Americans willing to work for it.
The Truman Commission’s recommendations did not come to fruition for the same reason that Obama’s plan likely won’t: they faced a Republican Congress with little interest in supporting the president’s agenda or enacting large spending packages.
Still, historians agree that the commission’s bipartisan report -- and the debates it sparked -- changed the conversation about federal and state support for college access. It laid the foundation for the landmark Higher Education Act of 1965. And it prompted many state governments to move ahead with plans to expand public higher education, in particular by creating or enlarging community colleges, in the years after World War II. The same thing is happening today, as the news carries stories of free college plans being developed in Oregon, Mississippi, Minnesota, New Mexico and New York.
Much like the Obama administration, the 29 educational and civic leaders who served on the Truman Commission believed that Americans’ willingness to extend higher education opportunity to all would be the key to the nation’s economic and political future. They were part of a generation that had lived through two world wars and a devastating economic depression, and they were grappling with the frightening prospect of atomic warfare. Clearly framing higher education as a public good, the commission argued that an educated citizenry provided the best hope for preserving democratic freedom, achieving economic security and even promoting world peace.
Too many young people, the commission argued in 1947, faced barriers to higher education due to family income or geographic location, or on account of race, religion, sex or national origin. Since the 1930s, colleges, both public and private, had steadily increased tuition and fees, putting higher education out of reach for many families. Jewish students encountered admissions quotas at many private colleges, while African-Americans faced separate and unequal higher education in the segregated South. Such discrimination, the commission wrote, amounted to a “waste” of human talent. It was not only a blow to the United States’ image as a bastion of freedom and opportunity -- it was a threat to the national security.
At a time when the federal role in the nation’s education was minimal, the commission asked Washington to take the lead in assisting state and local governments to develop a nationwide network of tuition-free public colleges -- or “community colleges” -- within reach of every American. And although it recommended a range of federal aid programs, including a system of national scholarships and fellowships that could be used at any institution, public or private, the commission believed that access to higher education should be extended primarily through the public system. It was only in publicly controlled institutions, most members agreed, that fair treatment for racial and religious minorities could be assured and tuition and fees could be contained. Moreover, they hoped, the carrot of federal aid could also be used to encourage state governments in the South to end segregation in public colleges and universities.
The intense public debate over the commission’s recommendations demonstrated that the politics of federal aid to higher education were -- and still are -- complex. By the time the commission's report was released, the popularity of the 1944 G.I. Bill of Rights, which provided tuition and cost-of-living assistance to returning veterans, was obvious to everyone, but the commission’s vision of expanding access to higher education to all Americans still proved a hard sell.
As with the debate over the Obama plan, critics found the devil in the details. The proposal was described as too expensive, unrealistic, and even undemocratic. With federal aid, some argued, would come unwelcome federal control. Others charged that the commission’s estimate of Americans’ intelligence was simply too high, or that the nation’s economy had room for only so many college graduates. In 1952, the report of the Commission on Financing Higher Education, a study financed by the Rockefeller and Carnegie Foundations and endorsed by leaders of private institutions, suggested that higher education should be for the nation’s elite students -- the top quarter of academic achievers -- and not for the masses. The Truman Commission, by comparison, asserted that nearly half the adult population could benefit from two years of postsecondary schooling and one-third from an “advanced liberal or specialized professional education” -- just about where we are as a nation today.
But perhaps the most blistering attack came from two of the commission’s own members, both of whom were leaders from Roman Catholic education. They argued that the commission’s exclusion of private colleges from the use of federal funds for current expenditures and capital outlays would lead to a “monopoly of tax funds for publicly controlled colleges and universities” -- a concern expressed in response to the Obama plan.
Many private institutions, they feared, could not compete with a free public sector. Small colleges would close, while a higher education landscape dominated by public institutions would be vulnerable to government control and propaganda, as in the “dictatorships of Germany, Italy and Japan.” Only the existence of private alternatives, free from government oversight, could assure the intellectual freedom that democracy needed to flourish. “American democracy,” they wrote, “will be best served if higher education in the future, as in the past, will continue to be regarded as a responsibility to be shared by public and private colleges and universities.”
Criticism of the Obama plan has followed similar contours. On the left, some worry that the money could be better targeted toward those who need it. On the right, others fear that a commitment to “free” public higher education is too great a fiscal burden to bear, or that a strong public sector will diminish “market” incentives. And, as was the case 60 years ago, commentators of various stripes have pointed out the obvious fact that the production of more college degrees, by itself, will not lead to better employment outcomes or alleviate social inequality.
These criticisms may have some merit, but they miss the larger point of the president’s forward-looking vision of college access. The members of the Truman Commission understood the value of making a powerful statement. During the commission’s second meeting, in December 1946, the philosopher Horace Kallen urged his colleagues to conceive of their report as a statement akin to the Declaration of Independence or the Constitution.
“We are starting,” he said, “as a deduction from the democratic position in the field of education, a certain conception of a standard of educational living. We can’t realize it all at once. Every step in the realization is going to be a fight, just as every step in the raising of the standard of living is going to be a fight.”
Indeed, in the spirit of the Truman Commission, the Obama plan renews the nation’s promise to provide educational opportunity to all who are willing to work for it. It serves as a reminder that education is not just a private benefit, open only to those who can afford it, but a public good worthy of investment. The promise of American higher education, after all, is about more than individual job preparation. It is about the possibility for all citizens to participate in envisioning and constructing a better society.
Nicholas Strohl is a Ph.D. candidate in history and educational policy studies at the University of Wisconsin at Madison. His dissertation is entitled “Higher Education and the Public Good: The Truman Commission and the Case for Universal College Access, 1918-1953.”
An experiment was conducted a few years back that offered participants the choice between a Lindt chocolate truffle and a Hershey’s Kiss. Each was available for an attractive price -- 15 cents for the truffle, a penny for the Kiss. Three out of four chose the truffle.
Then the researchers reduced the cost of each offering by a penny. The truffle was now 14 cents, the Kiss was free. Two out of three participants chose the Hershey’s Kiss. “Free” is a powerful word.
When President Obama unveiled a proposal last month to give every student in America the opportunity to attend community college free of charge, it naturally got our attention. From the kitchen table to the corridors of Congress, people were talking about it.
Essentially, the plan proposes that the federal government pay three-fourths of a student’s community college tuition if states agree to pay the remaining 25 percent. Community colleges must commit to taking steps to strengthen programs and increase graduation rates. Students must have skin in the game, too -- they must attend college at least half-time, while maintaining a minimum grade point average of 2.5.
Instantly, the plan ignited a debate over its merits. Some say the money shouldn’t be spent on tuition, but on removing obstacles that keep students from finishing community college and furthering their education. Others contend that the proposal is an important first step toward spurring college attendance and building bridges between two-year colleges and four-year institutions. And of course, countless other perspectives abound.
While the debate has yet to be resolved, it’s clear the president has already succeeded in one sense: he’s gotten the nation to pay attention to a critical and overlooked need.
The importance of a college education is hardly a new topic of conversation. The changing U.S. economy, rising competition for gainful employment and the growing complexity of a global society have made education a new national imperative. Yet the conversation has focused primarily on the importance of a four-year degree. Our money has followed this emphasis: public and private dollars are directed into four-year universities, which have become more difficult for students and parents to afford.
In this national dialogue, community colleges have been somewhat left behind. Their per-student expenditures lag well behind those at institutions offering four-year degrees and graduate education. Yet they enroll nearly half of all undergraduates in our country, providing a first step or a second chance toward a more rewarding life. Considering that the gap between rich and poor continues to widen, community colleges have never been more important to our nation’s future prosperity.
A 2013 Georgetown University study, “Failure to Launch,” illustrates why. The study found that only half of Americans in their late 20s are employed full-time, the lowest level since 1972. At the same time, “the increasing need for skill development after high school has delayed young adults’ careers.” This explains why the goalposts of individual sustainability keep moving. The average age for financial independence in the U.S. is now 30.
For a large cross section of our country, community colleges represent a way forward. But their role and value transcend a person’s ability to get ahead. They’re also crucial to America’s ability to compete in the world economy.
It is widely accepted that our nation needs to graduate significant numbers of professionals in science, technology, engineering and math (STEM) fields to ensure our future economic competitiveness. The critical role that community colleges play in achieving this goal is less known. The most recently compiled statistics from the National Science Foundation showed that 44 percent of the 126,000 men and women earning four-year degrees in engineering attended community college at some point. For most, this was their first foray into higher education -- and they continued on. A National Student Clearinghouse Research Center study showed that nearly 75 percent of the students who earned an associate degree and then moved to a four-year college graduated with a bachelor’s degree within four years of transferring.
Community college is also a particularly effective pathway for underrepresented minority STEM students. The 2006 National Survey of Recent College Graduates revealed that 64 percent of American Indians, 50 percent of African-Americans and 55 percent of Hispanic engineering B.S. and M.S. degree recipients attended community college before enrolling at a four-year college.
The National Action Council for Minorities in Engineering (NACME) has crafted a strategy to reinforce this pathway. The organization partners with four-year colleges and universities and provides transfer scholarships to students earning associate degrees in engineering-related fields. NACME is also part of a collaborative working to strengthen high school STEM education for underrepresented minorities. Currently, more than 30,000 students are enrolled in school-within-school Academies of Engineering to deepen their understanding of STEM areas.
All of these are reasons why President Obama’s plan deserves support. This is exactly the kind of thinking and practice our country needs to unlock the doors of opportunity for a new generation. The higher education enterprise and American society both stand to benefit in the long run. "Free" is indeed a powerful word. In considering and debating the president’s proposal, let’s free our minds of false assumptions and open them to the possibility and potential of new approaches.
Gary May is dean of the College of Engineering at the Georgia Institute of Technology.
I remember well the meeting with a senior faculty leader. It was early in my Wesleyan University presidency, and I was excited about the many things I hoped to see accomplished. We were talking about the objectives for the year that the administration would be presenting to the board of trustees, and I asked what her goals were. Clearly surprised by my invitation to help set the agenda for the university, this faculty veteran -- well respected by her colleagues and a devoted mentor for her students -- explained to me that she would do what she could to ensure that not much would change. When I seemed incredulous, she emphasized that “preventing disaster is not the same as doing nothing.”
What to make of this exchange? Emblematic of the university’s disdain for innovation? Of higher education’s notorious inertia in the face of change? Certainly the conservative dimension of academic culture is real and important, protecting the university from merely echoing microtrends that may be irrelevant, even antithetical, to quality education. But in this age of increasingly rapid change fueled by technological innovation, we might find, to paraphrase Oscar Hammerstein, that we have been protected out of all we own.
In their Locus of Authority: The Evolution of Faculty Roles in Higher Education, William Bowen and Gene Tobin defend a contemporary model of shared governance, one that emphasizes robust consultation. In the end, however, they stress that if colleges and universities are to thrive in the current environment, the power to initiate changes and make them stick must be centralized. Although recognizing that new academic programs need faculty support, they underscore that the allocation of resources and even pedagogical initiatives are going to be most successful with leadership unbeholden to any existing constituency.
Successful change will emerge, Bowen and Tobin underscore, when leaders work for the good of the university as a whole and over time. Given that both authors were themselves presidents, it is unsurprising that when they look for people thinking of the good of the whole, they find them in the central administration.
Colleges and universities have been under enormous pressure to change, and change they have. Faculty authors of the Yale Report of 1828 defended their work against critics who claimed that colleges “are not adapted to the spirit and wants of the age; that they will soon be deserted, unless they are better accommodated to the business character of the nation.” Sound familiar? The 19th-century Yale faculty pointed out that they also had to deal with alumni who complained that the college was no longer teaching the way it had decades before, just as today college-educated parents often express surprise that their children aren’t learning the same things they were taught. The notion that professors have been teaching the same things in the same way for centuries is just false.
But modifications of the curriculum, even alterations in teaching style, may not be what the “disrupters” are looking for when they talk about the importance of transforming higher education. They want universities to be more nimble, capable of responding to the needs of students and to “just-in-time” research opportunities as these emerge. Critics rightly charge that the structures of universities insulate faculty, administrators and students from many stimuli (and incentives) for change. Sure, new kinds of colleges, like Minerva, and new platforms for taking classes, like Coursera, are putting pressure on some colleges to adjust, but according to higher education’s critics, most institutions just go along their merry way.
I served as president at the California College of the Arts before coming to Wesleyan, and I’ve seen some pretty big changes at those institutions. Some of these changes came from enterprising faculty, others from students, and some from administrators who saw real advantages to altering the traditional models we were using. A few came from my own initiatives. All of them eventually required the kind of consultation Bowen and Tobin describe, though none of them would have had consensus right off the bat. In higher education, the promise of consensus usually devolves into the threat of veto. Consensus kills innovation.
At California College of the Arts (CCA), I had a key role in the momentous decision to change the name of the institution, long known as the California College of Arts and Crafts. Although I was personally committed to many of the values of the arts and crafts movement out of which the college emerged, I came to see that the name was no longer helpful in designating a vibrant place where digital design, architecture, film, fine arts and writing often intermingled in powerfully creative ways. We could have embraced the name and made it work, I thought, but the leadership of the college hadn’t been doing that.
For at least 20 years much energy had been spent complaining about or defending the name. As president, I was able to guide a process involving faculty, students and board members that eventually led to the name being changed. My major contribution was just setting parameters for the discussion and saying that we would finish our decision-making process within a year. Either we would change the name or we would not talk about the issue for the duration of my presidency. In 2003, the board unanimously approved the name California College of the Arts, with the understanding that craft, design, architecture and writing were all part of our approach to learning through the arts.
After about three years or so at CCA, I introduced the idea of an M.B.A. with a focus on design. Lots of people laughed at the idea of a business degree at an art school. Having no business background myself, I hired a faculty member, designer Nathan Shedroff, to help plan a distinctive business program that would work in our creative context. The provost (now president), Steve Beal, and the CFO, David Kirshman, were enormously helpful in launching the program, which has now received international recognition from the design and business communities.
We brought the program for detailed discussion with more faculty members only after we had done quite a lot of planning -- and knew that we would not spend very much money in advance. Before the launch, nobody else at the institution had any investment in the program’s success. If we had asked for a vote, we would have lost. Now, with allied programs, a larger faculty and successful alumni, the graduate Design Business program is an important part of the wonderfully eclectic CCA mix.
If these examples from CCA depict presidential initiative, two examples from Wesleyan shine a light on how individual faculty members can create broad institutional change. Over the past 30 years, Jeanine Basinger has taken film studies from the small interest area of a few colleagues to a fragile interdisciplinary program and on to a department with its own lines and facilities. As she has told me more than once, “They tried to kill it many times.” But through her indefatigable efforts together with her example as a teacher and scholar, she turned film studies into one of the university’s most widely recognized areas of excellence. And she’s still going. Two years ago, I asked her and her colleagues to make the interdisciplinary department, including an important historical archive, into the College of Film and the Moving Image. We are now building an endowment for C-Film as a permanent part of our academic offerings.
Biologist and environmental scientist Barry Chernoff was one of several faculty who responded to my call for new academic proposals when I began my tenure at Wesleyan. A radically interdisciplinary scientist, Barry wanted to bring more collaborative research and teaching under the environmental studies tent. In 2009 we created the College of the Environment, a program in which all students have both a major in environmental science and a linked major -- be it economics or biology, anthropology or dance. In addition, there is a think tank attached to the college in which faculty and undergraduates from very different departments work together on collaborative projects. A number of important publications have already emerged, and we have built a significant endowment for the program.
In both these cases, individual faculty were not only the instigators of new programs, they were also the builders. As president, I knew when to get out of the way and also when I could help them raise additional resources to make their enterprises sustainable. This last part is often important in gaining buy-in from the faculty more generally. Raising additional resources “expands the pie” so that older departments don’t block change out of fear that they themselves will receive less.
But the notion of expanding the pie also fosters illusions because it masks the trade-offs that should be visible with innovation. If new programs become more successful, according to transparent criteria, then resources should be reallocated. That often painful process of reallocation, as Bowen and Tobin argue, is the responsibility of the administration and, ultimately, of the board. It’s up to the president and provost to explain publicly the criteria for the distribution of resources.
The faculty rightly controls the kinds of courses offered for credit, and it has clear rules for approving promotions, new classes and different modes of teaching. At Wesleyan, though, students have often played an important role, instigating changes in the curriculum by bringing their intellectual interests to the fore (we want environmental design! we want more art classes! more labs!). Recently students, along with a group of faculty who wanted to experiment with intensive teaching, were instrumental in opening up the academic calendar. They made a strong pitch to faculty leadership to offer classes in the summer, and then in the winter break. The intensive courses award full academic credit and are offered at sharply discounted tuition, incentivizing breaking away from the conventional undergraduate calendar.
These proposals had strong administrative support, and it was crucial that there were faculty leaders who were willing to try the new modes of teaching. Our summer and winter terms are small, but they are growing. They offer all students more pathways to complete their degrees, often with substantial cost savings and evidence of deep learning.
My final example of change at Wesleyan is my decision to partner with Coursera in the summer of 2012. This was an unusual moment, a time when I was convinced that we at Wesleyan needed more experimentation with online learning, a time of both MOOC mania and backlash against MOOCs. I was very impressed with Coursera cofounder Daphne Koller’s approach to building a group of strong classes through an iterative process of running them and improving them.
Since these classes were not being offered for credit, I knew I did not need authorization from the faculty as a whole. We were going to produce the classes very economically, so money wasn’t the issue. I decided to join the first group of Wesleyan teachers online, inviting some of the most respected and celebrated faculty members to join me in the experiment. We were very fortunate to enlist five colleagues from very different departments. None of us knew precisely what we were getting into, but we all were curious about pedagogical innovation and eager to share our classes for free with students from around the world.
When I announced this partnership at the first faculty meeting of the year, there was real consternation. Did I actually have the authority to do this, I was asked by one of my senior colleagues. Yes, I did, at least according to the board chair and the general counsel. I knew that the ice on which we’d started skating was thin, but in the end the success of our efforts would be judged by the individual teachers involved and then on a strategy and policy level by the relevant faculty and board committees.
If I had asked a general faculty meeting for authorization, I doubt we would have ever gotten started. Instead, I asked the faculty committees to respond to our reports on the classes we taught, to refine the process of selecting teachers and subjects, and to help determine which lessons from our online classes were relevant to our work on campus.
Wesleyan is a small place. We have around 3,000 students, mostly undergraduates. In our work with Coursera over the last few years, we have worked with more than 1,000,000 students from over 120 countries. All of us who have taught in the program find it exciting and frustrating by turns -- and tremendously invigorating. We are taking lessons into flipped classrooms as well as into more traditional seminars. The partnership with Coursera continues. We are learning together. If in the end the faculty deems the experiment a failure, then we will move on to other experiments.
Changes are happening at America’s colleges and universities as faculty, students and administrators grapple with making the education they offer more empowering beyond the university. Although the faculty as a whole may sometimes function as a guardian of mission and tradition, individual professors are often catalysts for innovations that can be put in the service of broad, strategic goals. As Bowen and Tobin emphasize, strong leadership recognizes the need for faculty as genuine participants rather than as adversaries.
And it’s not just faculty who can launch sustainable change. Sometimes initiatives come from students eager to try new modes of learning, or to delve more deeply into subjects not yet well represented in the curriculum. Deans, provosts and presidents learn to get out of the way when tailwinds can carry worthwhile initiatives to fruition, but they also can themselves initiate curricular experimentation in areas where there is as yet no campus constituency for new programs.
Bowen and Tobin’s main point is as simple as it is important: effective shared governance is not divided governance. Coordinated consultation and transparent decision making can ensure that universities aren’t just protecting themselves out of all they own, but are learning how to promote inquiry, learning and creative practice in ways that remain most empowering today.
Michael S. Roth is president of Wesleyan University. His most recent books are Beyond the University: Why Liberal Education Matters and Memory, Trauma and History: Essays on Living With the Past.