Between gainful employment, talk of revamping the role of accreditation, interminable budget wrangling over Pell and student loans, and President Obama's State of the Union declaration that colleges and universities are "on notice" with regard to costs and outcomes, the complex and evolving relationship between higher education and the federal government is more visible than ever -- though on the question of what exactly that relationship ought to be, no consensus seems imminent.
In his new book, Between Citizens and the State: The Politics of American Higher Education in the 20th Century (Princeton University Press), Christopher P. Loss examines the ways in which the government's involvement in higher education has waxed and waned over the past hundred years, and how those changes have affected students and institutions -- and will continue to affect them in the years to come.
Loss, who is assistant professor of public policy and higher education at Vanderbilt University, discussed his book with Inside Higher Ed via e-mail.
Q: What does it mean to say that "higher education bridged the gap between citizens and the state"?
A: One of the key arguments of the book is that during the 20th century the federal government turned to intermediary institutions to create administrative capacity in a political culture fearful of “big government.” I contend that higher education was one of those intermediaries — it served as a key site where citizens learned about their government and the government, as a chief sponsor of higher education, learned about its citizens. This is what I mean when I say that higher education mediated relations between citizens and the state. Located at the literal and metaphoric crossroads of state-society relations, higher education is a really fruitful place to observe political and social change in the United States.
Q: How does your book "reconfigure the dominant historical narrative of the 20th-century state-academic partnership"?
A: Much of the existing scholarship on the federal-university partnership has focused on the rise of sponsored research in World War II and the widening jurisdiction of academic expertise ever since. My book is a social and political history that explores the lives of students, faculty, and administrators in and outside bounded campus settings, studying at home and around the world, as civilians and soldiers, as farmers and television viewers, as political actors and citizens. It tracks the dramatic outcomes of the federal government’s growing involvement in higher education between World War I and the 1970s, and the conservative backlash against that involvement from the 1980s onward. Using an array of primary and secondary sources, I chronicle how higher education shaped, and was shaped by, the federal government’s shifting political agenda as it moved from a preoccupation with economic security during the Great Depression, to national security during World War II and the Cold War, to securing the rights of African Americans, women, and other marginalized groups during the 1960s and ‘70s. What is the purpose of higher learning? How does it benefit society? And what sorts of citizens and politics does it produce? These are basic questions that we rarely ask and can hardly answer. These are the questions that animate my book.
Q: When and why did the federal government make "college-going ... a national priority"? How does that compare with the government's higher education agenda today?
A: Policymakers turned college-going into a national priority because they thought educated citizens were better citizens — more civically engaged, politically aware, and democratic. While it’s true that college had always been a key training ground for democratic citizenship, it wasn’t until the 20th century that the federal government started to pump millions, then billions, of dollars into research and student aid. This was a remarkable transformation considering that for most of American history the federal government went out of its way to avoid getting involved in higher education, except at the nation’s land-grant colleges, which collected modest annual appropriations for agricultural research and operating expenses. This all changed during the middle third of the 20th century, when higher education and the federal government forged a durable partnership amid cataclysmic wars, economic depressions, and mass social movements. It was at this time that the federal government extended educational opportunities to individuals in exchange for their service to the state. The best examples of this were the GI Bill of 1944, which offered veterans portable financial aid to go to college in return for their wartime sacrifices, and the National Defense Education Act of 1958, which provided fellowships and loans to students who promised to earn degrees in defense-related fields of study, like math, science, and foreign languages. Starting with the Higher Education Act of 1965, policymakers veered away from the reciprocal design that lay at the heart of these earlier programs, looking instead to grants, and especially to loans, to help students and their families pay for college.
All of which is to say that aside from making more money available to help students pay for college, the federal government hasn’t had any real strategy at all when it comes to higher education for a long, long time. I think this is changing — though I’m not sure if it’s for the better. Americans are understandably worried about ballooning student debt and the lack of good jobs that graduates need to pay off that debt. President Obama recently came out in favor of tighter federal regulations on student aid and of linking the availability of aid to student outcomes, institutional performance and accountability, and cost controls. Although I think students deserve good, reliable information about the schools they wish to attend — graduation rates, actual costs, average time-to-degree, job placements, and so on — I think regulating higher education in the way hinted at by the president is a really bad idea. Another approach well worth considering — one that has deep roots in American history — is reciprocity: linking aid to some form of national service that citizens render to the state. It’s worked well before. Why not try it again?
Q: What are some of the key "ways in which psychological expertise transformed higher education"?
A: Most people are aware that the SAT exam was derived from the Army Alpha intelligence test used during World War I. But there were lots of other ways that psychologists and their allies in the social and behavioral sciences shaped higher education and changed the way policymakers and the American people understood what it meant to be an educated person. During the 1920s, for example, college leaders adopted what I call the “personnel perspective” — a belief in the malleability of individuals and institutions — and used that perspective to make their institutions more accommodating to the changing emotional needs of their students. They developed freshman-week programs, orientation classes, and mental health facilities to help students adjust to the academic and psychological challenges of going to college. During World War II, psychologists housed in the Army Research Branch, the government’s main hub for psychological research, reported that the most psychologically adjusted soldiers were also the best-educated. The idea that education embodied therapeutic power and could be used to help soldiers “adjust” and “readjust” to changing wartime experiences informed the Servicemen’s Readjustment Act of 1944 — better known as the G.I. Bill of Rights. Nearly half of the nation’s 16 million veterans went back to school with the G.I. Bill, 2.2 million of whom did so at a college or university. Finally, during the Cold War, public opinion researchers indicated time and again that the country’s most patriotic citizens were also the most educated, and by the 1960s educational attainment had become a proxy for good citizenship and psychological health. 
This belief spurred even more grandiose federal education initiatives during the Johnson administration’s War on Poverty, before collapsing under the weight of all sorts of countervailing evidence — not least of all the mobilization of student protesters at campuses across the country whose increasingly violent behavior struck many observers as anything but psychologically adjusted and democratic. Students seemed, well, maladjusted.
Ultimately, this crack in the consensus around democratic citizenship was the first of many to chisel away at the postwar state-academic partnership — a partnership that had created the atomic bomb, put a man on the moon, produced countless medical breakthroughs, and helped millions upon millions of Americans get an education. None of this mattered anymore: by the mid-1970s the government had lost faith in higher education’s capacity to synthesize democratic citizens, and higher education leaders, in turn, had become increasingly suspicious of federal meddling even as they continued to demand ever more federal funding for research and student aid. Today’s chilly relationship between the government and higher education dates back to the impasse of the 1960s. What is college for? All these decades on, we are still searching for an answer.
Q: How did "the GI Bill consecrate ... the relationship between education and psychological adjustment"?
A: During World War II a redefinition of democratic citizenship occurred that fused education to psychological health and embodied both in the person of the veteran. Veterans’ success under the G.I. Bill became a critical policy touchstone for the rest of the century — fueling the public’s demand for higher learning and shaping the creation of subsequent federal education legislation, from the 1958 National Defense Education Act to the 1965 Higher Education Act. Time and again policymakers and the general public invoked the G.I. Bill when they wished to cite a government social program that really worked.
Q: What forces led to the "institutionalization of diversity"?
A: Three factors contributed to the institutionalization of diversity in American higher education. The first was the Higher Education Act of 1965, which provided college officials with the tools they needed — grants, loans, and work-study — to recruit and retain a truly diverse student body for the first time. The second was the actions of these new students — particularly African Americans — who arrived at predominantly white institutions to discover that those institutions were not at all equipped to meet their intellectual or social needs. The final factor was the response of university administrators. They quickly realized that decision-making in the name of diversity — changing up the curriculum by adding a new class or program in black or women’s studies — was a good way to manage their institutions, especially disenchanted students. I realize that this sounds somewhat instrumental, even cynical, but it was this odd and unexpected mix of federal policy, student activism, and administrative maneuvering that helped embed diversity in the modern university. Oftentimes in history things start one way and end up another. Diversity was one of those times: it began, in part, as a calculated administrative ploy and turned into a deeply valued organizing principle of the contemporary academic enterprise.
Q: How has diversity altered the role of higher education?
A: Although we tend to think of diversity in relation to admissions and the student body, its importance extends well beyond these areas. Diversity is used to describe the organization of knowledge, the structure of the extra-curriculum, and perhaps most important, to convey the economic, social, and political value of higher learning to the diverse publics that our institutions serve. Educating students in the name of diversity is what colleges and universities do. Given how institutionalized it’s become over the past three decades, I don’t think that will change. And I think we should do everything in our power to ensure that it doesn’t. We live in a diverse society and world, and our higher education system — as a steward of humane progress, discovery, and social and economic opportunity — has a moral obligation to reflect and to enhance that diversity.
Q: How has the "privatization" of higher education in the past few decades changed higher education's function as "a mediator between citizens and the state"?
A: Without the Great Depression, World War II, and the Cold War to thicken the relationship between the state and higher education, a rightward political shift commenced during the economic downturn of the 1970s that reached its climax with the election of President Ronald Reagan in 1980. Ideological differences dating back to the campus turmoil of the 1960s, combined with real financial concerns, helped to drive a wedge between the government and higher education. Funding cuts and the introduction of market-driven student-aid policies altered the nature of college-going for the rest of the century and beyond. Ultimately, the drift toward “privatization” in the final two decades of the twentieth century readjusted higher education’s role as a mediator between citizens and the state once again — changing how students paid for college and moving students closer to a privatized conception of democratic citizenship inextricably tied to the “personal politics” of identity.