Lincoln University -- a historically black university located in Jefferson City, Mo. -- suspended its major in history on its 150th anniversary. Explaining why that step was necessary, the president of the university emphasized, “We must make decisions like these as we look toward the future and the needs of the changing workforce.” Embedded within that statement is a declaration about higher education and its purpose: higher education should produce good, highly paid workers. We should step back and ask whether this is really what we want from higher education.
Since I took my first academic position in 2010, I have continually heard in the news media, from visiting speakers and many other people that transforming students into employees is the purpose of higher education. Whenever I hear this, I cannot help but recall one particular graduate seminar when we discussed the writings of the Marxist philosopher Louis Althusser. The discussion turned to higher education, and some people in the class claimed higher education was little more than part of a plot to provide good and obedient workers to the bourgeoisie. At the time, I thought that was overly reductive. I mean, we were talking about the supposed conspiracy of the bourgeoisie in class at an institution of higher education; surely this was not part of the plan.
Once I got my first academic job, however, I learned that this really was the perennial question in higher education. What should our general education curriculum look like? On which majors should we focus our resources? The answer was always put in the form of another question -- what do employers want from our graduates?
Perhaps because of the rising costs of higher education, politicians have increasingly said that the point of higher education is for students to make lots of money in their chosen careers. Is that what we want from higher education? Maybe a better question would be: Is that the only thing we want from higher education?
In her recent article in The American Historian, Nancy F. Cott indicates it is hard for humanities degrees -- like history -- to compete with degrees related to engineering if the only significant variable is potential earnings. One study found that throughout their careers, engineers consistently earned more than graduates in the humanities. But then, not everyone wants to be an engineer. As Cott phrased it, neither would we really want “to see an educated world populated by engineers only.” The fact is people educated in the humanities go on to important, although often not quite as lucrative, careers in education, government, law and a host of other interesting and relevant occupations.
Since students take on significant debt to earn their diplomas, it seems reasonable for them to expect some return on that investment. I hope as we review what we value in education, however, we do not simply ask which majors lead to the most lucrative careers.
Du Bois and Shaping Lives in the Present
What is higher education for? Should it exist solely for the purpose of manufacturing workers who make the greatest amount of money? It’s not a new question. It’s one that the renowned African-American historian W. E. B. Du Bois wrestled with in his speech commemorating Lincoln University’s 75th anniversary in 1941. He worried that the temptation would “come and recur to make an institution like this, a means of earning a living or of adding to income rather than an institution of learning.” Du Bois believed the kind of students Lincoln produced would end up changing the world for the better -- that it would be Lincoln students who would “show the majority the way of life.” Not from privileged and “powerful groups which from time to time rule the world have come salvation and culture,” he said, “but from the still small voice of the oppressed and the determined who knew more than to die and plan more than mere survival.” In short, Du Bois hoped that Lincoln would become “a center where the cultural outlook of this country is to be changed and uplifted and helped in the reconstruction of the world.”
Why did Du Bois believe that students at a university like Lincoln would be so influential? Du Bois recognized the power of history to shape lives in the present, and he rightly believed that this nation needed more diverse students if the status quo was ever going to change. In Du Bois’s day, history was being used to justify violence against African-Americans. In 1915, the original version of The Birth of a Nation premiered in the United States. In that movie, President Woodrow Wilson’s book History of the American People was regularly quoted. Audiences around the country saw Wilson declare through this movie that Reconstruction had been a misguided failure during which “the negroes were the office holders, men who knew none of the uses of authority, except its insolences.”
Wilson and many other people in the academy were part of what eventually became known as the Dunning School of Reconstruction History. For William Dunning, the historian for whom the broader school was named, Reconstruction was a failure because great numbers of the recently emancipated slaves “gave themselves up to testing their freedom. They wandered aimless but happy through the country.”
According to Dunning, it was Southern whites who “devoted themselves with desperate energy to the procurement of what must sustain the life of both themselves and their former slaves.” Lesson learned: black political participation meant misery for all, but exclusive white control meant the best for both black and white Southerners. The Dunning School of Reconstruction History justified the exclusion of black people from politics, and it implicitly justified the violence used to maintain that exclusion.
W. E. B. Du Bois labored to contradict those impressions. In his now widely read The Souls of Black Folk, Du Bois argued that it was not the irresponsible silliness of black people that doomed Reconstruction but rather the impossible problems facing the recently freed slaves. Reflecting upon the failure of efforts to make Southern African-Americans truly free, Du Bois noted that the Freedmen’s Bureau could not even “begin the establishment of goodwill between ex-masters and freedmen,” and perhaps most important, it could not “carry out to any considerable extent its implied promises to furnish the freedmen with land.”
Adding to the impossible challenge was the fact that much of the legislation created during Reconstruction was intended to punish the white South rather than empower the recently emancipated. As viewed by Du Bois, black equality was a cudgel used to punish the rebellious South rather than a goal in and of itself. Without any real support for black equality in either the North or the South, how could we expect anything but failure from Reconstruction? Because of those failures, black people suffered under the weight of white supremacy.
White historians largely ignored Du Bois’s conclusions for years; it was not until higher education expanded to include a wide swath of the American population -- due in large part to the GI Bill -- that more historians came to accept what he had long argued. Today, the vast majority of historians of Reconstruction accept his premise that many capable black politicians participated in Reconstruction. Many worked to extend roads and schooling to a far broader share of the Southern population. At the time, their opponents saw this as waste and corruption, but the vision of those black politicians more closely aligned with our own expectations. We -- like they -- expect our governments to maintain public roads and public education. History looks different from the bottom up.
Reversing Dominant Narratives
Du Bois did not mention the degree in history specifically in his speech in 1941, but his life’s work demonstrated the importance he placed upon the historical imagination. He correctly predicted that making the academy more diverse would change the world for the better. History has been used to justify white supremacy, and it has been used to undermine it as well. As the population of historians has changed, so too has the accepted narrative of the academy. That’s why Du Bois did not ask what majors earned the most money upon graduation but had a loftier vision for Lincoln’s future. America needed impassioned graduates from schools like Lincoln. Someone had to help reverse the dominant narratives prevalent in 1941 about black inferiority.
On Lincoln University’s 75th anniversary, Du Bois provided a powerful argument in favor of empowering Lincoln’s students to go and change the world. I fear that the end of history at Lincoln University means students will have less ability to do so in the future. That saddens me, because our national history is particularly relevant today. In 2016, a reinterpretation of The Birth of a Nation is set to debut and likely make radically different claims than its 1915 namesake. Why did the creators of this new movie -- which will document the slave rebellion led by Nat Turner -- give it that name? In 2016, some people have suggested that the civil rights movement of the 1960s was relatively short and its goals were largely accomplished. How then do we explain the emergence of the Black Lives Matter movement? Do these protesters fail to understand just how racially progressive our country has become? In 2016, some politicians have suggested that the United States is a nation founded by white ideas -- or “Western civilization” -- and people of color are guests. Are they right?
Our history as a nation has been used to answer those kinds of questions, and someone is going to be answering these questions in the future. In addition to asking what employers want our graduates to do, we should also ask whom we want to answer such important questions.
Graduates -- whether in the humanities, sciences or engineering -- will continue to get relevant and interesting jobs. Some will get paid more than others. In finding the right major, students will have to make strategic choices about what they want for their lives. Having spoken with many students, I know many are not so single-mindedly focused upon profit. Many have more philanthropic purposes in mind for their education. By so circumscribing the range of possibilities, however, we are creating a future in which Lincoln’s graduates will be able to get jobs but maybe not make history.
J. Mark Leslie is an associate professor of history at Lincoln University.
As students prepare to return to school for the coming academic year, there are 65,000 high school seniors who lack a clear path to college because they are undocumented. While undocumented students have access to K-12 public education, their options abruptly become scarce when they turn 18: in addition to the barriers that many low-income students face, these students must navigate a higher education system that excludes them, either explicitly or de facto.
One glaring obstacle is that undocumented students are ineligible for federal financial aid. Another is that access to public institutions, usually the most affordable option, varies by state. While some states offer resident tuition and state financial aid, others prohibit undocumented students from enrolling altogether. Other states fall in the middle of the spectrum, providing in-state rates to students with Deferred Action for Childhood Arrivals at some public universities. (A federal administrative policy implemented in 2012, DACA provides Social Security Numbers and the eligibility to work and drive to individuals who arrived in the United States as children and meet certain age and education requirements. However, it does not provide a path to citizenship. Since its implementation, roughly 700,000 undocumented youth and young adults have received DACA status.)
Given this landscape, private colleges and universities have an opportunity to be key players in promoting higher education access for undocumented students nationwide. Most, though not all, selective private institutions already accept undocumented or “DACAmented” students, but as of now, information and resources for undocumented applicants are difficult to find. So difficult, in fact, that students have taken the issue into their own hands: a group of undergraduates at Harvard University started a nonprofit, Higher Dreams, to serve as a “comprehensive resource” for undocumented applicants interested in applying to private colleges and universities. Sarahi Espinoza Salamanca, a student from California, created the DREAMer’s Roadmap app to help undocumented students find scholarships for college.
Meanwhile, institutions themselves should do their part and take a far more deliberate approach: there is a great difference between accepting students and making college truly accessible. If they are serious about their stated commitments to access, opportunity, and diversity, they should recognize their potential to make a difference. They should anticipate and welcome applications from undocumented students, actively make an effort to understand their circumstances and specific needs, and adopt policies that follow through on meeting those needs.
Colleges can take several steps. First, they can educate admissions staff so that potential applicants who are undocumented will receive accurate information. Better yet, they can hire or designate a staff person to specialize in working with undocumented students. Unfortunately, that is not the norm; many admissions personnel, though well meaning, are not equipped to answer questions from undocumented applicants. Staff education is a basic and important place to start.
Another key to increasing access is changing admissions and financial-aid policies to reflect the reality of undocumented students’ lives. Many independent colleges count them as international applicants -- a highly competitive pool. Accepted students are often charged international tuition rates, which are prohibitively high even for middle-income families, and they are only eligible for competitive merit scholarships. Implicit in this policy is the idea that undocumented students are more aptly compared to international students than to American citizens, which is patently inaccurate. Having attended American high schools and spent a significant, formative part of their lives in the United States, they should be considered within that context, not judged alongside international applicants whose experiences are virtually incomparable.
Experiential similarities and moral arguments aside, students with DACA work and have Social Security numbers -- like their American peers, and unlike international students. With or without DACA, they pay taxes. The only practical difference between them and their citizen peers, then, from an admissions perspective, is their lack of access to federal aid or loans. Admissions and financial-aid policies should reflect that reality and consider undocumented students as domestic applicants, eligible for aid based on demonstrated need.
Finally, institutions should publicize their commitment to working with undocumented students, who too often go unacknowledged. If a college or university already accepts undocumented students, it should shift from a don’t ask, don’t tell mentality to one of active inclusion. Some institutions have dedicated admissions pages specifically for undocumented students that include FAQs, resources and contacts. Publicizing such information is a small but meaningful act: it provides targeted support, which undocumented students so rarely get, and makes a statement that they are truly welcome.
In essence, it is simply not enough for colleges and universities to accept undocumented students tacitly and passively. It is not enough to accept undocumented students but then charge exorbitant tuition. If an institution welcomes undocumented students in principle by allowing them to apply, then those students deserve the same level of targeted support that American citizens receive when it comes to the application process and financial aid -- not to mention student services once in college.
Some institutions are already leading the way. Oberlin College, for example, encourages undocumented students to apply, counts them as domestic applicants and provides need-based aid. Emory University recently adopted the same policy for students with DACA. (The state of Georgia, meanwhile, legally blocks undocumented students from enrolling in its top five state schools, so Emory has made a statement by providing an alternative option.) Tufts University “proactively and openly” recruits and provides aid for undocumented students, with or without DACA, and Swarthmore College rolled out a similar policy this spring, arguing that as a campus that values “different viewpoints, identities and histories among our students,” it invites all students, regardless of citizenship status, to apply.
The intentional nature of these policies and the tangible changes to the institutions’ recruitment and financial-aid strategies are what make their statements more than just lip service. Many more institutions should follow suit.
Lily McKeage is a recent graduate of the Harvard Graduate School of Education and program director at YES Scholars in New York City.
Over the weekend I went through the fall 2016 catalog of every publisher belonging to the Association of American University Presses. Or at least I tried -- a number of fall catalogs have not been released yet, or else the publishers have hidden the PDFs on their websites with inexplicable cunning. (You would think savvy publicists would insist that catalogs be featured so prominently on the homepage that they are almost impossible to overlook. Evidently not: perhaps half my time went to playing “Where’s Waldo?”) A few sites hadn’t been updated in at least a year. At one of them, the most recent catalog is from 2012, although the press itself seems still to be in existence. Let’s just hope everyone there is OK.
After assembling roughly 70 catalogs, I began to cull a list of books to consider for this column in the months ahead, which now runs to 400 titles, give or take a few, with more to be added as the search for Waldo continues. When you take an overview of a whole season’s worth of university-press output in one marathon survey, you can detect certain patterns or themes. A monograph on the white-power music underground? Duly noted. A second one, publishing a month later? That is a bit more striking. (The journalistic rule of thumb is that three makes a trend; for now, we’re left with a menacing coincidence.)
Some of the convergences seemed to merit notice, even in advance of the books themselves being available. Here are a few topical clusters that readers may find of interest. The text below in quotation marks after each book comes from the publisher’s description of it, unless otherwise specified. I have been sparing about the use of links, but more information on the books and authors can be readily found online.
“Whither democracy?” seems like an apt characterization of quite a few titles appearing this autumn and early winter. Last year, Jennifer L. Hochschild and Katherine Levine Einstein asked, Do Facts Matter? Information and Misinformation in American Politics, published by the University of Oklahoma Press and out in paperback this month, concluding that “citizens’ inability or unwillingness to use the facts they know in their political decision making may be frustrating,” but the real danger comes from “their acquisition and use of incorrect ‘knowledge’” put out by unscrupulous “political elites.” By contrast, James E. Campbell’s Polarized: Making Sense of a Divided America (Princeton University Press, July) maintains that if the two major parties are “now ideologically distant from each other and about equally distant from the political center” it’s because “American politics became highly polarized from the bottom up, not the top down, and this began much earlier than often thought,” meaning the 1960s.
Frances E. Lee sets the date later, and the locus of polarization higher in the body politic, in Insecure Majorities: Congress and the Perpetual Campaign (University of Chicago Press, September). She sees developments in the 1980s unleashing “competition for control of the government [that] drives members of both parties to participate in actions that promote their own party’s image and undercut that of the opposition, including the perpetual hunt for issues that can score political points by putting the opposing party on the wrong side of public opinion.”
Democracy: A Case Study by David A. Moss (Harvard University Press, January 2017) takes fierce partisanship as a given in American political life -- not a bug but a feature -- and recounts and analyzes 19 episodes of conflict, from the Constitutional Convention onward. Wasting no time in registering his dissent, the libertarian philosopher Jason Brennan comes out Against Democracy (Princeton, August) on the grounds that competent governance requires rational and informed decision making, while “political participation and democratic deliberation actually tend to make people worse -- more irrational, biased and mean.” The alternative he proposes is “epistocracy”: rule by the knowledgeable. Good luck with that! Reaching that utopia from here will be quite an adventure, especially given that some voters regard “irrational, biased and mean” as qualifications for office.
Fall, when the current election cycle ends, will also be the season of books on the Anthropocene -- the idea that human impact on the environment has been so pronounced that we must define a whole phase of planetary history around it. There is an entry for the term in Fueling Culture: 101 Words for Energy and Environment (Fordham University Press, January), and it appears in the title of at least three books: one from Monthly Review Press (distributed by NYU Press) in September and one each from Princeton and Transcript Verlag (distributed by Columbia University Press) in November. Stacy Alaimo’s Exposed: Environmental Politics and Pleasures in Posthuman Times (University of Minnesota Press, October) opens with the statement “The Anthropocene is no time to set things straight.” (The author calls for “a material feminist posthumanism,” and it sounds like she draws on queer theory as well, so chances are “straight” is an overdetermined word choice.)
The neologism is tweaked in Staying With the Trouble: Making Kin in the Chthulucene (Duke University Press, September) by Donna J. Haraway, who “eschews referring to our current epoch as the Anthropocene, preferring to conceptualize it as what she calls the Chthulucene, as it more aptly and fully describes our epoch as one in which the human and nonhuman are inextricably linked in tentacular practices.” Someone in a position to know tells me that Haraway derives her term from “chthonic” (referring to the subterranean) rather than Cthulhu, the unspeakable ancient demigod of H. P. Lovecraft’s horror fiction. Maybe so, but the reference to tentacles suggests otherwise.
A couple of titles from Columbia University Press try to find a silver lining in the clouds of Anthropocene smog -- or at least to start dispersing them before it’s too late. Michael E. Mann and Tom Toles pool their skills as atmospheric scientist and Pulitzer-winning cartoonist (respectively) in The Madhouse Effect: How Climate Change Denial Is Threatening Our Planet, Destroying Our Politics and Driving Us Crazy (September), which satirizes “the intellectual pretzels into which denialists must twist logic to explain away the clear evidence that man-made activity has changed our climate.” Despite its seemingly monitory title, Geoffrey Heal’s Endangered Economies: How the Neglect of Nature Threatens Our Prosperity (December) is actually an argument for “conserving nature and boosting economic growth” as mutually compatible goals.
If so, it will be necessary to counter the effects of chickenization -- which, it turns out, is U.S. Department of Agriculture slang for “the transformation of all farm animal production” along factory lines, as described in Ellen K. Silbergeld’s Chickenizing Farms and Food: How Industrial Meat Production Endangers Workers, Animals and Consumers (Johns Hopkins University Press, September). Tiago Saraiva shows that the Germans began moving in the same direction, under more sinister auspices, in Fascist Pigs: Technoscientific Organisms and the History of Fascism (The MIT Press, September): “specially bred wheat and pigs became important elements in the institutionalization and expansion of fascist regimes …. Pigs that didn’t efficiently convert German-grown potatoes into pork and lard were eliminated.” A different sociopolitical matrix governs the contemporary American “pasture-raised pork market,” of which Brad Weiss offers an ethnographic account in Real Pigs: Shifting Values in the Field of Local Pork (Duke University Press, August).
And finally -- for this week, anyway -- there is the ecological and biomedical impact of the free-ranging creatures described in Peter P. Marra and Chris Santella’s Cat Wars: The Devastating Consequences of a Cuddly Killer (Princeton, September). Besides the fact that cats kill “birds and other animals by the billions” in the United States, the authors warn of “the little-known but potentially devastating public health consequences of rabies and parasitic Toxoplasma passing from cats to humans at rising rates.” The authors also maintain that “a small but vocal minority of cat advocates has campaigned successfully for no action in much the same way that special interest groups have stymied attempts to curtail smoking and climate change.” I write this while wearing a T-shirt that reads “Crazy Cat Guy” but will be the first to agree that the problem here is primarily human. There’s a reason it’s called the Anthropocene and not the Felinocene.
A number of other themes and topics from the university presses’ fall offerings might bear mentioning in another column, later this summer. With luck, the pool of candidates will grow in the meantime; we’ll see if any new trends crystallize out in the process.
College is not free, and never will be. Someone is always paying -- taxpayers, private donors, students or some mix of the three. That obvious truth is missing from much of our political debate and the growing panic over student loans, which casts education debt as a tragedy rather than an investment. The hardening rhetoric against student loans threatens to undermine national success in broadening access to higher education, discouraging the very students we need.
This may sound strange coming from someone whose signature career achievement is a no-loans aid program. The whole idea behind the Carolina Covenant, which we launched at the University of North Carolina at Chapel Hill in 2003, was to assuage growing worry about student debt by eliminating loans for our lowest-income students.
But if we’re going to put higher education within reach of the millions more who would benefit, loans are going to be a crucial part of the equation. And that means students from all backgrounds -- especially low-income, first-generation and minority students -- need to understand reasonable student debt as an opportunity, not a crushing burden.
Middle- and upper-income families already have that view, which is why they’ve been willing to shoulder modest loans to earn valuable degrees. The vast majority of the increase in aggregate debt over the past few years -- the much-decried $1.3 trillion in student debt -- has come from more Americans pursuing a degree, a public policy success we ought to be celebrating. Millions of Americans have correctly seen higher education as a bridge to a better future.
For low-income and first-generation students, that bridge too often looks like a trap. Even modest loans can be frightening for families that have no experience of college investment, so they’re less willing to take that step. Overblown angst about debt threatens to entrench this class divide in ways that will prove deeply destructive to American higher education.
The promise of a no-loans education is as much about communication as about financing. For our lowest-income students at Carolina, it was meant to overcome the impression of debt as a hardship and a barrier. Our own research showed that it wasn’t a hardship -- students taking out modest loans for a quality education are almost invariably better off. But the perception was so strong among historically disadvantaged families that a no-loans promise for those students made sense.
We’re fortunate to have the resources for such a program, but most colleges and universities don’t -- especially not the regional public universities and community colleges that serve a disproportionate share of first-generation and minority students. If we’re going to move the needle on college access in the United States -- and we must, given our shifting demographics and the economic stakes -- then families have to get comfortable with personal investment in education.
That was certainly the story for my family many years ago. Having grown up in a small Midwest farming community with no resources for college, I took out more than $6,000 to cover my undergraduate education -- a sum that exceeds $41,000 in today’s dollars. And then there was the follow-on debt for graduate and professional education. It was scary, but it was also a privilege to use someone else’s money to improve my life. And that, fundamentally, is what students are doing when they use student loans to pursue an education.
If a “no loans” sentiment takes hold among students and policy makers, it will undermine access to college and make stories like mine less likely. It will reverse the democratization of higher education, devastating community colleges and public universities that are already stretched thin in their efforts to serve a diversifying student body.
We badly need a more focused conversation about the right balance of taxpayer money, donor support and other university funds that can offset the cost to students. But in any scenario I can envision, short of creating a true K-16 entitlement, student loans are going to remain a necessary part of the mix.
If we cut off opportunity capital in the name of protecting students or taxpayers, we will end up with less opportunity. The relatively few families who can afford it will continue to buy high-quality, immersive education for their children. And others -- no matter how talented, no matter how driven -- will be left with meager options.
That would be a tragedy, not an improvement. The lamentations of the antidebt crowd assume that policy makers will ride to the rescue with new funding, but it won’t happen. Money is not that plentiful any more -- not from the states, and not from the federal government, despite what some of the presidential campaigns have promised. The students who benefit from higher education are going to remain personally invested, and there’s nothing regrettable about that.
We should stop scaring families with misleading tales of ruinous debt, and stop heeding pundits who would prefer to make education a rarefied luxury. When it comes to opening doors for our most vulnerable students, responsible borrowing is a solution, not a problem.
Shirley Ort is associate provost and director of the Office of Scholarships and Student Aid at the University of North Carolina at Chapel Hill.
The U.S. Department of Education introduced a new rule on June 13 that could have an outsize negative impact on historically black colleges and universities.
And no one noticed.
As the former president of Bennett College -- the nation’s oldest historically black college for women -- I have been honored to play a role in increasing the immense opportunities HBCUs have provided to black students and other students of color over the past 150 years.
I have also witnessed the sharp increase of higher education costs, even as the importance of a good college degree continues to grow. Millennials will be burdened with more student loan debt than any other generation before them. According to The Wall Street Journal, cumulative outstanding student debt has surpassed an astounding $1 trillion. Yet with a decline in state and federal support -- states are now spending, on average, 20 percent less per student than they did in 2008, according to one think tank -- colleges and universities are more and more dependent on tuition for their financial stability.
Although HBCUs provide excellent academic opportunities for their students, they do not have the monetary security other colleges and universities enjoy. For example, top-rated HBCU Howard University maintains an endowment of about $660 million, while top-rated non-HBCU Harvard University has an endowment of $36 billion.
This fiscal contrast could become an immediate problem for HBCUs and their students in light of the Education Department’s new proposed rule.
The department recently announced the revised borrower defense to repayment regulation, which would allow students to sue their college or university and default on their loans if they believe the institution misled or defrauded them while they were enrolled. The original rule has been around for 20 years and provides essential protections for students who have been defrauded by their educators. The revised rule would greatly expand the criteria for students to sue their educators, with a far lower burden of proof on the student.
While I agree that students must be able to petition their educational providers for student loan forgiveness if they feel they have been defrauded, I worry about the unintended ramifications of such an enormously wide-open regulation. The Education Department has estimated it will have an economic impact of $4.2 billion in tuition repayments and other costs, but that could be just the tip of the iceberg. Institutions could also accumulate mounds of fees, as legal counsels attempt to wade through the vague and confusing regulations -- a cost HBCUs can ill afford.
The new rule has other costs and implications for HBCUs, as well, by requiring institutions to obtain new and costly letters of credit from lenders. HBCUs could be negatively impacted by “financial responsibility regulatory requirements,” which could threaten “their ability to continue their historic education mission,” according to a May 2016 letter from the United Negro College Fund.
My concerns mirror theirs.
According to a Gallup-Purdue University report, black students who graduated from historically black colleges felt more supported, both academically and emotionally, than their black peers at predominantly white institutions. Additionally, HBCUs graduate 18 percent of all African-American undergraduate students and 25 percent of all African-Americans in science, technology, engineering and math fields.
I had the privilege of working alongside many bright young women of color at Bennett who have graduated to become doctors, lawyers, teachers and engineers and have all made significant contributions to the American workforce. And I hope HBCUs can continue to produce such exemplary students of color.
Unfortunately, if this rule is implemented in its current form, opportunities for black students to receive the education they need to compete in the 21st century could decline. HBCUs would be forced to funnel their already limited monetary resources into unnecessary legal counsel instead of into the classrooms where they belong.
The proposed language in the rule is vague, difficult to understand and could cost taxpayers up to $43 billion over the next 10 years. The rule change was doubtless written in reaction to the May 2015 bankruptcy of Corinthian Colleges, a for-profit college system. The federal government may have to forgive millions of dollars in loans Corinthian students now owe. HBCUs are different from for-profit colleges, but the hastily written language of the rule makes no distinction among types of institutions.
We can all agree that students must have strong protections if they can prove they have been defrauded by their academic institution. Those protections already exist, and students should be better informed of their current rights and better empowered to pursue loan forgiveness in the case of legitimate grievances. But that shouldn’t come at the cost of financial instability, especially for HBCUs, whose fiscal position is often not as strong as that of traditionally white institutions. Policy makers should revisit the rule and include HBCUs in the public comment process, which should be extended to allow an examination of these issues.
I am hopeful that the Department of Education will consider these concerns and invite us to the discussion table before the comment period closes Aug. 1, and will do what’s in the best interest of students, educators and taxpayers. But in the meantime, it’s essential that our community make its voice heard.
Julianne Malveaux is an author and economist and the founder of Economic Education. She is the former president of Bennett College, America’s oldest historically black college for women.
The old joke about studying English went, “Would you like fries with that major?” I haven’t heard that joke in years. Barista has replaced fast food worker as the career of choice for warning against the perils of majoring in English.
What are we to make of this new old joke about the English major? Why did barista replace fast food worker? The fact is that English majors are not particularly likely to end up as baristas or as workers in the food service industry in general. Plenty of data is available to disprove this idea, so what does its persistence mean? The English major barista is a myth in the sense of being untrue. It is also a myth in the deeper sense of that word: a story that a culture tells itself to explain wishes or fears. In this case, fears.
First things first. Data show that English majors do not tend to end up as baristas. Each year, the U.S. Census Bureau conducts a detailed survey of about 1 percent of the national population. Called the American Community Survey, this census includes questions about age, educational attainment, field of degree and employment. Respondents to the survey cannot actually choose “barista” when reporting occupation, but they can choose the category “counter attendant, cafeteria, food concession and coffeehouse.” However, the number of people in this category is small when further segmented by field of degree.
A more reliable analysis groups this category along with several related ones, including bartenders, waiters, dishwashers and the like. That larger grouping does not literally count English majors who work as baristas, but it gets at the spirit of the claim, with greater statistical validity. If the destiny of the English major is service behind the coffee bar, then bartending, waiting tables or washing dishes cannot be far behind.
However, none of those food service jobs are the English major’s particular fate. According to the Census Bureau, graduates with an English degree have about a 4.9 percent chance of working in one of these food service occupations for some time between the ages of 22 and 26. By comparison, the average among all degree holders in this age group is about 3.5 percent. So English majors are only about 1.4 percentage points more likely to work in food service than the average for all degree holders.
When we look at mature workers, the data bear out a broader observation: majors in the humanities and social sciences take a little more time to find their career footing, but then they catch up with, and sometimes exceed, the earnings of graduates with more professional degrees. For degree holders ages 27 to 66, the percentage of graduates in English working in food service professions for some time during this 40-year period is 0.72 percent, or about one in 139 majors. Among all majors ages 27 to 66, the average is 0.48 percent. English remains higher than average, but not by much. The 0.24-point difference translates to an additional one in 417 chance of ending up working in food service at some point between the ages of 27 and 66.
So where, in fact, do English majors end up working? The top occupations for English-degree holders ages 27 to 66 are elementary and middle school teachers, postsecondary teachers, and lawyers, judges, magistrates and other judicial workers. Indeed, English majors, who go on to a range of careers, are less likely to work in food service than in many highly skilled positions, including as chief executives and legislators (1.4 percent), physicians and surgeons (1.2 percent), or accountants and auditors (1.2 percent). Parents worried that their children will study English and end up as baristas should know that their sons and daughters are statistically more likely to end up as CEOs, doctors or accountants than behind the counter of a Starbucks.
Level of education and age, rather than choice of major, most predict work in food service. Between the ages of 22 and 26, people who do not report a baccalaureate degree have a somewhat higher percentage of food service work than English-degree holders: 5.68 percent vs. 4.88 percent. For mature workers, ages 27 to 66, the corresponding numbers are 1.45 percent and 0.72 percent. For full-time mature workers, the difference a baccalaureate degree makes is particularly striking. English-degree holders ages 27 to 66 work full time in food service at a rate of 0.53 percent, those without a baccalaureate degree at 1.92 percent. Starbucks has made help with college degree completion a perk for its workers. If all those baristas had B.A.s in English, or any degree at all, there would be no need for this program.
Of course, the English major as barista is also shorthand for a general belief that a degree in English leads to underemployment -- that is, to jobs that really do not require a college degree. A recent study shows that around 12 percent of recent college graduates ages 22 to 27 with a degree in English work in low-skilled service jobs. That is the same percentage as for baccalaureate holders in this age group who majored in psychology and earth science, and 3.4 percentage points higher than the average for degree holders in general, which is 8.6 percent. Those percentages may be higher than we would like, but there’s nothing distinctive about English majors in them.
Fortunately, too, these percentages are for recent graduates; the same study shows that college graduates tend to mature out of these jobs. As we have seen, the English majors who do work in food service generally do so when they are young and as a first job -- a start, not an end. The coffeehouse is not their career.
To establish themselves in their careers, English majors need to show a bit more resourcefulness than do majors in narrowly preprofessional degrees. And year after year, that is exactly what real English majors do. They do not possess this resourcefulness in spite of their English degree or as a mere coincidence with it. Creative and independent thinkers are attracted to the English degree, and that course of study helps to develop their creativity and their initiative -- the same personal qualities that serve them so well in the working world after graduation.
So why the barista joke? It reflects negative attitudes about the English major itself rather than the realities of an English major’s likely employment. Since coffeehouses are places for reading, writing and talking, spending time in a coffeehouse is a lot like spending time in the study of English. Naturally enough, English majors like to hang out in them. STEM majors have their labs; English majors have their Starbucks. The joke about the English major barista implies, however, that unlike the science done in a lab, the study of English, whether pursued in coffeehouse or classroom, is without value. What better punishment for wasting this time than being sentenced to work at a coffeehouse rather than enjoying its pleasures, serving those who presumably chose some more valuable and lucrative major?
In this vengeful fantasy, moreover, the barista with an English B.A. contributes to the coffeehouse’s cultural sophistication, the human equivalent of its background jazz or pictures of Seattle circa 1971. The English major’s transformation into cultural wallpaper is part of the joke.
The English major makes an academic career out of studying literary culture or (still worse in the eyes of the major’s detractors) ordinary culture inflated into an academic subject. Having to work in a coffeehouse is punishment for that study, since students who are ambitious to become cultural elites instead find themselves in a lowly service industry, working in their local strip-mall Starbucks rather than sitting at a coffee bar in Florence. The particular name that Starbucks made famous for its workers -- “barista” -- along with all its pseudo-Italian terms, like “grande” for medium, is the foam on the Frappuccino. The joke implies that the job and its pretentious, pseudo-high culture name perfectly fit the empty pretensions of the major itself.
The Thin Bar
But this joke about frustrated aspiration is on us all. Consider the coffeehouse’s storied place in the history of European and Anglo-American modernity. Jürgen Habermas made famous the idea that the activities with which coffeehouses are still associated -- reading, writing, conversation -- made them nothing less than cradles of modern literature and democracy. The coffeehouse was a republic of letters, where literacy and the purchase of a cup of coffee were the only entry requirements to participation in literary and political worlds that had once been the exclusive province of courtly and hereditary elites. Coffeehouses were sometimes referred to in the 18th century as “penny universities.” (One still also had to be a man, although Habermas believes the ideals of the coffeehouse militated against even this restriction.)
Your local coffee spot may seem a far cry from a cradle of western democracy or a “penny university.” Particularly with regard to Starbucks, the criticism of the coffeehouse today is that it’s a place of faux culture and shallow consumption, where the other side of high-priced coffee drinks is the exploitation of coffee farmers in the third world and of the company’s own workers behind the counter. From that point of view, Starbucks is just about making money. “Everything else,” as one Starbucks critic puts it, is “window dressing.” As part of that window dressing, the Starbucks barista both serves and reflects a world narrowed to maximized profit and empty consumption.
“Getting and spending, we lay waste our powers,” the poet Wordsworth wrote. Still: Starbucks promises something more than getting and spending. However much our local Starbucks is a place to grab coffee as we rush to work, or an embarrassment of ersatz culture, the success of the Starbucks brand demonstrates a yearning for more fulfilling cultural and communal spaces of the sort described by Habermas. Starbucks doesn’t just sell coffee; it sells the coffeehouse ideal. It offers reading and music suggestions, has printed literary quotes on coffee cups, and has asked its baristas to start discussions about race in America. The criticism that greeted the last initiative is telling. Starbucks was seen as too corporate to serve as a place for genuine cultural or political exchange, however much it seems to promise it.
The fast-food joke consigned the English major to a low-paying and unfulfilling job. The barista joke consigns the English major to a low-paying and unfulfilling job that remains tantalizingly close to a more fulfilling coffeehouse ideal. To the extent that we also want that ideal, we’re that close, too. We, too, are attracted to the coffeehouse image of a richer cultural and communal life, even if that image promises more than harried working lives and corporate marketing can deliver. A thin bar separates the cultural aspirations, and disappointments, of Starbucks workers and consumers.
A similarly thin bar separates worker and consumer in terms of a feared economic decline. There was a time when we might have celebrated the English major’s drive to explore self and world in college, or as part of a career trajectory that involved some time for similar self-development and exploration of opportunities, before rushing headlong into a career. There was a time when we laughed at hearing the just-graduated Dustin Hoffman advised in The Graduate to stake his future on plastics. And there was a time when we understood that English majors, like other majors in the liberal arts, end up with far more than a salary -- they develop the sense of ethics, history and culture, and the habits of open and reasoned deliberation, that the coffeehouse ideal represents and that are essential to functioning democracies, not to mention to lives well lived.
Today, however, many people laugh at someone who seems unwilling to turn a college education into job training for the industry du jour in order to secure the highest-paying job straight out of college. English majors achieve successful careers, as the data show. That we consign them, in the myth of the English major barista, to a permanent life in food service says less about them and more about us -- about how afraid we have become of defying the market imperative to maximize profit, the single force, apparently, by which we are now supposed to guide our lives.
This fear is reasonable -- stagnant wages, the erosion of unions, the growing use of contract and part-time labor to replace full-time jobs, the increasing gap between rich and poor, and insufficiently regulated financial markets all contribute to the insecurity of middle-class life. For college students in particular, the withdrawal of states from the public funding of higher education, combined with rising tuition, makes any decisions seem risky if they don’t, as the saying goes now, make college an effective return on investment. But the fear is more than that. It is as if any defiance of profit maximization must be met with punishment: the condemnation to a life serving coffee.
We will only really dispel the myth of the English major barista when we confront head-on the structural economic problems and the narrow market ideology that drive the fear behind it. Meanwhile, in their own refusal to succumb to this fear, English majors can be confident they’ll do fine spending some time in coffeehouses -- whichever side of the bar they’re on.
Robert Matz is a professor of English and senior associate dean in the College of Humanities and Social Sciences at George Mason University.