Last month, while looking over thousands of listings for forthcoming books in dozens of university-press catalogs for this fall, I flagged 300 titles for further consideration as possible topics for future columns. Within that selection, a few clusters of books seemed to reflect trends, or interesting coincidences at least, and I noted a few of them here.
That survey, however unscientific and incomplete, was fairly well received. Here’s part two. As in the first installment, material in quotation marks is from catalog descriptions of the books. I’ve been sparing with links, but more information on each title is available from its publisher’s website, easily located via the Association of American University Presses directory.
Scholarly publishers might count as pioneers of what Jacob H. Rooksby calls The Branding of the American Mind: How Universities Capture, Manage and Monetize Intellectual Property and Why It Matters (Johns Hopkins University Press, October), although the aggregate profits from every monograph ever published must be small change compared to one good research partnership with Big Pharma. Rooksby explores “higher education’s love affair with intellectual property itself, in all its dimensions” and challenges “the industry’s unquestioned and growing embrace of intellectual property from the perspective of research in law, higher education and the social sciences.” (Sobering thought: In this context, “the industry” refers to higher education.)
Making intellectual property more profitable is Fredrik Erixon and Björn Weigel’s concern in The Innovation Illusion: How So Little Is Created by So Many Working So Hard (Yale University Press, October), which treats “existing government regulations and corporate practices” as a menace to economic growth and prosperity: “Capitalism, they argue, has lost its mojo.”
If so, Google is undoubtedly developing an algorithm to look for it. At least three books on Big Data try to chart its impact on research, policy and the way we live now. Contributors to Big Data Is Not a Monolith, edited by Cassidy R. Sugimoto, Hamid R. Ekbia and Michael Mattioli (The MIT Press, October), assess the scope and heterogeneity of practices and processes subsumed under that heading. Roberto Simanowski’s Data Love: The Seduction and Betrayal of Digital Technologies (Columbia University Press, September) warns of the codependent relationship between “algorithmic analysis and data mining,” on the one hand, and “those who -- out of stinginess, convenience, ignorance, narcissism or passion -- contribute to the amassing of ever more data about their lives, leading to the statistical evaluation and individual profiling of their selves.” Christine L. Borgman focuses on the implications of data mining for scholarly research in Big Data, Little Data, No Data: Scholarship in the Networked World (The MIT Press, September), first published last year and now appearing in paperback. While “having the right data is usually better than having more data” and “little data can be just as valuable as big data,” the future of scholarship demands “massive investment in knowledge infrastructures,” whatever the scale of data involved.
Events in real time occasionally rush ahead of the publishing schedule. Several months ago David Owen advised the British public to “vote leave” in The U.K.’s In-Out Referendum: E.U. Foreign and Defence Policy Reform (Haus Publishing, distributed by the University of Chicago Press), but it reaches the American market only this month. Christopher Baker-Beall analyzes The European Union’s Fight Against Terrorism: Discourse, Policies, Identity (Manchester University Press, September) with an eye to “the wider societal impact of the ‘fight against terrorism’ discourse” in the European Union and “the various ways in which this policy is contributing to the ‘securitization’ of social and political life within Europe.” Recent developments suggest this will be a growing field of study.
The E.U.’s days are numbered, according to Larry Elliott and Dan Atkinson, because Europe Isn’t Working (Yale University Press, August). Or, more precisely, the euro isn’t. The currency “has failed to deliver on its promise of more jobs, more growth and greater equality,” and the E.U.’s “current policy of kicking the can down the road and hoping that something will turn up” can’t continue forever. A less fatalistic account of The Euro and the Battle of Ideas by Markus K. Brunnermeier et al. (Princeton University Press, August) traces the currency’s vicissitudes to “the philosophical differences between the founding countries of the Eurozone, particularly Germany and France.” But “these seemingly incompatible differences can be reconciled to ensure Europe’s survival.”
Meanwhile, on this side of the Atlantic, it’s time to start phasing out paper money, argues Kenneth S. Rogoff in The Curse of Cash (Princeton, August). The bigger denominations ($100 and up) enable “tax evasion, corruption, terrorism, the drug trade, human trafficking and the rest of a massive global underground economy” and have also “paralyzed monetary policy in virtually every advanced economy.” Small bills and coins are not such a problem, but the Franklins (and larger) could be replaced by a state-backed digital currency. For now, Arvind Narayanan et al. reveal “everything you need to know about the new global money for the internet age” in Bitcoin and Cryptocurrency Technologies: A Comprehensive Introduction (Princeton, August), complete with “an accompanying website that includes instructional videos for each chapter, homework problems, programming assignments and lecture slides.” Perfectly honest and law-abiding people will find the book of interest, but it seems like a must-read for anyone with a professional commitment to tax evasion, the drug trade and the like.
As it happens, the fall brings a bumper crop of scholarship on crime, punishment and policing, at varying levels of abstraction and grit. Andrew Millie’s Philosophical Criminology (Policy Press, distributed by the University of Chicago Press, November) is described as “the first book to foreground this emerging field” -- which it certainly is not. Whatever the contribution of the book itself, hype at this level counts as a species of counterfeiting. The anthropologists Jean Comaroff and John L. Comaroff compare developments in South Africa, the United States and the United Kingdom in The Truth About Crime: Sovereignty, Knowledge, Social Order (University of Chicago, December), while the contributors to Accusation: Creating Criminals, edited by George Pavlich and Matthew P. Unger (University of British Columbia, October) consider “the founding role that accusation plays in creating potential criminals.” Here we find another large claim: “[T]his book launches an important new field of inquiry.” As an armchair criminologist, I am curious to learn how this differs from the venerable and well-worked field of labeling theory.
Closer to the street, Michael D. White and Henry F. Fradella consider Stop and Frisk: The Use and Abuse of a Controversial Policing Tactic (NYU Press, October) -- a practice much in the headlines in recent years, usually in connection with the issue of racial profiling. Their conclusions -- that “stop and frisk did not contribute as greatly to the drop in New York’s crime rates as many proponents … have argued,” but also that “it can be judiciously used to help deter crime in a way that respects the rights and needs of citizens” -- are sure to provoke arguments from a variety of perspectives.
Forrest Stuart was stopped on the street for questioning 14 times in the first year of field work for Down, Out, and Under Arrest: Policing and Everyday Life in Skid Row (University of Chicago Press, August), “often for doing little more than standing there.” He finds that the “distrust between police and the residents of disadvantaged neighborhoods” is “a tragedy built on mistakes and misplaced priorities more than on heroes and villains”; parties on both sides “are genuinely trying to do the right thing, yet too often come up short.”
Another ethnographic dispatch from the extremes of poverty, Christopher P. Dum’s Exiled in America: Life on the Margins in a Residential Motel (Columbia University Press, September) reports on the “squalid, unsafe and demeaning circumstances” of the housing of last resort “for many vulnerable Americans -- released prisoners, people with disabilities or mental illness, struggling addicts, the recently homeless, and the working poor.” The catalog entry for the book doesn’t mention it, but you feel the police presence all the same.
The overcrowding of American prisons is often explained as the byproduct of draconian mandatory sentencing laws, but Wisconsin Sentencing in the Tough-on-Crime Era: How Judges Retained Power and Why Mass Incarceration Happened Anyway by Michael M. O’Hear (Wisconsin, January) argues that even in “a state where judges have considerable discretion in sentencing … the prison population has ballooned anyway, increasing nearly tenfold over forty years.” Over the same period, long-term solitary confinement has grown increasingly commonplace, as discussed in a column from six months ago concerning an anthology of writings by scholars, activists and prisoners. Keramet Reiter offers a case study in 23/7: Pelican Bay Prison and the Rise of Long-Term Solitary Confinement (Yale University Press, October). The title refers to how many hours a day prisoners spend “in featureless cells, with no visitors or human contact for years on end, and they are held entirely at administrators’ discretion.”
The practice signals that prison authorities have not just abandoned the idea of reformation but moved on to something more severe: a clear willingness to destroy prisoners’ minds. By contrast, Daniel Karpowitz’s College in Prison: Reading in an Age of Mass Incarceration (Rutgers University Press, February) describes Bard College’s program offering undergraduate education to New York state prisoners. The book serves as “a study in how institutions can be reimagined and reformed in order to give people from all walks of life a chance to enrich their minds and expand their opportunities” while making “a powerful case for why liberal arts education is still vital to the future of democracy in the United States.”
Daniel LaChance’s Executing Freedom: The Cultural Life of Capital Punishment in the United States (University of Chicago Press, October) asks why, by “the mid-1990s, as public trust in big government was near an all-time low,” a staggering 80 percent of Americans supported the death penalty. “Why did people who didn’t trust government to regulate the economy or provide daily services nonetheless believe that it should have the power to put its citizens to death?” The question implies a belief in the consistency and coherence of public opinion that is either naïve or rhetorical; in any case, the author maintains that “the height of 1970s disillusion” led to a belief in “the simplicity and moral power of the death penalty” as “a potent symbol for many Americans of what government could do” -- and, presumably, get right. That confidence has been shaken by a long string of reversals of verdict in recent years, which “could prove [the death penalty’s] eventual undoing in the United States.”
Given the brazen, methodical and massively destructive corruption leading to the near collapse of the world’s financial system eight years ago, Mary Kreiner Ramirez and Steven A. Ramirez call for a new variety of capital punishment in The Case for the Corporate Death Penalty: Restoring Law and Order on Wall Street (NYU Press, January). “Despite overwhelming proof of wide-ranging and large-scale fraud on Wall Street before, during and after the crisis,” the government’s response amounted to “fines that essentially punished innocent shareholders instead of senior leaders at the megabanks.” Crony capitalism and white-collar crime will continue until the danger of corporate conviction -- having the company’s charter revoked, i.e., putting it out of business -- is credibly on the table.
In effect, if corporations enjoy the legal protection granted them by the Supreme Court’s dubious but effective interpretation of the 14th Amendment, they also should face the possibility of being put to death -- after due process, of course. And fair enough, although the last word here comes from that bumper sticker saying “I’ll believe corporations are people when Texas executes one.”
The most significant challenge facing higher education today is our growing economic segregation. College completion rates for those at the lowest socioeconomic rungs continue to lag far behind those of their wealthier peers, not only due to diminished financial resources but also because of a lack of social and cultural capital. Redressing this phenomenon will require offering an education that prepares each and every student for success in work and life, while inspiring them to take seriously their social responsibilities in a society plagued by persistent inequities.
In fact, the board of directors of the Association of American Colleges and Universities, where I serve as president, expanded the organization’s mission in 2012 to embrace both inclusive excellence and liberal education as the foundation for institutional purpose and educational practice. The addition of inclusive excellence as one of AAC&U’s foundational principles reflects the ideal that access to educational excellence for all students -- not just the privileged -- is essential not only for our nation’s economy but, more important, for our democracy. Democracy cannot flourish in a nation divided into haves and have-nots.
The equity imperative as an essential component of educating for democracy has been at the forefront of my mind during the past few weeks of nonstop coverage of the Republican and Democratic National Conventions. I have been particularly focused on the potential impact of various higher education policy proposals on AAC&U’s objective of advancing a public-spirited vision of inclusive excellence as inextricably linked to liberal education.
While higher education issues were largely absent from the Republican convention speeches, my attention had already been caught by an earlier proposal from Donald Trump, developed by his educational policy adviser, Sam Clovis, to restrict eligibility for student loans in order to make it more difficult for students at “nonelite colleges” to major in the liberal arts. Indeed, I am convinced that, if enacted, it would risk exacerbating what Thomas Jefferson termed an “unnatural aristocracy,” in which only the wealthy gain the benefits of the kind of broad and engaged liberal education that Clovis himself insists is the absolute foundation for success in life.
Trump’s proposal makes at least two serious errors about the value of a college degree in today’s world. It assumes, first, that one’s undergraduate major is all that matters and, second, that only some majors will prepare students for success in the workplace. The evidence from AAC&U’s own surveys of employers, and from many economists, suggests that this is simply not the case. As noted in the title of our 2013 report of employers’ views, “It Takes More Than a Major,” more than 90 percent of employers agree that “a graduate’s ability to think critically, communicate clearly and solve complex problems is more important than their undergraduate major.” Students can develop such cross-cutting skills in a wide variety of chosen disciplines, if the courses are well designed and integrated within robust, problem-based general education programs.
A student’s undergraduate experience, and how well the experience advances critical learning outcomes, is what matters most, with 80 percent of employers agreeing that all students need a strong foundation in the liberal arts and sciences. A liberal education fosters the capacity to write, speak and think with precision, coherence and clarity; to propose, construct and evaluate arguments; and to anticipate and respond to objections. And it offers what employers value the most: the ability to apply knowledge in real-world settings, to engage in ethical decision making and to work in teams on solving unscripted problems with people whose views differ from one’s own. In a globally interdependent yet multicultural world, it is precisely because employers place a particular premium on innovation in response to rapid change that they emphasize students’ experiences with diverse populations, rather than narrow technical training.
The data confirm what we already know: students in all undergraduate majors can and should gain the outcomes of a broad liberal education. Therefore, we need to be vigilant in rebutting accusations of irrelevance and illegitimacy leveled specifically at the liberal arts and sciences and to recognize those charges for what they are: collusion in the growth of an intellectual oligarchy in which only the very richest and most prestigious institutions preserve access to the liberal arts traditions. Trump’s ostensible presumption that college is only about workforce training is dangerous to our democratic future.
Of course, it is unclear whether a proposal to use student loans to steer students away from certain majors could be implemented, given the challenges of predicting career trajectories based on majors and types of institutions. (After all, I was a philosophy major who began at a community college under funds from the Comprehensive Employment and Training Act, Pell Grants and Perkins Loans.)
Still, in order to restore public trust in higher education and destabilize the cultural attitudes at the basis of Trump’s policy proposal, we need to demonstrate in a more compelling way to those outside of the academy, Democrats and Republicans alike, the extent to which we actually are teaching students 21st-century skills, preparing them to solve our most pressing global, national and local problems within the context of the workforce, not apart from it. To do so, our institutions of higher education must come together to engage in an honest assessment of our effectiveness and undertake a collaborative exchange of best practices. Our shared commitments to equity, democratic and economic vitality, and inclusive excellence demand nothing less.
Lynn Pasquerella is president of the Association of American Colleges and Universities.
Lincoln University -- a historically black university located in Jefferson City, Mo. -- suspended its major in history on its 150th anniversary. Explaining why that step was necessary, the president of the university emphasized, “We must make decisions like these as we look toward the future and the needs of the changing workforce.” Embedded within that statement is a declaration about higher education and its purpose: higher education should make good, high-paid workers. We should step back and ask whether this is really what we want from higher education.
Since I took my first academic position in 2010, I have continually heard from the news media, visiting speakers and many other people that transforming students into employees is the purpose of higher education. Whenever I hear this, I cannot help but recall one particular graduate seminar when we discussed the writings of the Marxist philosopher Louis Althusser. The discussion turned to higher education, and some people in the class claimed higher education was little more than part of a plot to provide good and obedient workers to the bourgeoisie. At the time, I thought that was overly reductive. I mean, we were talking about the supposed conspiracy of the bourgeoisie in class at an institution of higher education; surely this was not part of the plan.
Once I got my first academic job, however, I learned that this really was the perennial question in higher education. What should our general education curriculum look like? On which majors should we focus our resources? The answer was always put in the form of another question -- what do employers want from our graduates?
Perhaps because of the rising costs of higher education, politicians have increasingly said that the point of higher education is for students to make lots of money in their chosen careers. Is that what we want from higher education? Maybe the better question is: Is that the only thing we want from higher education?
In her recent article in The American Historian, Nancy F. Cott indicates it is hard for humanities degrees -- like history -- to compete with degrees related to engineering if the only significant variable is potential earnings. One study found that throughout their careers, engineers consistently earned more than graduates in the humanities. But then, not everyone wants to be an engineer. As Cott phrased it, neither would we really want “to see an educated world populated by engineers only.” The fact is people educated in the humanities go on to important, although often not quite as lucrative, careers in education, government, law and a host of other interesting and relevant occupations.
Since students enter into significant debt to earn their diplomas, it seems reasonable for students to expect some return on their often significant investments. I hope as we review what we value in education, however, we do not simply ask which majors lead to the most lucrative careers.
Du Bois and Shaping Lives in the Present
What is higher education for? Should it exist solely for the purpose of manufacturing workers who make the greatest amount of money? It’s not a new question. It’s one that the renowned African-American historian W. E. B. Du Bois wrestled with in his speech commemorating Lincoln University’s 75th anniversary in 1941. He worried that the temptation would “come and recur to make an institution like this, a means of earning a living or of adding to income rather than an institution of learning.” Du Bois believed the kind of students Lincoln produced would end up changing the world for the better -- that it would be Lincoln students who would “show the majority the way of life.” Not from privileged and “powerful groups which from time to time rule the world have come salvation and culture,” he said, “but from the still small voice of the oppressed and the determined who knew more than to die and plan more than mere survival.” In short, Du Bois hoped that Lincoln would become “a center where the cultural outlook of this country is to be changed and uplifted and helped in the reconstruction of the world.”
Why did Du Bois believe that students at a university like Lincoln would be so influential? Du Bois recognized the power of history to shape lives in the present, and he rightly believed that this nation needed more diverse students if the status quo was ever going to change. In Du Bois’s day, history was being used to justify violence against African-Americans. In 1915, the original version of The Birth of a Nation premiered in the United States. In that movie, President Woodrow Wilson’s book History of the American People was regularly quoted. Audiences around the country saw Wilson declare through this movie that Reconstruction had been a misguided failure during which “the negroes were the office holders, men who knew none of the uses of authority, except its insolences.”
Wilson and many other people in the academy were part of what eventually became known as the Dunning School of Reconstruction History. For William Dunning, the historian for whom the broader school was named, Reconstruction was a failure because great numbers of the recently emancipated slaves “gave themselves up to testing their freedom. They wandered aimless but happy through the country.”
According to Dunning, it was Southern whites who “devoted themselves with desperate energy to the procurement of what must sustain the life of both themselves and their former slaves.” Lesson learned: black political participation meant misery for all, but exclusive white control meant the best for both black and white Southerners. The Dunning School of Reconstruction History justified the exclusion of black people from politics, and it implicitly justified the violence used to maintain that exclusion.
W. E. B. Du Bois labored to contradict those impressions. In his now widely read The Souls of Black Folk, Du Bois argued that it was not the irresponsible silliness of black people that doomed Reconstruction but rather the impossible problems facing the recently freed slaves. Reflecting upon the failure of efforts to make Southern African-Americans truly free, Du Bois noted that the Freedmen’s Bureau could not even “begin the establishment of goodwill between ex-masters and freedmen,” and perhaps most important, it could not “carry out to any considerable extent its implied promises to furnish the freedmen with land.”
Adding to the impossible challenge was the fact that much of the legislation created during Reconstruction was intended to punish the white South rather than empower the recently emancipated. As viewed by Du Bois, black equality was a cudgel used to punish the rebellious South rather than a goal in and of itself. Without any real support for black equality in either the North or the South, how could we expect anything but failure from Reconstruction? Because of those failures, black people suffered under the weight of white supremacy.
White historians largely ignored Du Bois’s conclusions for years; it was not until higher education expanded to include a wide swath of the American population -- due in large part to the GI Bill -- that more historians came to accept what he had long argued. Today, the vast majority of historians of Reconstruction accept his premise that many capable black politicians participated in Reconstruction governments, working to extend public roads and public education to a much broader share of the Southern population. At the time, their opponents saw this as waste and corruption, but the vision of those black politicians more closely aligned with our own expectations. We -- like them -- expect our governments to maintain public roads and public education. History looks different from the bottom up.
Reversing Dominant Narratives
Du Bois did not mention the degree in history specifically in his speech in 1941, but his life’s work demonstrated the importance he placed upon the historical imagination. He correctly predicted that making the academy more diverse would change the world for the better. History has been used to justify white supremacy, and it has been used to undermine it as well. As the population of historians has changed, so too has the accepted narrative of the academy. That’s why Du Bois did not ask which majors earned the most money upon graduation but had a loftier vision for Lincoln’s future. America needed impassioned graduates from schools like Lincoln. Someone had to help reverse the dominant narratives about black inferiority prevalent in 1941.
On Lincoln University’s 75th anniversary, Du Bois provided a powerful argument in favor of empowering Lincoln’s students to go and change the world. I fear that the end of history at Lincoln University means students will have less ability to do so in the future. That saddens me, because our national history is particularly relevant today. In 2016, a reinterpretation of The Birth of a Nation is set to debut and likely make claims radically different from those of its 1915 namesake. Why did the creators of this new movie -- which will document the slave rebellion led by Nat Turner -- give it that name? In 2016, some people have suggested that the civil rights movement of the 1960s was relatively short and its goals were largely accomplished. How then do we explain the emergence of the Black Lives Matter movement? Do these protesters fail to understand just how racially progressive our country has become? In 2016, some politicians have suggested that the United States is a nation founded by white ideas -- or “Western civilization” -- and people of color are guests. Are they right?
Our history as a nation has been used to answer those kinds of questions, and someone is going to be answering these questions in the future. In addition to asking what employers want our graduates to do, we should also ask whom we want to answer such important questions.
Graduates -- whether in the humanities, sciences or engineering -- will continue to get relevant and interesting jobs. Some will get paid more than others. In finding the right major, students will have to make strategic choices about what they want for their lives. Having spoken with many students, I know many are not so single-mindedly focused upon profit. Many have more philanthropic purposes in mind for their education. By so circumscribing the range of possibilities, however, we are creating a future in which Lincoln’s graduates will be able to get jobs but maybe not make history.
J. Mark Leslie is an associate professor of history at Lincoln University.
As students prepare to return to school for the coming academic year, there are 65,000 high school seniors who lack a clear path to college because they are undocumented. While undocumented students have access to K-12 public education, their options abruptly become scarce when they turn 18: in addition to the barriers that many low-income students face, these students must navigate a higher education system that excludes them, either explicitly or de facto.
One glaring obstacle is that undocumented students are ineligible for federal financial aid. Another is that access to public institutions, usually the most affordable option, varies by state. While some states offer resident tuition and state financial aid, others prohibit undocumented students from enrolling altogether. Other states fall in the middle of the spectrum, providing in-state rates to students with Deferred Action for Childhood Arrivals at some public universities. (A federal administrative policy implemented in 2012, DACA provides Social Security numbers and the eligibility to work and drive to individuals who arrived in the United States as children and meet certain age and education requirements. However, it does not provide a path to citizenship. Since its implementation, roughly 700,000 undocumented youth and young adults have received DACA status.)
Given this landscape, private colleges and universities have an opportunity to be key players in promoting higher education access for undocumented students nationwide. Most, though not all, selective private institutions already accept undocumented or “DACAmented” students, but as of now, information and resources for undocumented applicants are difficult to find. So difficult, in fact, that students have taken the issue into their own hands: a group of undergraduates at Harvard University started a nonprofit, Higher Dreams, to serve as a “comprehensive resource” for undocumented applicants interested in applying to private colleges and universities. Sarahi Espinoza Salamanca, a student from California, created the DREAMer’s Roadmap app to help undocumented students find scholarships for college.
Meanwhile, institutions themselves should do their part and take a far more deliberate approach: there is a great difference between accepting students and making college truly accessible. If they are serious about their stated commitments to access, opportunity and diversity, they should recognize their potential to make a difference. They should anticipate and welcome applications from undocumented students, actively make an effort to understand their circumstances and specific needs, and adopt policies that follow through on meeting those needs.
Colleges can take several steps. First, they can educate admissions staff so that potential applicants who are undocumented will receive accurate information. Better yet, they can hire or designate a staff person to specialize in working with undocumented students. Unfortunately, that is not the norm; many admissions personnel, though well meaning, are not equipped to answer questions from undocumented applicants. Staff education is a basic and important place to start.
Another key to increasing access is changing admissions and financial-aid policies to reflect the reality of undocumented students’ lives. Many independent colleges count them as international applicants -- a highly competitive pool. Accepted students are often charged international tuition rates, which are prohibitively high even for middle-income families, and they are eligible only for competitive merit scholarships. Implicit in this policy is the idea that undocumented students are more aptly compared to international students than to American citizens, which is patently inaccurate. Having attended American high schools and spent a significant, formative part of their lives in the United States, they should be considered within that context, not judged alongside international applicants whose experiences are virtually incomparable.
Experiential similarities and moral arguments aside, students with DACA work and have Social Security numbers -- like their American peers, and unlike international students. With or without DACA, they pay taxes. The only practical difference between them and their citizen peers, then, from an admissions perspective, is their lack of access to federal aid or loans. Admissions and financial-aid policies should reflect that reality and consider undocumented students as domestic applicants, eligible for aid based on demonstrated need.
Finally, institutions should publicize their commitment to working with undocumented students, who too often go unacknowledged. If a college or university already accepts undocumented students, it should shift from a don’t ask, don’t tell mentality to one of active inclusion. Some institutions have dedicated admissions pages specifically for undocumented students that include FAQs, resources and contacts. Publicizing such information is a small but meaningful act: it provides targeted support, which undocumented students so rarely get, and makes a statement that they are truly welcome.
In essence, it is simply not enough for colleges and universities to accept undocumented students tacitly and passively. It is not enough to accept undocumented students but then charge exorbitant tuition. If an institution welcomes undocumented students in principle by allowing them to apply, then those students deserve the same level of targeted support that American citizens receive when it comes to the application process and financial aid -- not to mention student services once in college.
Some institutions are already leading the way. Oberlin College, for example, encourages undocumented students to apply, counts them as domestic applicants and provides need-based aid. Emory University recently adopted the same policy for students with DACA. (The state of Georgia, meanwhile, legally blocks undocumented students from enrolling in its top five state schools, so Emory has made a statement by providing an alternative option.) Tufts University “proactively and openly” recruits and provides aid for undocumented students, with or without DACA, and Swarthmore College rolled out a similar policy this spring, arguing that as a campus that values “different viewpoints, identities and histories among our students,” it invites all students, regardless of citizenship status, to apply.
The intentional nature of these policies and the tangible changes to the institutions’ recruitment and financial-aid strategies are what make their statements more than just lip service. Many more institutions should follow suit.
Lily McKeage is a recent graduate of the Harvard Graduate School of Education and program director at YES Scholars in New York City.
Over the weekend I went through the fall 2016 catalog of every publisher belonging to the Association of American University Presses. Or at least I tried -- a number of fall catalogs have not been released yet, or else the publishers have hidden the PDFs on their websites with inexplicable cunning. (It seems as if savvy publicists would insist that catalogs be featured so prominently on the homepage that it’s almost impossible to overlook them. Perhaps half my time went to playing “Where’s Waldo?” -- so evidently not.) A few sites hadn’t been updated in at least a year. At one of them, the most recent catalog is from 2012, although the press itself seems still to be in existence. Let’s just hope everyone there is OK.
After assembling roughly 70 catalogs, I began to cull a list of books to consider for this column in the months ahead, which now runs to 400 titles, give or take a few, with more to be added as the search for Waldo continues. When you take an overview of a whole season’s worth of university-press output in one marathon survey, you can detect certain patterns or themes. A monograph on the white-power music underground? Duly noted. A second one, publishing a month later? That is a bit more striking. (The journalistic rule of thumb is that three makes a trend; for now, we’re left with a menacing coincidence.)
Some of the convergences seemed to merit notice, even in advance of the books themselves being available. Here are a few topical clusters that readers may find of interest. The text below in quotation marks after each book comes from the publisher’s description of it, unless otherwise specified. I have been sparing about the use of links, but more information on the books and authors can be readily found online.
“Whither democracy?” seems like an apt characterization of quite a few titles appearing this autumn and early winter. Last year, Jennifer L. Hochschild and Katherine Levine Einstein asked, Do Facts Matter? Information and Misinformation in American Politics, published by the University of Oklahoma Press and out in paperback this month, concluding that “citizens’ inability or unwillingness to use the facts they know in their political decision making may be frustrating,” but the real danger comes from “their acquisition and use of incorrect ‘knowledge’” put out by unscrupulous “political elites.” By contrast, James E. Campbell’s Polarized: Making Sense of a Divided America (Princeton University Press, July) maintains that if the two major parties are “now ideologically distant from each other and about equally distant from the political center” it’s because “American politics became highly polarized from the bottom up, not the top down, and this began much earlier than often thought,” meaning the 1960s.
Frances E. Lee sets the date later, and the locus of polarization higher in the body politic, in Insecure Majorities: Congress and the Perpetual Campaign (University of Chicago Press, September). She sees developments in the 1980s unleashing “competition for control of the government [that] drives members of both parties to participate in actions that promote their own party’s image and undercut that of the opposition, including the perpetual hunt for issues that can score political points by putting the opposing party on the wrong side of public opinion.”
Democracy: A Case Study by David A. Moss (Harvard University Press, January 2017) takes fierce partisanship as a given in American political life -- not a bug but a feature -- and recounts and analyzes 19 episodes of conflict, from the Constitutional Convention onward. Wasting no time in registering his dissent, the libertarian philosopher Jason Brennan comes out Against Democracy (Princeton, August) on the grounds that competent governance requires rational and informed decision making, while “political participation and democratic deliberation actually tend to make people worse -- more irrational, biased and mean.” The alternative he proposes is “epistocracy”: rule by the knowledgeable. Good luck with that! Reaching that utopia from here will be quite an adventure, especially given that some voters regard “irrational, biased and mean” as qualifications for office.
Fall, when the current election cycle ends, will also be the season of books on the Anthropocene -- the idea that human impact on the environment has been so pronounced that we must define a whole phase of planetary history around it. There is an entry for the term in Fueling Culture: 101 Words for Energy and Environment (Fordham University Press, January), and it appears in the title of at least three books: one from Monthly Review Press (distributed by NYU Press) in September and one each from Princeton and Transcript Verlag (distributed by Columbia University Press) in November. Stacy Alaimo’s Exposed: Environmental Politics and Pleasures in Posthuman Times (University of Minnesota Press, October) opens with the statement “The Anthropocene is no time to set things straight.” (The author calls for “a material feminist posthumanism,” and it sounds like she draws on queer theory as well, so chances are “straight” is an overdetermined word choice.)
The neologism is tweaked in Staying With the Trouble: Making Kin in the Chthulucene (Duke University Press, September) by Donna J. Haraway, who “eschews referring to our current epoch as the Anthropocene, preferring to conceptualize it as what she calls the Chthulucene, as it more aptly and fully describes our epoch as one in which the human and nonhuman are inextricably linked in tentacular practices.” Someone in a position to know tells me that Haraway derives her term from “chthonic” (referring to the subterranean) rather than Cthulhu, the unspeakable ancient demigod of H. P. Lovecraft’s horror fiction. Maybe so, but the reference to tentacles suggests otherwise.
A couple of titles from Columbia University Press try to find a silver lining in the clouds of Anthropocene smog -- or at least to start dispersing them before it’s too late. Michael E. Mann and Tom Toles pool their skills as atmospheric scientist and Pulitzer-winning cartoonist (respectively) in The Madhouse Effect: How Climate Change Denial Is Threatening Our Planet, Destroying Our Politics and Driving Us Crazy (September), which satirizes “the intellectual pretzels into which denialists must twist logic to explain away the clear evidence that man-made activity has changed our climate.” Despite its seemingly monitory title, Geoffrey Heal’s Endangered Economies: How the Neglect of Nature Threatens Our Prosperity (December) is actually an argument for “conserving nature and boosting economic growth” as mutually compatible goals.
If so, it will be necessary to counter the effects of chickenization -- which, it turns out, is U.S. Department of Agriculture slang for “the transformation of all farm animal production” along factory lines, as described in Ellen K. Silbergeld’s Chickenizing Farms and Food: How Industrial Meat Production Endangers Workers, Animals and Consumers (Johns Hopkins University Press, September). Tiago Saraiva shows that the Germans began moving in the same direction, under more sinister auspices, in Fascist Pigs: Technoscientific Organisms and the History of Fascism (The MIT Press, September): “specially bred wheat and pigs became important elements in the institutionalization and expansion of fascist regimes …. Pigs that didn’t efficiently convert German-grown potatoes into pork and lard were eliminated.” A different sociopolitical matrix governs the contemporary American “pasture-raised pork market,” of which Brad Weiss offers an ethnographic account in Real Pigs: Shifting Values in the Field of Local Pork (Duke University Press, August).
And finally -- for this week, anyway -- there is the ecological and biomedical impact of the free-ranging creatures described in Peter P. Marra and Chris Santella’s Cat Wars: The Devastating Consequences of a Cuddly Killer (Princeton, September). Besides the fact that cats kill “birds and other animals by the billions” in the United States, the authors warn of “the little-known but potentially devastating public health consequences of rabies and parasitic Toxoplasma passing from cats to humans at rising rates.” The authors also maintain that “a small but vocal minority of cat advocates has campaigned successfully for no action in much the same way that special interest groups have stymied attempts to curtail smoking and climate change.” I write this while wearing a T-shirt that reads “Crazy Cat Guy” but will be the first to agree that the problem here is primarily human. There’s a reason it’s called the Anthropocene and not the Felinocene.
A number of other themes and topics from university presses’ fall offerings might bear mentioning in another column later this summer. With luck, the pool of candidates will grow in the meantime; we’ll see if any new trends crystallize out in the process.
College is not free, and never will be. Someone is always paying -- taxpayers, private donors, students or some mix of the three. That obvious truth is missing from much of our political debate and the growing panic over student loans, which casts education debt as a tragedy rather than an investment. The hardening rhetoric against student loans threatens to undermine national success in broadening access to higher education, discouraging the very students we need.
This may sound strange coming from someone whose signature career achievement is a no-loans aid program. The whole idea behind the Carolina Covenant, which we launched at the University of North Carolina at Chapel Hill in 2003, was to assuage growing worry about student debt by eliminating loans for our lowest-income students.
But if we’re going to put higher education within reach of the millions more who would benefit, loans are going to be a crucial part of the equation. And that means students from all backgrounds -- especially low-income, first-generation and minority students -- need to understand reasonable student debt as an opportunity, not a crushing burden.
Middle- and upper-income families already have that view, which is why they’ve been willing to shoulder modest loans to earn valuable degrees. The vast majority of the increase in aggregate debt over the past few years -- the much-decried $1.3 trillion in student debt -- has come from more Americans pursuing a degree, a public policy success we ought to be celebrating. Millions of Americans have correctly seen higher education as a bridge to a better future.
For low-income and first-generation students, that bridge too often looks like a trap. Even modest loans can be frightening for families that have no experience of college investment, so they’re less willing to take that step. Overblown angst about debt threatens to entrench this class divide in ways that will prove deeply destructive to American higher education.
The promise of a no-loans education is as much about communication as about financing. For our lowest-income students at Carolina, it was meant to overcome the impression of debt as a hardship and a barrier. Our own research showed that it wasn’t a hardship -- students taking out modest loans for a quality education are almost invariably better off. But the perception was so strong among historically disadvantaged families that a no-loans promise for those students made sense.
We’re fortunate to have the resources for such a program, but most colleges and universities don’t -- especially not the regional public universities and community colleges that serve a disproportionate share of first-generation and minority students. If we’re going to move the needle on college access in the United States -- and we must, given our shifting demographics and the economic stakes -- then families have to get comfortable with personal investment in education.
That was certainly the story for my family many years ago. Having grown up in a small Midwest farming community with no resources for college, I took out more than $6,000 to cover my undergraduate education -- a sum that exceeds $41,000 in today’s dollars. And then there was the follow-on debt for graduate and professional education. It was scary, but it was also a privilege to use someone else’s money to improve my life. And that, fundamentally, is what students are doing when they use student loans to pursue an education.
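The inflation adjustment behind that comparison is a simple consumer-price-index ratio. A minimal sketch in Python -- the index values below are illustrative assumptions chosen to match the rough 1970s-to-2016 span, not official CPI figures:

```python
def adjust_for_inflation(amount: float, cpi_then: float, cpi_now: float) -> float:
    """Scale a historical dollar amount by the ratio of price indexes."""
    return amount * (cpi_now / cpi_then)

# Illustrative only: an early-1970s index near 35 against a 2016 index
# near 240 turns a $6,000 loan into roughly $41,000 in today's dollars.
adjusted = adjust_for_inflation(6000, 35.0, 240.0)
print(round(adjusted))  # ≈ 41143
```

Any real calculation would substitute the official CPI-U values for the two years in question.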
If a “no loans” sentiment takes hold among students and policy makers, it will undermine access to college and make stories like mine less likely. It would reverse the democratization of higher education, devastating community colleges and public universities that are already stretched thin in their effort to serve a diversifying student body.
We badly need a more focused conversation about the right balance of taxpayer money, donor support and other university funds that can offset the cost to students. But in any scenario I can envision, short of creating a true K-16 entitlement, student loans are going to remain a necessary part of the mix.
If we cut off opportunity capital in the name of protecting students or taxpayers, we will end up with less opportunity. The relatively few families who can afford it will continue to buy high-quality, immersive education for their children. And others -- no matter how talented, no matter how driven -- will be left with meager options.
That would be a tragedy, not an improvement. The lamentations of the antidebt crowd assume that policy makers will ride to the rescue with new funding, but it won’t happen. Money is not that plentiful anymore -- not from the states, and not from the federal government, despite what some of the presidential campaigns have promised. The students who benefit from higher education are going to remain personally invested, and there’s nothing regrettable about that.
We should stop scaring families with misleading tales of ruinous debt, and stop heeding pundits who would prefer to make education a rarefied luxury. When it comes to opening doors for our most vulnerable students, responsible borrowing is a solution, not a problem.
Shirley Ort is associate provost and director of the Office of Scholarships and Student Aid at the University of North Carolina at Chapel Hill.
The U.S. Department of Education introduced a new rule on June 13 that could have an outsize negative impact on historically black colleges and universities.
And no one noticed.
As the former president of Bennett College -- the nation’s oldest historically black college for women -- I have been honored to play a role in increasing the immense opportunities HBCUs have provided to black students and other students of color over the past 150 years.
I have also witnessed the sharp increase of higher education costs, even as the importance of a good college degree continues to grow. Millennials will be burdened with more student loan debt than any generation before them. According to The Wall Street Journal, cumulative outstanding student debt has surpassed an astounding $1 trillion. Yet with a decline in state and federal support -- states are now spending, on average, 20 percent less per student than they did in 2008, according to one think tank -- colleges and universities are more and more dependent on tuition for their financial stability.
Although HBCUs provide excellent academic opportunities for their students, they do not have the monetary security other colleges and universities enjoy. For example, top-rated HBCU Howard University maintains an endowment of about $660 million, while top-rated non-HBCU Harvard University has an endowment of $36 billion.
This fiscal contrast could become an immediate problem for HBCUs and their students in light of the Education Department’s new proposed rule.
The department recently announced the revised borrower defense to repayment regulation, which would allow students to sue their college or university and default on their loans if they think that the institution misled or defrauded them during the time they were enrolled. The original rule has been around for 20 years and provides essential protections for students who have been defrauded by their educators. The revised rule would greatly expand the criteria for students to sue their educators, with a far lower burden of proof on the student.
While I agree that students must be able to petition their educational providers for student loan forgiveness if they feel they have been defrauded, I worry about the unintended ramifications of such an enormously wide-open regulation. The Education Department has estimated it will have an economic impact of $4.2 billion in tuition repayments and other costs, but that could be just the tip of the iceberg. Institutions could also accumulate substantial legal fees as their counsel attempts to wade through the vague and confusing regulations -- a cost HBCUs can ill afford.
The new rule has other costs and implications for HBCUs, as well, by requiring institutions to obtain new and costly letters of credit from lenders. HBCUs could be negatively impacted by “financial responsibility regulatory requirements,” which could threaten “their ability to continue their historic education mission,” according to a May 2016 letter from the United Negro College Fund.
My concerns mirror theirs.
According to a Gallup-Purdue University report, black students who graduated from historically black colleges felt more supported, both academically and emotionally, than their black peers at predominantly white institutions. Additionally, HBCUs graduate 18 percent of all African-American undergraduate students and 25 percent of all African-Americans in science, technology, engineering and math fields.
I had the privilege of working alongside many bright young women of color at Bennett who have graduated to become doctors, lawyers, teachers and engineers and have all made significant contributions to the American workforce. And I hope HBCUs can continue to produce such exemplary students of color.
Unfortunately, if this rule is implemented in its current form, opportunities for black students to receive the education they need to compete in the 21st century could decline. HBCUs would be forced to funnel their already limited monetary resources into unnecessary legal counsel instead of into the classrooms where they belong.
The proposed language in the rule is vague, difficult to understand and could cost taxpayers up to $43 billion over the next 10 years. The rule change was doubtless written in reaction to the May 2015 bankruptcy of Corinthian Colleges, a for-profit college system. The federal government may have to forgive millions of dollars in loans Corinthian students now owe. HBCUs are different from for-profit colleges, but the hastily written language of the rule makes no distinction among types of institutions.
We can all agree that students must have strong protections if they can prove they have been defrauded by their academic institution. Those protections already exist, and students should be better informed of their current rights and better empowered to pursue loan forgiveness in the case of legitimate grievances. But that shouldn’t come at the cost of financial instability, especially for HBCUs, whose fiscal position is often not as strong as that of traditionally white institutions. Policy makers should revisit the rule and include HBCUs in the public comment process, which should be extended to allow a fuller examination of these issues.
I am hopeful that the Department of Education will consider these concerns and invite us to the discussion table before the comment period closes Aug. 1, and will do what’s in the best interest of students, educators and taxpayers. But in the meantime, it’s essential that our community make our voices heard.
Julianne Malveaux is an author and economist and the founder of Economic Education. She is the former president of Bennett College, America’s oldest historically black college for women.