When people talk about for-profit colleges, they often do so with disdain. If you are concerned about vulnerable people making expensive educational decisions with little information, then you might disdain the “predatory” for-profit schools. If you think that a strong work ethic can trump all manner of troubles, you might disdain the “weak” people who go to a “predatory” school. What is interesting to me is how much disdain is spread among students and schools and how little disdain there is for labor markets.
More than any other kind of college, for-profit colleges are judged by their ability to get their students jobs. And, given their high dropout rates and poor job-placement rates, we often blame them for what are, in fact, labor market failures.
Today’s lingo to describe how we work is “the new economy.” This new economy has produced a new breed of for-profit colleges that constitute a parallel education universe I have dubbed “Lower Ed.” Unlike the mom-and-pop for-profit colleges of yesteryear, these for-profit colleges are massive corporations and have generated billions of dollars in advertising revenue for broadcast and digital media. From the start, they were quite clear with investors and regulators that their market niche was contingent upon deteriorating labor market conditions. Poor labor market outcomes for their graduates (and nongraduates) are part of their business plan.
Job data has become the grounds on which we not only judge the quality of for-profit colleges, but also wage regulatory battles on behalf of the public (consumer) good. Some of this emphasis is due to how we regulate for-profit colleges. The job-placement data is part of the federal gainful-employment regulation that, to summarize, says programs that advertise as pathways to jobs must actually lead to said jobs and provide job-placement data to help students make good choices about their education.
For-profit colleges spend a lot of money pushing back on gainful-employment regulations. Yet through interviews with for-profit college executives, I have discovered that gainful employment is treated more like an unavoidable cost of doing business than the heated political rhetoric would suggest. As one vice president of a national shareholder chain told me with a sigh, “Well, gainful employment is the cost of dealing with the feds.”
The public fights over job-placement data and gainful-employment regulations keep lots of people in business. Politicians look tough when they issue a statement in favor of gainful employment. Regulators relish press releases of cases filed against predatory for-profit colleges based on job-data manipulation. For-profit colleges look like they’re being led by the U.S. Department of Education to add a layer of expensive regulatory compliance against their will. They write to their investors and financial regulators about the “necessary requirements” of complying.
But does regulating job-placement data and gainful employment protect the public interest amid the turmoil of the new economy? It is hard to see how it does. The premise is simple: data makes for better choices. But this assumes that better choices are available, and I’m not sure that they are.
Consider Janice, a 28-year-old black registered nurse who worked in a hospital and enrolled in a for-profit college bachelor’s program. Janice was caught in the middle of a professionalization shift among nurses. Whereas the field had formerly required only a post-high school certificate in nursing, it was increasingly common to earn a bachelor’s degree in nursing.
That kind of professionalization and educational inflation falls under the “declining internal labor markets” rubric of the new economy. Unlike in the past, when experience and subsequent licensures might be obtained through an employer -- in this case, a hospital -- the expectation now is that workers will increase their human capital at personal expense to “move up” the professional ladder. Janice’s choices for promotion were limited: she could hope for favorable reviews from a sympathetic management culture (a risky proposition) or earn a bachelor’s degree in nursing.
Janice described her workplace culture to me as one where people formed alliances with people who were similar to them. That meant the white nurses congregated with each other at work and sometimes socially. They attended the same nursing program and shared a common knowledge base, all of which felt like a form of exclusion to Janice.
Janice only indirectly attributed this dynamic to race, a distance that is probably similar to how that exclusion feels: ambivalent and hard to identify, but easy to feel. It could be about race only to the extent that so few black R.N.s had their bachelor’s degrees in nursing or had gone to the same nursing program as the nurses who had more management power. And that dynamic could be about race only to the extent that one might be less likely to have the financial means to enroll in the competitive nursing program. Because the program is one of the only ones in the local area to offer the degree, it is routinely at capacity. That means one could apply and be on a waiting list for a year or longer.
Janice felt that she couldn’t afford that kind of time off from greater earnings or promotability. Her ability to “afford” time could be about race and certainly about class and was likely about how all of those are always interacting at the same time. For Janice, time and access were expensive in ways that the debt she incurred attending a for-profit degree program in nursing was not.
Janice’s “choices” were instructive. In fact, of the 109 students, formerly or presently enrolled in for-profit colleges, whom I interviewed between 2011 and 2015, no one talked about the context of their college choices in ways that would suggest that more accurate or clear job-placement data would have changed their circumstances or decisions.
Instead, they talked about a credential as insurance against risks they could not continue to bear alone. JJ, a military veteran at a for-profit college, was particularly exasperated by the nonchoices available to him. Community college was infantilizing. Traditional four-year colleges were impractical. Why should people who have served their country have to “start over,” was the gist of his argument.
I’ve led grown men in the battlefield. I’ve managed over $1.5 million of mission-critical assets at any given time. I’ve taken weeks strait [sic] of leadership development courses. I’ve been directly responsible for soldiers’ lives. I needed a piece of paper that would translate my expertise to employer terms.
What JJ really needed was to not need a credential at all. It was only when the conditions of the labor market devalued his and Janice’s experiences that they considered college. Job statistics won’t change the conditions of the labor market for people.
A Negative Social Insurance Program
Political wrangling over job statistics looks like action, but it is mostly a distraction. Sociologist David Brown has shown that credentials can be created without jobs to justify them. We produce risky credentials when how we work changes dramatically, and the way we work shapes what kind of credentials we produce. If we have a shitty credentialing system, in the case of for-profit colleges, then it is likely because we have a shitty labor market.
To be more precise, we have a labor market in which the social contract between workers and employers, on which college has previously relied, has fundamentally changed, leaving more workers vulnerable.
Substantial evidence suggests all of these changes have shifted new risks to workers. Employer tenure for young workers has dropped at the same time that part-time and temporary work has increased, meaning many workers expect to change jobs more frequently. Essentially, their employment is constantly temporary. As the rhetoric goes, the new economy values knowledge workers with cognitive skills, and degrees represent those kinds of skills. If that’s the case, the new economy has shed high-paid but low- to mid-skilled cognitive work in favor of high-skilled labor and low-wage, low-skilled labor. The best-case scenario holds that this is a decade-long labor-market correction: the labor market will catch back up, and millions will find themselves back in “middle-skill” jobs with middle-class wages and work conditions.
In this best-case scenario, workers have taken on debt waiting for the market to correct itself. Depending on the kind of debt and who took it on, it’s either manageable or crushing. And for the most vulnerable workers, the only way to remediate some of that debt is to accrue more of it by going back to school. If for-profit colleges like ITT are no longer around, then another form of short-term, on-demand credentials will respond to consumer demand by extracting profit from student loans and education savings accounts.
It is not an accident that financialized shareholder for-profit colleges expanded in the 2000s. Changes in how we work created demand for fast credentials. The federal student aid system made those credentials “cheap,” in the sense that students do not pay much for them up front. The new economy, by all accounts, will require all of us to maintain near-constant skills training so as to be employable and put a far greater onus on individuals to extend their education.
So far, our policy has been to rely on the student loan system to finance that onus. To the extent that has fueled for-profit colleges, our government response has positioned them as social insurance against labor-market innovation (or disruption, depending on your perspective). Let me be clear: these are all conditions that are expected to persist, if not accelerate, imposing individual costs for job retraining repeatedly over the working life course.
Our national response has been to increase public money to private profit-extraction regimes. That is, in effect, a negative social insurance program. Whereas actual social insurance, like Social Security, protects citizens from the vicissitudes of predatory labor-market relationships, negative social insurance does not.
A negative social insurance program positions private-sector goods to profit from predictable systemic social inequalities, ostensibly for the public good. How did for-profit colleges define their market? They said that greater inequalities in secondary schooling produced demand for higher education without a viable means for millions of people to attain it. They said that employers were less interested in providing in-house corporate training and more desiring of credentials to certify work experience. They said that the military and other public-sector employers were shedding jobs. These aren’t secrets.
If the new dominant work arrangement divests employers of the cost for their employees’ training or certification, workers will pursue certification and credentialing schemes. If we know the cost of those schemes is primarily funded through taxpayer-supported federal student aid programs, then we already have a mechanism for providing social insurance. But when we facilitate spending that benefits institutions that maximize cost to extract profit, we have perverted the public-good mission of social insurance.
Early in 2016, I attended a conference where people in education technology offered everything from online platforms for massive open online courses to financing schemes to help people borrow private money for short-term coding boot camp courses. In their presentations, they depicted a future of work where employers couldn’t find enough “on-demand,” “skilled” labor for “the jobs of the future.” They showed earnings gaps between those with credentials and those without. They described how inefficient graduate programs at traditional universities are because they ramp up too slowly, cost too much and take too long to finish.
Yet we know that tech jobs are disproportionately filled with white and Asian men and that the tech industry has demonstrated problems hiring and promoting women and ethnic and racial minorities. Like the early days of for-profit colleges’ Wall Street era, the new credentialism promises credentials in high-wage, high-demand jobs that have statistical discrimination baked into them.
New institutions and new credentials are by definition lacking in prestige, the kind of prestige that lower-status workers and students need for their credential to combat discrimination in the labor market. Opening the federal student aid spigot without paying attention to how this ends for the poorest makes us all vulnerable. And turning on the spigot is precisely where we seem to be going.
In 2015, the Education Department launched a pilot program to help people like those boot camp coders use federal student aid money to pay for their programs. Organizations that participate in the program could apply for a special waiver of regulatory and statutory requirements usually associated with gaining access to federal student aid. They didn’t have to offer a degree or a certificate, usually defined by some standard credit hour of attendance, or be accredited. This program -- the Educational Quality Through Innovative Partnerships (EQUIP) -- is said to encourage and reward “entrepreneurialism” in the higher education sector. The impetus? The jobs of the 21st century need mobile workers with specialized skills that employers will not pay for. It is the same pitch that shareholder for-profit colleges made to investors in the 1990s.
The proposed future of higher education looks a lot like the start of the Wall Street era of for-profit college expansion: occupational credentials in narrow fields, paid for through public financing schemes that start with exemplars of high-status white men in high-pay jobs and offer little hope for anyone else. By 2016, we knew how this ended for shareholders of for-profit colleges, but we’ve not yet fully counted the social cost. Meanwhile, one wonders how high student loan defaults, constrained choices, predictably poor job outcomes and negligible upward social mobility for those trapped in Lower Ed serve the public good.
A survey of 7,000 freshmen at colleges and universities around the country found just 6 percent of them able to name the 13 colonies that founded the United States. Many students thought the first president was Abraham Lincoln, also known for “emaciating the slaves.” Par for the course these days, right?
It happens that the study in question was reported in The New York Times in 1943. The paper conducted the survey again during the Bicentennial, using more up-to-date methods, and found no improvement. “Two‐thirds [of students] do not have the foggiest notion of Jacksonian democracy,” one history professor told the Times in 1976. “Less than half even know that Woodrow Wilson was president during World War I.”
Reading the remark now, it’s shocking that he was shocked. After 40 years, our skins are thicker. (They have to be: asking the current resident of the White House about Jacksonian democracy would surely be taken as an invitation to reminisce about his “good friend,” Michael.)
The problem with narratives of decline is that they almost always imply, if not a golden age, then at least that things were once much better than they are now. The hard truth in this case is that they weren’t. On the average, the greatest generation didn’t know any more about why The Federalist Papers were written, much less what they said, than millennials do now. The important difference is that today students can reach into their pockets and, after some quick thumb typing and a minute or two of reading, know at least something on the topic.
How to judge all this is largely a question of temperament -- of whether you see their minds as half-empty or half-full. Tom Nichols conveys the general drift of his own assessment with the title of his new book, The Death of Expertise: The Campaign Against Established Knowledge and Why It Matters, published by Oxford University Press. The author is a professor of national security affairs at the U.S. Naval War College and an adjunct professor at the Harvard Extension School.
He sees the longstanding (probably perennial) shakiness of the public’s basic political and historical knowledge as entering a new phase. The “Google-fueled, Wikipedia-based, blog-sodden collapse of any division between professionals and laymen, students and teachers” is like a lit match dropped into a gasoline tanker-sized container filled with the Dunning-Kruger effect. (It may seem comical that I just linked to Wikipedia to explain the effect, but it’s a good article, and in fact David Dunning himself cites it.)
Nichols knows better than to long for a better time before technology shattered our attention spans. He quotes Alexis de Tocqueville’s observation from 1835: “In most of the operations of the mind, each American appeals only to the individual effort of his own understanding.” This was basic to Jacksonian democracy’s operating system, in which citizens were, Tocqueville wrote, “constantly brought back to their own reason as the most obvious and proximate source of truth. It is not only confidence in this or that man which is destroyed, but the disposition to trust the authority of any man whatsoever.”
The difference between a self-reliant, rugged individualist and a full-throated, belligerent ignoramus, in other words, tends to be one of degree and not of kind. (Often it’s a matter of when you run into him and under what circumstances.) Nichols devotes most of his book to identifying how 21st-century American life undermines confidence in expert knowledge and blurs the lines between fact and opinion. Like Christopher Hayes in The Twilight of the Elites, he acknowledges that real failures and abuses of power by military, medical, economic and political authorities account for a good deal of skepticism and cynicism toward claims of expertise.
But Nichols puts much more emphasis on the mutually reinforcing effects of media saturation, confirmation bias and “a childish rejection of authority in all its forms” -- as well as the corrosive effects of credential inflation and “would-be universities” that “try to punch above their intellectual weight for all the wrong reasons, including marketing, money and faculty ego.” Unable to “support a doctoral program in an established field,” Nichols says, “they construct esoteric interdisciplinary fields that exist only to create new credentials.”
Add the effect of consumerism and entertainment on the academic ethos, and the result is a system “in which students learn, above all else, that the customer is always right,” creating a citizenry that is “undereducated but overly praised” and convinced that any claim to authoritative knowledge may be effectively disputed in the words of the Dude from The Big Lebowski: “Yeah, well, you know, that’s just, like, your opinion, man.”
As a work of cultural criticism, The Death of Expertise covers a good deal of familiar territory and rounds up the usual suspects to explain the titular homicide. But the process itself is often enjoyable. Nichols is a forceful and sometimes mordant commentator, with an eye for the apt analogy, as when he compares the current state of American public life to “a hockey game with no referees and a standing invitation for spectators to rush onto the ice.”
But one really interesting idea to take away from the book is the concept of metacognition, which Nichols defines as “the ability to know when you’re not good at something by stepping back, looking at what you’re doing, and then realizing that you’re doing it wrong.” (He gives as an example good singers: they “know when they’ve hit a sour note,” unlike terrible singers, who don’t, even if everyone else winces.)
“The lack of metacognition sets up a vicious loop, in which people who don’t know much about a subject do not know when they’re in over their head talking with an expert on that subject. An argument ensues, but people who have no idea how to make a logical argument cannot realize when they’re failing to make a logical argument …. Even more exasperating is that there is no way to educate or inform people who, when in doubt, will make stuff up.”
The implications are grave. In 2015-16, Donald Trump ran what Nichols calls “a one-man campaign against established knowledge,” and he certainly pounded the expertise of most pollsters into the dirt. He is now in a position to turn the big guns on reality itself; that, more than anything else, seems to be his main concern at present. Nichols writes that research on the Dunning-Kruger effect found that the most uninformed or incompetent people in a given area were not only “the least likely to know they were wrong or to know that the others were right” but also “the most likely to try to fake it, and the least able to learn anything.” That has been shown in the lab, but testing now continues on a much larger scale.
Few topics in higher education are getting more attention than credential innovation: making credentials digital, introducing new credential types and communicating more information about learning outcomes.
Credential innovation moves transcripts, certificates and diplomas beyond accounting and verification records for transfer and graduate admissions, or mechanisms for validating completion of a university’s degree program. For many institutions, a clear driver of innovating the form and function of their credentials is a belief that today’s transcripts do not communicate what employers care about.
There are two ways to think about this issue. It’s possible that credentials don’t communicate what employers care about because colleges don’t actually provide what the labor market wants. And plenty of people say that. But I (very emphatically) believe that’s generally not true. In fact, credential innovation is so important because colleges do provide much of what employers are looking for. The problem instead is that they just don’t assess, document and communicate those outcomes. So the information is lost, and students are left to their own devices to effectively represent their collegiate experience. And the impact of colleges and universities is left implicit, not explicit.
If higher education institutions do equip students with much of what employers want, and credentials are the mechanism for communicating those outcomes per student to an employer, then the foundational question animating credential innovation projects should be: What do employers want to know? A vice provost of academic affairs recently asked me this question, and it’s stuck with me ever since.
In fact, I heard the same question, frequently, at a recent Lumina Foundation gathering where community colleges, universities and third-party vendors shared how they are experimenting with a comprehensive student record. It’s a national question for higher education as a whole, but even more so a geographically local and industry-specific question that every institution needs to consider distinctly. Let’s face it, there is no such thing as the unitary actor “employer” any more than there is such a thing as the unitary actor “higher education.”
Surprisingly, too many credential innovation projects lack grounding in clear answers to this simple question. Like the proverbial drunk looking for car keys where the light is brightest, they start with a clear conception of what the college or university can say beyond courses and credits, not a clear conception of what their employer market has asked to understand about their graduates. That makes sense insofar as colleges and universities (should) each have a distinct educational mission and program, and it is the outcome of that program or mission that credentials should communicate. But if the goal is better communication and alignment, leaving the question of employer interest unasked makes it much less likely that the underlying goal of credential innovation will be achieved.
In 2014, Hart Research Associates conducted an online survey among 400 employers on behalf of the Association of American Colleges and Universities. The majority of employers said that possessing both field-specific knowledge and a broad range of knowledge and skills is important for recent college graduates to achieve long-term career success. In fact, 80 percent of employers said it would be very or fairly useful to see an electronic portfolio that summarizes and demonstrates a candidate’s accomplishments in key skill and knowledge areas, including effective communication, knowledge in their field, applied skills, evidence-based reasoning and ethical decision making. It is important to note that, along with course work, those are the very types of capabilities that institutions help students build through activities like co-curricular leadership, study abroad and faculty research collaborations.
Today, community colleges and regional four-year schools often work closely with a particular employer or set of employers that define their local economy. The Kentucky Community and Technical College system, the North Carolina Community College system and many other institutions and systems design programs and assess outcomes in ways that are closely aligned with what those employers are looking for.
What appears less common is a national dialogue between institutions and employers, particularly regarding white-collar jobs -- the kind of jobs where writing well, speaking well, thinking analytically and being comfortable with numbers are important skills to have. (Those are the types of skills that I describe as desired outcomes for my students as an assistant research professor of sociology at Arizona State University.)
Of course, to ground credential innovation in an understanding of what employers want, institutions must begin the credential innovation journey by determining which employers matter to them and their students -- in particular those who actually consume their credentials today -- so they can ask how those employers use credential documents and what they think about them.
Digital credential platforms like Parchment, where I work, enable registrars to see where their credentials are going, which can then be the pathway for opening up this discussion. For example, we observe in Parchment’s data that Ernst & Young collects academic credentials, as does the Department of Homeland Security. Other national employers that receive transcripts from applicants via Parchment include Boeing, Deloitte, Hewlett-Packard, NASA, the U.S. Army, the U.S. State Department and Wells Fargo, to name a few.
Asking what employers want to know inevitably sparks a second question: Given what employers are looking for, does our educational environment develop that in students? While I think it does (and far more than we’re given credit for), the relevant experiences are not necessarily tracked or assessed and can span multiple information systems. They may not have the rigor and integrity that the recording of course work involves, and by choosing to organize course work into a priori majors and minors, clusters of courses that reflect particular skills or ways of thinking can be lost. While program innovation is no doubt needed to meet employers’ needs, introducing a better and more formal assessment of what we’re already doing is low-hanging fruit we can harvest without redesigning programs or creating new ones.
This brings me to a third and final question: How do we document and communicate this information? The answer is providing a credential that is certified and succinct, operationally efficient, and of reflective, scaffolding value to the learner, so that students can maximize their time in college to prepare themselves for the labor market.
For example, Elon University has provided a co-curricular, or experiential, credential for many years as part of its educational experience, which includes undergraduate research, global engagement, leadership, service and internship. Although the experiences were not documented originally with employers in mind (they reflect Elon’s distinctive educational philosophy), the university recently began surveying employers who received their new experiential transcript about what was useful and what was not. They found that 75 percent of employers agreed that the experiential transcript provided useful information for the hiring process, while 44 percent agreed that the experiential transcript increased the chances that an applicant would get an interview.
Communicating credential information effectively means supporting the development of data standards, as today’s national employers increasingly rely on applicant tracking systems with algorithms that use a wide variety of data sources to evaluate prospective employees. According to a 2016 report from Gartner, an information technology research and advisory firm, those algorithms will replace both manual processing of CVs (résumés) by recruiters and automated CV ranking based on word matching. The beauty of data is that you can convey a superset of information and package it into different types of credentials for different audiences (transfer transcript, employer college experiences report), and align with the data-processing practices of employers.
If you are a higher education leader beginning the credential innovation conversation, consider these three questions. What do our employers want to know about our learners? What do we need to do programmatically to accurately and reliably communicate those outcomes? And, recognizing the various audiences and purposes that credentials serve, what form of credential can best communicate it?
I encourage you to look at your credentials with fresh eyes as a currency for opportunity and the key to reaffirming the transformative value of a postsecondary education, especially in a knowledge economy. By taking a new approach to academic credentials, higher education can give employers what they are looking for and help students turn those credentials into opportunities.
Matthew Pittinsky is an assistant research professor at Arizona State University and CEO of Parchment Inc., a digital transcript company in K-12 and higher education.
As someone who has been a faculty member and a dean, as well as a college president, and who has worked on college campuses as a consultant for the last decade, I believe that U.S. Secretary of Education Betsy DeVos was describing an alternative reality to that on most campuses when she insisted to college students at the recent Conservative Political Action Conference, “The faculty, from adjunct professors to deans, tell you what to do, what to say and, more ominously, what to think.”
Such a blanket characterization of the professoriate and college administrators, or what DeVos calls the “education establishment,” distorts a basic fact: at most colleges and universities in the United States -- public and private, sectarian and nonsectarian -- faculty members and administrators are dedicated to teaching their students to think independently and critically in order to prepare them, as Thomas Jefferson advocated, to be part of an educated citizenry. Although occasional news stories appear about faculty members -- not all of whom are liberal in their politics -- making political statements in their classes, such moments are far from the norm.
The diversity of institutions of higher education in the United States is one of this country’s greatest strengths. Nevertheless, even with that diversity, with few exceptions, most American colleges and universities put at the heart of their curricula the goal of teaching students critical inquiry and what Harvard University and others call “new ways of understanding and new ways of knowing.” Students are routinely taught to reflect on and to challenge what they read, hear and even think. They are taught to argue logically and to support their arguments with evidence. They are encouraged to embrace complexity or, as Ralph Ellison’s Invisible Man put it, to become “acquainted with ambivalence.” Those matters are at the heart of what most faculty members and administrators seek for their students.
A look at three university mission statements will illustrate the point.
Harvard, which is often the model for other institutions, puts it this way in its mission statement: “Beginning in the classroom with exposure to new ideas, new ways of understanding and new ways of knowing, students embark on a journey of intellectual transformation. Through a diverse living environment, where students live with people who are studying different topics, who come from different walks of life and have evolving identities, intellectual transformation is deepened and conditions for social transformation are created. From this we hope that students will begin to fashion their lives by gaining a sense of what they want to do with their gifts and talents, assessing their values and interests, and learning how they can best serve the world.”
Dominican University of California, for which I've consulted in the past, has a similar mission, which it describes both more succinctly and in some ways more expansively: “Dominican educates and prepares students to be ethical leaders and socially responsible global citizens who incorporate the Dominican values of study, reflection, community and service into their lives. The university is committed to diversity, sustainability and the integration of the liberal arts, the sciences and professional programs.”
Princeton University, too, highlights its “commitment to innovation, free inquiry and the discovery of new knowledge and new ideas” and then emphasizes -- as most institutions do -- that this commitment must be “coupled with a commitment to preserve and transmit the intellectual, artistic and cultural heritage of the past.”
It may also be that DeVos has failed to learn one of the key tenets of effective argument, that (as the cliché has it) one swallow does not make a summer. Rather, she may be basing her notion of what happens on college campuses on her limited experience as a student at and graduate of Calvin College, which is clear in its mission statement that one of its primary goals relates to “shaping and confirming the values” that will guide Calvin College students for the “rest of their lives.”
Specifically, unlike many other faith-based colleges that welcome students of all faiths and no faiths, Calvin College, in its expanded mission statement, is absolutely clear that it seeks to ensure that the values guiding their students “will be Christian ones, in accordance with biblical revelation. Such a view provides both the coherence of our curriculum and a goal for our curriculum. Its relationship with the denomination provides the college with moral authority.”
Calvin College, as a private institution, is within its rights to structure its programs in the way it does. But its approach of seeking to significantly influence the values, religious or otherwise, of its students is far from typical.
DeVos may also be underestimating the ability of college students, a population that includes adults as well as traditional-age students, to think for themselves. The protests that have appeared on numerous campuses in recent years indicate that many students are not blithely accepting institutional policies and practices on an array of levels. The involvement of students in votes of no confidence in their presidents (at places as diverse as Florida Atlantic University, Ithaca College, St. Louis University and Transylvania University) further gives the lie to any assumption of student complacency and unthinking acquiescence to authority.
Because of her limited experience with higher education, I would encourage DeVos to educate herself about the many strengths of American higher education and the widespread commitment to the values of the liberal arts. Because of space limitations, I am suggesting only a brief list. Others might wish to send their suggestions directly to the secretary at the Department of Education.
For a comprehensive view of the value of the liberal arts, she might begin with Fareed Zakaria’s In Defense of a Liberal Education.
To understand the profound impact higher education can have on students from underprivileged backgrounds, I urge her to read Ron Suskind’s account of how the life of a first-generation college student, Cedric Jennings, was transformed by his education at Brown, A Hope in the Unseen.
She might also look at a report released earlier this week by the Association of American Colleges and Universities, “On Solid Ground,” which points to the importance of assessing student achievement in “critical thinking, written communication and quantitative literacy.”
Finally, I would encourage DeVos to read the President’s Message written by Patricia McGuire of Trinity College, a Catholic institution in Washington, D.C., which defines “the qualities essential to effective leadership in our ever-changing global environment.” These are: “The ability to think critically, to write and speak clearly, to make ethical judgments, to know the context of history and literature, to understand the fundamental economic and political forces affecting the psychology of whole peoples.”
Susan Resneck Pierce is president of SRP Consulting, LLC. Her most recent books are On Being Presidential (Jossey-Bass 2011) and Governance Reconsidered (Jossey-Bass 2014).
Calhoun College is gone with the wind. Last month, Yale University announced it was removing the name of 19th-century politician and pro-slavery alumnus John C. Calhoun from one of its residential colleges. In the same week, Centre College of Kentucky reported that the name of the late U.S. Supreme Court Associate Justice John C. McReynolds was being removed from a campus building because donor McReynolds’s record of racial intolerance and anti-Semitism in public office from 1914 to 1941 was at odds with Centre's mission and values. The decisions at both institutions followed thoughtful deliberations.
Previously, on Jan. 25, the University of South Carolina honored the memory of Richard T. Greener, the university’s first African-American professor, by unveiling a model of a statue “to commemorate the Reconstruction-era pioneer.” Education professors Christian Anderson and Katherine Chaddock (emerita) teamed with art history professor Lydia Brandt and graduate students in the higher education program to lead efforts over several years to recognize Greener (who also was the first African-American graduate of Harvard College).
An irony of the dedication event was that the Greener statue is adjacent to the Thomas Cooper Library, named in honor of a legendary university president from 1820 to 1833 whose teaching of Calhoun’s “nullification theory” was popular with students. Cooper’s curriculum of political economy, combined with emphasis on oratory, helped to make the University of South Carolina the alma mater of numerous Southern governors, senators and congressmen who championed slavery and states’ rights in national politics up to the state's secession in 1861.
All this attention to monuments means that history matters on the American campus. And the motley mix of building names shows that the heritage of higher education is complex and even conflicting in its symbols and celebrations. Deletions and additions of campus figures can be at best indicative of healthy renewal, both in their process and decisions. What’s also wise is that colleges and universities take care not to erase all symbols and signs of the partisan slavery-advocating alumni and politicians of an earlier era -- a point that Yale President Peter Salovey made in the letter he released announcing the removal of Calhoun’s name.
Why is such an accommodation important? Each day on my way to the University of Kentucky, I pass by a state historic marker on the sidewalk noting the apartment where Jefferson Davis, later president of the Confederacy, roomed when he was an undergraduate from 1821 to 1824 at nearby Transylvania University. Even though I may not celebrate Davis’s leadership legacy, the historical marker assures that I remember him and daily confront his historic presence at his alma mater and as part of the heritage of the city and the state. Other campuses elsewhere should retain some symbols and signs of their comparable legacy even if -- or rather, especially if -- those historic figures conflict with contemporary ideals of social justice and political equity. This is important because colleges and universities are historic institutions -- and genuine history is complex and not for the forgetful or fainthearted.
Colleges and universities face unfinished business in some other areas of heritage and representation in their campus monuments. For starters, at many campuses few buildings are named to honor distinguished women. That neglect may be changing. Yale has shown imagination and sound values by renaming the former Calhoun College in honor of Grace Murray Hopper, who earned her master’s degree and Ph.D. from the university in the 1930s and then was a pioneer in computer science, as well as serving as a rear admiral in the U.S. Navy. Will other colleges and universities follow Yale’s example?
At the University of California, Berkeley, I would like to see a building named to honor Laura Nader, who has been an internationally acclaimed anthropology professor for more than a half century. An appropriate historic site would be the Faculty Club, where newcomer Nader persuaded a few women colleagues in 1960 to join her in climbing through a window to attend a universitywide faculty meeting. It was the only way they could gain entrance to the all-male faculty club building. Her pioneering act in a long, distinguished scholarly career deserves to be marked prominently for all to see while walking across the campus.
If women are underrepresented in the names on our campus buildings, donors, in stark contrast, are abundant. Just consider the names of entire institutions: Brown, Carnegie, Clark, Cornell, Duke, Harvard, Johns Hopkins, Rice, Tulane, Vanderbilt, Vassar and Yale, for starters. Donors also have added power to shape institutional memory because they have the right to have a building named in honor of a campus figure they designate. Given American higher education’s historical dependence on philanthropy, it’s understandable that our colleges and universities acknowledge generous gifts with naming rights. However, recent controversies and changes signal that each institution ought to draft a thoughtful protocol to consider -- and reconsider -- names honored on buildings and in other memorials. That might reduce the volatile clashes and hasty name-removing decisions that have surfaced recently.
Perhaps the worst abuses in naming campus buildings come not from controversial political disputes but from thoughtless choices that lead to a lack of historic distinction. When a college or university gives monumental recognition to genuinely undistinguished people, the danger is that the campus architecture becomes uninspiring.
In teaching the history of higher education, I have tried to provide an antidote to nondescript monuments. I ask students to walk a campus to observe in detail whom the buildings honor -- and whom they overlook. My key question, then, is to ask how they would amend this -- and why.
The students’ suggestions have been numerous, thoughtful and sometimes dramatic. Perhaps the biggest surprise to me came in 1983 when I was teaching at The College of William & Mary and one of my graduate students was an African-American woman who was a graduate of Tuskegee University. She recommended that Tuskegee award an honorary degree to George Wallace. At first, most students and I were incredulous and thought she was joking. But she was both serious and wise. She explained that since Wallace had repented, an honorary degree from Tuskegee could represent both remembering and healing for all Alabamians. About a year and a half later, her suggestion became a reality when the university did indeed give Wallace such a degree.
All members of the class learned from this. It provides a good reminder of why I encourage all campus constituencies to join in this exercise to bring history to life with deliberate, distinctive monuments. It’s an investment in time that can connect past and present as we join together to consider diverse significant people and achievements within the historic campus setting.
John Thelin is a professor at the University of Kentucky and author of A History of American Higher Education.
The Milo Yiannopoulos pedophilia scandal ought to prod some serious soul searching on the part of American conservatives, especially his college Republican hosts. After all, until his ugly and obscene remarks jokingly condoning the sexual abuse of young boys finally discredited him, Yiannopoulos had been given the red carpet treatment by the right. Across America, college Republican groups had eagerly invited him to campus despite -- or maybe because of -- the crude and obscene insults he hurled at students of color, women and transgender students.
The adult right off the campus was no better. The Conservative Political Action Conference had invited Yiannopoulos to speak at the same event as Vice President Pence, only rescinding the invitation after tapes surfaced of Yiannopoulos making light of pedophilia, which caused an uproar.
Prior to the Yiannopoulos scandal, most media criticism had focused not on his rants but on the Left’s disruptions of this alt-right provocateur’s campus tour. That criticism was appropriate since those who would ban Yiannopoulos or violently disrupt his talks are guilty of corroding free speech, which is the lifeblood of the university. But this spotlight on the campus Left was so intense that little was said about how much the Yiannopoulos affair tells us about the sorry state of campus conservatism.
The discourse in Yiannopoulos campus speeches at times descended into obscenity and cruelty. In his speech at West Virginia University, he referred to women as “cunts.” At the University of Wisconsin at Milwaukee, he not only used obscenity, but he did so in a vicious manner to mock a transgender student in the audience, projecting a photo of this student onto a screen in the lecture hall (which was live streamed on the Breitbart website) and saying of her “the way you know he’s failed is I can still bang him.” This was apparently done in the service of his transphobia, which led him to say at the University of Delaware: “Never feel bad for mocking a transgender person. It is our job to point out their absurdity, to not make the problem worse by pretending they are normal.”
In defending their decision to host Yiannopoulos’s talks, the campus Republicans never mentioned his obscene and defamatory rhetoric. Instead, they spoke abstractly about freedom of speech and presented themselves and Yiannopoulos as the embodiment of that cherished freedom -- standing up for his right to speak despite the dangers from a censorious Left. At the University of California at Berkeley, in fact, site of the most violent disruption of a Yiannopoulos talk, his host, the Berkeley College Republicans presented themselves as heroic freedom fighters, heirs of Berkeley Free Speech Movement leader Mario Savio: “We proceed fearlessly because we know we have the president of the United States and the United States Constitution on our side. The Berkeley College Republicans are the new Free Speech Movement,” wrote members Troy Worden and Pieter Sittler in Berkeley’s Daily Californian.
But while the Free Speech Movement of 1964, of course, defended free speech, it did not champion obscene, degrading speech. Search as you may, you will not find a single obscenity in any of Free Speech Movement leader Mario Savio’s speeches (as you would expect of a former altar boy). And neither Savio nor any Free Speech Movement speaker would have even dreamed of using a campus podium -- or any other -- as Yiannopoulos has, to defame or ridicule students on account of their sexuality.
So Yiannopoulos’s Republican campus hosts are miscast as the Free Speech Movement’s political descendants. If there is any free speech dispute from Berkeley in the 1960s that the Yiannopoulos affair resembles (and even here the resemblance is limited), it is the obscenity controversy that erupted in spring 1965, a semester after the Free Speech Movement. That controversy concerned the right to use the obscene word “Fuck” in public campus discourse. Some Free Speech Movement veterans supported this right, and others (like Savio) objected to the punishment of obscenity protesters on due process grounds. But most movement veterans and much of the Berkeley student body refused to rally to this cause because they felt that this use of obscenity was irresponsible and distracted from more serious issues facing the civil rights and antiwar movements.
That’s why journalists who labeled this obscenity affair “the Filthy Speech Movement” erred, as it was impossible to build a mass movement at Berkeley in defense of obscene speech, impossible to re-assemble the old Free Speech Movement coalition for such a cause. Most of the Berkeley student body in 1965 was too wedded to the ideal of responsible political discourse to wave the “Fuck” banner. In this sense they were more genuinely conservative than today’s Berkeley College Republicans who not only wink at Yiannopoulos’s obscenity, but also at its use to defame minority students.
To be clear, Berkeley students in 1965 were not endorsing suppression, but close to 80 percent opposed the use of such “filthy speech” in public, as I note in my book, Freedom’s Orator: Mario Savio and the Radical Legacy of the 1960s (Oxford University Press, 2009). And so, they were unwilling to battle for the cause of obscene speech, a cause they thought irresponsible.
The idea here is that with freedom comes responsibility, and that ought to lead you (especially if you are conservative) to question whether saying “Fuck” from the podium or bringing Yiannopoulos’ ugly vitriol on to campus is responsible -- even though you have the right to do so. This is what Mario Savio was referring to in his Free Speech Movement victory rally speech, Dec. 9, 1964, when he said: "We are asking that there be no, no restrictions on the content of speech save those provided by the courts. And that's an enormous amount of freedom. And people can say things in that area of freedom which are not responsible. Now... we've finally gotten into a position where we have to consider being responsible, because we now have the freedom within which to be responsible."
Thinking seriously about free speech involves much more than reciting a simple formula that says “anything goes,” leaving us racing out mindlessly to speak and invite others to speak without considering what is being said, how it is being said, and who those words may be hurting gratuitously. That point was made decades ago in the Woodward Report at Yale University, a classic statement on the “university’s primary obligation to protect free expression,” but which also stressed the “ethical responsibilities assumed by each member of the university community” that are “of great importance. If freedom of expression is to serve its purpose, and thus the purpose of the university, it should seek to enhance understanding. Shock, hurt and anger are not consequences to be weighed lightly. No member of the community with a decent respect for others should use, or encourage others to use, slurs and epithets to discredit another’s race, ethnic group, religion or sex. It may sometimes be necessary in a university for civility and mutual respect to be superseded by the need for free expression. The values superseded are nevertheless important, and every member of the university community should consider them in exercising the fundamental right to free expression."
I suspect that the failure of campus conservatives to take seriously such questions about responsibility and civility in this Yiannopoulos affair is connected to the influence of Donald Trump. He rose to the presidency through his constant, ugly barrages of ad hominem attacks, and in spite of being caught on tape discussing how he grabs women “by the pussy.” This seems to have given a green light for even the crudest of public oratory. The right proved itself willing both off campus and on to dispense with notions of civility, so long as the vitriol emanates from someone on their side who succeeded in generating mass appeal -- either at the ballot box (Trump) or in the lecture hall (Yiannopoulos). It was not the crudeness or cruelty of his campus speeches, but a two-year-old taped interview that caught Yiannopoulos’ amoral ramblings on sexual abuse of minors that finally led CPAC and Breitbart News to drop him.
This is a tale not merely of moral declension on the right but also conservative (or pseudo-conservative) intellectual decline. Again there is a parallel between Trump and Yiannopoulos, both of whom go in for hectoring rather than logical discourse and are more concerned with drawing and exciting large crowds with shock jock sloganeering and show biz gloss (Yiannopoulos, who refers to himself as “a star,” and in his campus gigs uses spotlights, Broadway-style lighting and huge photo displays of himself a la Hollywood) than with intellectual gravitas. Yiannopoulos offered College Republicans this spectacle and they lapped it up despite its intellectual vacuity, ad hominem cruelty, and bigotry sugar coated by snarky humor.
The contrast between this and the serious, civil left-right debates the Young Americans for Freedom sponsored in the 1960s -- modeled after those held by their intellectual godfather, William F. Buckley Jr., on his TV show Firing Line -- could not be more striking. It is as if the student right has forgotten what serious political thought and reasoned oratory look and sound like. In this sense the right on campus today has, as Savio once put it, “free speech but nothing left to say.”
Robert Cohen is a professor of history and social studies at NYU Steinhardt whose books include Freedom's Orator: Mario Savio and the Radical Legacy of the 1960s; The Essential Mario Savio: Speeches and Writings That Changed America; and The Free Speech Movement: Reflections on Berkeley in the 1960s (co-edited with Reginald E. Zelnik).
“All eras in a state of decline and dissolution are subjective,” said Goethe in a moment of sagely grumbling about the poets and painters of the younger generation, who, he thought, confused wallowing in emotion for creativity. “Every healthy effort, on the contrary, is directed from the inward to the outward world.”
I didn’t make the connection with Svend Brinkmann’s book Stand Firm: Resisting the Self-Improvement Craze until a few days after writing last week’s column about it. One recommendation in particular from the Danish author’s anti-self-help manual seems in accord with Goethe’s admonition. As Brinkmann sees it, the cult of self-improvement fosters a kind of bookkeeping mentality. We end up judging experiences and relationships “by their ability to maximize utility based on personal preferences -- i.e. making the maximum number of our wishes come true.” The world becomes a means to the ego’s narrow ends, which is no way to live.
Besides offering a 21st-century guide to the Stoic ethos of disinvestment in the self, Brinkmann encourages the reader to rediscover the world in all its intrinsic value -- its fundamental indifference to anybody’s mission statement. How? By spending time in museums and forests:
“A museum is a collection of objects from the past (near or distant), e.g. art or artifacts that say something about a particular era or an aspect of the human experience. Obviously, you learn a lot from a museum visit -- but the greatest joy lies in just reveling in the experience with no thought of how to apply the knowledge and information. In other words, the trick is to learn to appreciate things that can’t be ‘used’ for some other function....
Similarly, a walk in the woods gives us a sense of being part of nature and an understanding that it shouldn’t be seen as consisting of resources that exist merely to meet human needs and desires. ... There are aspects of the world that are good, significant, and meaningful in their own right -- even though you derive nothing from them in return.”
Making similar points from a quite different angle is The Usefulness of Useless Knowledge by Abraham Flexner (1866-1959), the founding director of the Institute for Advanced Study, in an edition from Princeton University Press with a long introduction by the institute’s current director, Robbert Dijkgraaf.
The essay giving the book its title first appeared in Harper’s magazine in October 1939 -- a few months into the New York World’s Fair (theme: The World of Tomorrow) and just a few weeks into World War II. “I [am] pleading for the abolition of the word ‘use,’” Flexner wrote, “and for the freeing of the human spirit.” It must have seemed like one hell of a time for such an exercise. But the essay’s defense of the Ivory Tower was tough-minded and far-sighted, and Dijkgraaf’s introduction makes a case for Flexner as a major figure in the history of the American research university whose contribution should be remembered and revived.
The germ of The Usefulness of Useless Knowledge was a memorandum Flexner wrote as executive secretary of the General Education Board of the Rockefeller Foundation in 1921. The principles it espouses were also expressed in his work bringing Albert Einstein and other European academic refugees to the Institute at Princeton in the early 1930s. The essay defends “the cultivation of beauty ... [and] the extension of knowledge” as “useless form[s] of activity, in which men [and, as he acknowledges a few sentences earlier, women] indulge because they procure for themselves greater satisfactions than are otherwise available.”
But the impact of Flexner’s argument does not derive primarily from the lofty bits. He stresses that the pursuit of knowledge for its own sake has in fact shown itself already to be a powerful force in the world -- one that the ordinary person may not be able to recognize while swept up in “the angry currents of daily life.” The prime exhibits come from mathematics (Maxwell’s equations and Gauss’s non-Euclidean geometry took shape decades before practical uses could be found for them), though Flexner also points to the consequential but purely curiosity-driven work of Michael Faraday on electricity and magnetism, as well as Paul Ehrlich’s experiments with staining cellular tissue with dye.
“In the end, utility resulted,” Flexner writes, “but it was never a criterion to which [researchers’] ceaseless experimentation could be subjected.” Hence the need for institutions where pure research can be performed, even at the expense of pursuing ideas that prove invalid or inconsequential. “[W]hat I say is equally true of music and art and of every other expression of the untrammeled human spirit,” he adds, without, alas, pursuing the point further.
The untrammeled human spirit requires funding in any case. Although written towards the end of the Great Depression -- and published ten years to the month after the stock market crash -- The Usefulness of Useless Knowledge reads like a manifesto for the huge expansion of higher education and of research budgets in the decades to follow.
Flexner could point to the Institute for Advanced Study with justified pride as an example of money well spent. He probably corrected the page proofs for his essay around the same time Einstein was writing his letter to President Roosevelt, warning that the Germans might be developing an atomic bomb. And as Robbert Dijkgraaf reminds us in his introduction, another Flexner appointee was the mathematician John von Neumann, who “made Princeton a center for mathematical logic in the 1930s, attracting such luminaries as Kurt Gödel and Alan Turing.” That, in turn, led to the invention of an electronic version of something Turing had speculated about in an early paper: a machine that could be programmed to prove mathematical theorems.
“A healthy and balanced ecosystem would support the full spectrum of scholarship,” Dijkgraaf writes, “nourishing a complex web of interdependencies and feedback loops.” The problem now is that such a healthy and balanced intellectual ecosystem is no less dependent on a robust economy in which considerable amounts of money are directed to basic research -- without any pressing demand for a return on investment. “The time scales can be long,” he says, “much longer than the four-year periods in which governments and corporations nowadays tend to think, let alone the 24-hour news cycle.”
That would require a culture able to distinguish between value and cost. Flexner’s essay, while very much a document from eight decades ago, still has something to say about learning the difference.
Over the last decade or so, universities around the country have been tripping over one another to see who can slap the word “entrepreneurial” on the most things on their campuses the fastest. Entrepreneurial studies curricula and majors, business incubators, entrepreneurial centers, on and on -- entrepreneurial efforts have sprung up faster than the “innovate and disrupt” start-ups scattered about Silicon Valley that they seem to desperately want to imitate.
This “Silicon Valleyization” of the university can be seen in places like Florida State University, which recently received a record $100 million to open the Jim Moran School of Entrepreneurship, or at Rice University, which last year announced the formation of an “entrepreneurial initiative” to transform the university into an “entrepreneurial university.” Other institutions -- such as Emerson College, the University of Hartford and the University of Massachusetts at Lowell -- have also joined the entrepreneurial arms race with their own centers and curricula.
For advocates of the entrepreneurial university, such moves mark a fuller alignment of higher education with the needs of the new economy. Universities are finally recognizing the central role that they now must play in spurring “endogenous growth” in the highly competitive global market, where innovation determines national, state and individual winners and losers. They are finally coming down from the ivory tower and lining up their curricula and research to meet that need. For critics, however, such developments represent yet another chapter in the capturing of the university by particular economic interests -- and a further loss of autonomy and intellectual integrity, as institutions mindlessly chase the latest fad and buzz meme.
While the entrepreneur as a particular type of economic actor in the market economy has been around for some time, entrepreneurialism as a full-blown social and cultural movement is much newer. If we situate entrepreneurialism as a historically distinct social phenomenon, or perhaps as a post-Bretton Woods economic model, it contains several assumptions about society, politics and markets that largely go unacknowledged in the frenzy to create the entrepreneurial society and the enterprising university to accompany it.
First is the profound shift from a more organized style of the market economy -- with large corporations, unionized labor, slow growth, steady-state capital and a welfare-oriented state -- to a more disorganized one composed of start-ups, flexible labor, erratic growth, impatient capital and a market-oriented state. In the newer, churning model of the market economy, the entrepreneur -- personified in cultural and political heroes like Donald Trump and Mark Zuckerberg, rather than the corporate manager or professional -- becomes the central new cultural icon.
One of the things that this unceasing push for entrepreneurial innovation as a driving force of economic growth dismisses or ignores is the actual destructive part of creative disruption. Creative disruption seems fine as long as it is other people whose lives are disrupted rather than your own. This view is often callously unconcerned with the harm that can be generated by disruption for the sake of disruption or by the mantra that all innovation is progress. Here, in the mold of the economist Joseph Schumpeter and the Harvard University management gurus Clayton Christensen and Michael Porter, all disruptions are ultimately positive and all innovations are advancements. The market will miraculously, fairly and brutally sort out any lumps in the end. What disrupters yearn for is an always roiling and never resting society, generated by people in continuous struggle with one another to provide the next best thing and “strike it rich.”
As Virginia Heffernan recently described it, such innovators compose a “sneakered overclass -- whose signature sport is to disrupt everything, from Ikea furniture to courtship.” They embody the religiously inspired dream of heavenly redemption, the modernist desire of continuous progress and “lotto fever” rolled into one. It is unclear, however, how far such a model can actually extend. How much innovation and disruption does the world need? Or, more important, how much can it actually take?
Second, entrepreneurialism as an idealized economic model promotes a rather distinct type of asocial, social Darwinist, “go it alone” mentality where the single, self-interested individual is seen as solely responsible for his or her successes. Here, even when philanthropy happens, it is ultimately designed for self-interest, as with Mark Zuckerberg’s Chan Zuckerberg Initiative LLC.
Even setting aside the rhetoric of Google-style teamwork, the entrepreneurial model celebrates the ideal of the lone-wolf innovator who works hard, charts their own course and “defies the odds.” It is the American mythology of rugged individualism recast for the jobless age of the precariat, forever-flexible labor and the post-welfare state.
In doing so, this new economic model passes off social inequality as just the normal and inevitable ebb and flow of winners and losers in a free-flowing economy that is in constant flux -- and one that people need to adapt to rather than try to change. If you work hard enough, innovate and adapt to the market, you are entitled to reap the rewards. Those who cling to the collective protection of unions or change movements, or even the left-behind world of tradition, are but mindless sheep who lack the imagination to think for themselves and adapt. If you fail in your endeavors, you need to readapt and reinnovate in order to make your way again.
As on the TV show Shark Tank, the swirling and hungry accumulated venture capital of those “who have already made it” is there waiting to provide for newbies with the right stuff. Surviving and prospering are strictly of your own doing. Yet all this ignores not only the high likelihood of failure in these start-ups (90 percent, according to Forbes magazine) but also the social costs of living in a world composed of a handful of wealthy winners and scores of poor losers chumming up the shark-tank economy.
Third, implicit in the romantic idealization of the entrepreneur is the neoliberal idea of a limited pro-business government. Rather than expecting government to level things out a bit through progressive taxation or some other modest modes of redistribution -- or through various social services such as public education -- the new economic model promotes a government that is entrepreneurial, too. This enterprising government doesn’t protect people from the market as in the social democratic model but rather forces even more marketization onto them.
People must be coerced (or, in the more polite terms of behavioral economics, “nudged”) by government to “have grit and determination,” “manage their own retirements and health care,” “have positive affect” and “be responsible.” They must be calculating, self-interested and self-promotional, even if they don’t want to be. Responsibility will, in the words of former British Prime Minister David Cameron, finally force people “to ask the right questions of themselves.”
None of this means that entrepreneurialism is necessarily a bad thing when taken in moderation and seen in the light of a larger political economy. We can certainly acknowledge the important contributions of the many small businesses and innovations built on entrepreneurial principles. But an entire society or university based solely or largely on those principles is rather problematic and limiting.
Universities should be leery of aligning their curricula and research just to meet the needs of the entrepreneur. It is one thing for a higher education institution to recognize entrepreneurialism as one particular economic form but quite another to become an entrepreneurial university. Universities are -- or should be -- like the economy and society themselves: too multidimensional to be remade in the image of any one cause of the moment.
Steven C. Ward is a professor of sociology at Western Connecticut State University.