Duke University recently announced that it will no longer ask job applicants about their criminal histories. Duke’s move follows the Common Application’s August decision to drop a question inquiring about students’ criminal history. For prospective employees and students alike, the push to “ban the box” reflects a healthy desire to strike down barriers that may impede social mobility. Yet, oft overlooked in all of this, especially within higher education, is the way in which college degrees serve as an impediment to opportunity.

Of course, at its best, higher education is a powerful engine of opportunity and socioeconomic advancement. And that’s the way it’s almost universally described. Nevertheless, for too many Americans, the truth is that postsecondary education is principally a toll: an ever-more expensive, two-, four- or (let’s be honest) six-year pit stop to employment that is increasingly mandated, gratuitously, by employers’ HR departments.

Today, thousands of employers routinely use college degrees as a convenient way to screen and hire job applicants, even when postsecondary credentials bear no obvious connection to job duties or performance. In a comprehensive report last year, researchers from Harvard Business School documented increasing “degree inflation” -- employers demanding baccalaureate degrees for middle-skill jobs that don’t obviously require one. The researchers estimated that this phenomenon encompassed more than six million jobs across dozens of industries. In fact, nearly two-thirds of employers surveyed admitted to having rejected applicants with the requisite skills and experience simply because they lacked a college degree.

Degree requirements are proliferating absent evidence that they correlate with job necessity -- and, indeed, despite some evidence to the contrary. A 2014 survey by Burning Glass Technologies found that employers increasingly require bachelor’s degrees for positions whose current workers don’t have one and whose requisite skills haven’t changed. Employer preference for degrees is rising even for entry-level occupations, like IT help-desk technician, where job postings do not include skills typically taught at the baccalaureate level and there is little to no difference in requested skill sets between postings that require a college degree and those that do not.

Now, it’s important to clarify that while colleges and universities are the primary beneficiaries of degree inflation, much of the responsibility for it lies elsewhere: it is largely a product of employer convenience and the unintended consequences of federal antidiscrimination law.

Title VII of the Civil Rights Act of 1964 prohibited employers from discriminating against job applicants on the basis of race, color, religion, sex or national origin. It did, however, allow employers to use “professionally developed” hiring tests, insofar as they were not “designed, intended or used” to discriminate. In Griggs v. Duke Power Company (1971), the Supreme Court unanimously interpreted this language to mean that when a selection process disproportionately affects minority groups (i.e., has a “disparate impact”), employers must show that any requirements are directly job related and an accurate predictor of job performance.

This “disparate impact” standard, which Congress codified in federal law, nominally applies to all criteria used in making employment decisions, including educational requirements. Crucially, however, this standard has only been scrupulously applied to other, noneducational employment tests. Employers using IQ tests to screen and hire applicants, for example, must use approved, professionally developed tests and justify IQ thresholds. That is, if companies require job applicants to possess an IQ of 110, they must be able to demonstrate why an applicant with an IQ of 109 is incapable of performing a job that someone with a 110 IQ can. One need only read that sentence to understand why human-resource lawyers quiver in horror when executives ask about using that kind of screening test.

Even directly applicable employment tests can run afoul of federal regulators. Last year, for instance, the Equal Employment Opportunity Commission (EEOC) sued the railroad company CSX Transportation for discrimination, because male job applicants passed the company’s physical-fitness tests at a disproportionately higher rate than female applicants. Even though the test was stipulated to be “job related” (since employees were required to lift heavy objects) and “consistent with business necessity,” the EEOC still required CSX to adopt “alternative practices that have less adverse impact.”

College-degree requirements, meanwhile, have escaped scrutiny. In turn, risk-averse employers have become increasingly reliant upon them as an expedient way to screen applicants while avoiding the legal pitfalls accompanying other employment tests. For employers, the logic is simple: a college degree is an easy-to-read signal that an applicant likely possesses a desirable bundle of behaviors and social capital -- such as the ability to turn in work, sit still for long periods, take direction and so forth -- in addition to confirming the baseline verbal and written skills required for most jobs.

Ironically, indiscriminate degree requirements carry obvious disparate-impact implications, making their casual acceptance all the more remarkable. Indeed, the Harvard report noted that the practice disproportionately harms groups with low college graduation rates, particularly blacks and Hispanics.

The burdens of credential inflation, of course, fall most heavily on those of modest means -- heightening the obstacles for low-income and working-class individuals. Degree requirements summarily disqualify noncredentialed workers with relevant skills and experience from attractive jobs. They bar young people from taking entry-level jobs and building the expertise and abilities that open up new opportunities. And they hold families and would-be workers hostage, forcing them to devote time and money toward degree collecting, whether or not those credentials actually convey much in the way of relevant skills or knowledge.

Those intent on ensuring that higher education is more of an engine of individual opportunity than a security blanket for businesses would do well to consider the part colleges play, however passively, in all of this. What might be done? Well, in postsecondary education, there is an overdue opportunity to develop alternative credentialing models and devise new ways to credibly certify aptitudes and skills. Most important, there’s a need to ask where and how institutions may be complicit in enabling statutory and legal practices that compel students to unnecessarily enter college -- not because they want or need the things a college degree represents, but because they fear being denied good jobs based on their failure to buy a piece of paper.

Diplomas are “useful servants,” Chief Justice Warren Burger wrote in Griggs, but “they are not to become masters of reality.” Academe should consider its role in permitting diplomas to become the capricious masters of opportunity.
