What do new auto sales data and the U.S. Education Department's Digest of Education Statistics have in common?
Both are eagerly anticipated and extensively covered by the news media – particularly during spring months. Both contain key indicators necessary for informed, high-stakes decisions that affect the nation’s economy. But that’s where the similarities end.
The Digest of Education Statistics is an indispensable handbook for education analysts around the country, with detailed information on everything from elementary school enrollments to postsecondary institutions' revenue sources. Unfortunately, the data are far from current. While graduation season for 2010-11 is coming to a close at most institutions around the country, the latest national data for the aggregate number of students earning degrees are from 2008-09 -- two full years ago.
In contrast, the auto sales report shows sales by manufacturer and model every month. This week we learned that auto sales fell in May for the first time in eight months, due to a combination of high gas prices and a shortage of fuel-efficient models caused by the record earthquake in Japan. Industry-wide sales fell 3.7 percent to 1.06 million vehicles in May, down from 1.1 million a year earlier. The Japanese supply shortage benefited Chrysler, which enjoyed a 10 percent increase in sales.
These are the kinds of data carmakers look at constantly to help make decisions about production volume, plant closings (or openings), and pricing. Policymakers are also looking at these data to see whether the bailout of the auto industry was worth the price tag, and whether tax incentives and mileage regulations are changing the mix of models sold in the U.S.
Similar data on bachelor's and associate degrees awarded, or engineering and music degrees, are nearly two years old. So what do we know about the kind of short-term credentials students turned to in response to the nose-diving economy? Or about which states and institutions stepped up their game to meet the ambitious goal President Obama set to have the highest proportion of college graduates in the world by 2020? Not much.
Further, there is an important debate right now between those who see higher education's mission primarily as preparing graduates for jobs and economic success, and others who worry about the erosion of general education in today's colleges and universities. Tracking the number of jobs created each month as the economy recovers serves as a starting point for talking about the quality of those jobs -- are they temporary jobs, low-paid service-sector jobs, highly skilled professional jobs, jobs in new businesses or old industries?
We should be having a similar national conversation about our college graduates. Wouldn't it be nice to know whether or not there have been any recent changes, perhaps as a result of the economy or of external pressures on colleges, in the mix of vocationally oriented and broader liberal arts degrees awarded around the country?
Granted, college graduates are not cars, and industrial analogies to higher education often ring hollow. But aren't graduates far more important? While higher education represents an increasing share of students' budgets and of the American economy, and is a linchpin of job creation, we're largely navigating in the dark. New governors and legislators are trying to get their bearings and figure out how to meet ambitious goals with the scarce resources available. A new Congress is starting to debate whether we can afford Pell Grants as we currently know them. And they're having those conversations with two-year-old data. Imagine setting national policy for the auto industry based on GM sales in June 2009. Something needs to change.
Reliable Estimates Are Possible in the Current System
As a former state-level official who was responsible for getting this kind of data out in Florida, I understand some of the reasons for the delay in reporting college completion data. The university system I worked for could be only as fast and as accurate as its slowest and least accurate member.
So imagine the situation of the federal National Center for Education Statistics (NCES), where instead of the 11 universities we had to worry about, they have 7,000. As I write this, someone at NCES or one of its contractors is still trying to figure out how a new beauty school in Alaska managed to award 50 nuclear engineering doctorates last year -- or something similarly strange.
But that size is also an advantage. If Florida State University didn’t report reliable numbers, the accuracy of any state-level report would be severely compromised. If Florida State doesn’t report to NCES, on the other hand, it’s a rounding error. And the Florida States of the world rarely cause problems -- it’s the rural community college that just lost its lone institutional researcher, or the new beauty school in Alaska that doesn’t yet have one.
Releasing Data Earlier Is Possible
Reporting more current college completion data is possible. Here’s how. NCES collects data on degrees awarded once a year. The deadline for 2009-10 completions was last October 28. By early February, preliminary numbers are available for institutional researchers’ use, but the final numbers are released in the summer, a full year after the prior year’s students have actually graduated.
The preliminary numbers available in February have proven to be a reliable estimate. In my experience, there is usually very little difference between the totals that can be calculated with the preliminary data and what eventually comes out several months later. In fact, reliable estimates for the nation and for most states could probably be reported as early as November or December, based on "early returns" from the fall survey. While that's still a few months after the peak of graduation season, it would be a gain of nearly two years of valuable time and information.
Reporting more current college completion data is worth doing. Some states recognize this and make every effort to get data out early. Kentucky, for example, recently reported that public and private credentials awarded in that state are up 11 percent in 2010-11, largely at the associate and certificate level. Such timely reporting is noteworthy – and rare. But even precocious data gatherers like Kentucky will find their numbers hard to interpret without knowing whether the trend in other states is up 15 percent or down 3 percent. And it will remain a strictly local news story, timed differently in every state, rather than an occasion for national reflection on the state of higher education. By contrast, consider the healthy competition for business and jobs under way among governors. When new unemployment numbers come out weekly, governors aren't just looking at their own states, they're measuring themselves against their neighbors and against the nation as a whole.
The timely national release of top-line completion numbers would put a day on the calendar to spark a recurring national discussion about how we're doing across state lines and relative to one another and to our ambitious goals. We can imagine states and colleges themselves vying to be among the top-performing institutions and touting that standing in their marketing and recruitment efforts. The competition for new jobs is fierce among states, and a number of governors have tried using tax policy to poach business from other states. Wouldn't it be nice to see similar competition based on recent state trends in numbers of highly skilled graduates?
Complete College Completion Data
College completion data need to be more comprehensive as well as more timely. Most states have good data on their public institutions' graduates well before NCES releases national numbers, but information on private colleges (both nonprofit and for-profit) is spotty. And yet none of the big attainment goals set by states, the White House, or the Lumina Foundation for Education can be achieved unless private higher education contributes a big share of the needed graduates.
Making good strategic higher education decisions at the state level requires analysis of both public and private institution data. Perhaps a steep drop in nursing graduates at public colleges is spurring discussion of financial incentives to graduate more RNs. But if nursing degrees at private institutions are booming, that may not seem such a wise use of public funds. And if the trend is the same at private colleges, then perhaps the incentives should be available there as well.
As our state and federal elected officials continue making difficult policy and budget choices, we should hope that they are doing so based on data that are current and that bring to the surface trends in higher education that can guide informed, effective budgeting and policy making. If we can generate detailed auto sales data monthly, unemployment claims weekly, and stock market updates by the second, we should be able to produce college completion data sooner than two years after the fact.
Nate Johnson served as executive director of planning and analysis for the State University System of Florida and as associate director of institutional research at the University of Florida. He is currently a senior consultant for HCM Strategists, a health and education public policy and advocacy firm.