The United States economy is at risk because Indian and Chinese universities are educating more engineers than American institutions, or because some European countries have graduation rates that put those of American institutions to shame, or because of … (fill in the blank with the comparison of your choice).
You’ve heard the comparisons. And many times -- without thinking much about the validity of the claims -- you may have been pleased by the rhetoric. After all, the figures are usually cited to emphasize the importance of higher education, and sometimes even to justify the need for more spending on higher education or for more rigorous preparation of students -- generally causes supported by academics.
But a report released today suggests that many of the most commonly cited figures are highly questionable, based on the sort of apples and oranges comparisons that statisticians should have rebelled against years ago. In a number of cases, the flaws may overstate significant problems in American higher education. In many other cases, the flaws may render data valueless for promoting the kinds of education reforms that are needed, the study says.
“The Spaces Between Numbers: Getting International Data on Higher Education Straight,” being released today by the Institute for Higher Education Policy, is by no means opposed to making international comparisons or to using data to inform policy. The author is Clifford Adelman, who before moving to the institute spent his career as an education statistics guru at the U.S. Education Department.
Adelman’s argument, in fact, is that the United States -- like other countries -- is losing the chance to benefit from comparisons because the flaws are so significant. And he suggests new approaches to data collection and analysis that could improve on the current system.
While the report’s criticism extends to a number of international projects, the prime target is the much cited data released by the Organisation for Economic Co-operation and Development. Because the OECD’s members are economically advanced democracies, they are presumed to be a good comparison group, Adelman notes. But he argues that they aren’t. (The OECD's press office did not respond to requests for comment.)
Among the criticisms raised in the study:
- Degree completion data are flawed by comparing community colleges in the United States (where many students never intend to seek degrees) with institutions in European countries that offer sub-baccalaureate degrees but enroll only students seeking such degrees.
- Another flaw in degree completion analysis is that data from the United States count only students who finish degrees at the institution they first enrolled in, while almost all other countries count “system” completion, giving credit for those who transfer. (The completion rate for the most recent OECD comparisons would have grown to 63 percent from 56 percent, had an equal comparison been used, but that figure was relegated to a footnote, the report says.)
- Foreign students are counted in some countries’ totals and not in others -- a significant factor in some European countries that have high enrollments from other countries.
- The definition of “beginning students” is so vague that some countries count students who have already earned postsecondary degrees.
- The OECD counts all of a country’s bachelor’s degrees the same way, regardless of whether they are earned in three, four or five years.
- Various ratios in data do not reflect significant changes such as population spurts or significant expansions of higher education systems. As a result, shifts in ratios -- of students or graduates per institution or the population -- have little value.
All of these problems result in “negative propaganda,” particularly about the United States and its educational quality, the report says.
In an interview, Adelman said that he didn’t think there was an ideological agenda behind the flaws, and that the issues are rather a reflection of institutional inertia. But the result is “numbers that are empty and void -- numbers that have no real meaning.”
While most of the report focuses on issues related to the OECD, Adelman also faults the U.S. Education Department for some of its data collection systems, in particular as they relate to community colleges. For instance, he notes an issue similar to the OECD’s use of data: The department tracks community college students as if every single one were a degree candidate. “They are not, and everyone who has worked with community college data knows that,” the report says. Why not separate the data, he asks, for those in degree programs, for those in job training programs, for those in strictly remedial programs, and so forth?
How to fix the system? Adelman suggests many changes throughout the report to present more accurate comparison groups. But he also suggests some large shifts in how information is gathered.
One major change he suggests is an effort to reflect the way different countries divide their populations into students who “qualify” for higher education and those who don’t. Whether qualification is determined by national examination, by the high school attended or by various other measures, data are influenced by who is really in the pool of potential students, he writes.
A second key change needed, he writes, is to study measures of “inclusiveness.” While OECD countries vary in which populations are disadvantaged or have been historically excluded from higher education, they all have such groups. And he writes that a key issue for all of these countries’ higher education systems should be the way they are reaching these populations.
The United States tends to focus on race and ethnicity, given the history of discrimination against various groups, and on wealth, since tuition is a factor in the United States. Some OECD countries charge no tuition and have no racial history comparable to that of the United States, and so track disadvantaged populations in other ways -- such as by geography. All nations could benefit, the report says, by learning to better identify and track such populations and by competing to narrow the gaps among groups.