The latest rhetorical trope in the bad news presentation of U.S. higher education is to say -- even when home front improvements are acknowledged -- “Wait a minute! But other countries are doing better!” and rush out a litter of population ratios from the Organization for Economic Co-operation and Development (OECD) that show the U.S. has “fallen” from 2nd to 9th or 3rd to 15th place in whatever indicator of access, participation, and attainment is at issue.
The trope is not new. It’s part and parcel of the enduring propaganda of numbers. Want to wake up a culture that fills in newspaper brackets for the Final Four, places bets on Oscar nominees, checks the Nielsen ratings weekly, and still follows the Top 40? Tell them someone big is down. In the metrics of international economic comparisons, we treat trade balances, GDP, and currency exchange rates the same way, even though the World Economic Forum continues to rank the U.S. No. 1 in competitiveness, and the recent strength of the dollar should tell anyone with an ounce of common sense that the markets endorse that judgment in the midst of grave economic turmoil.
In matters of education, however, the metrics of the trope are false, and our use of them is both misguided and unproductive. The Spellings Commission, ETS, ACT, the Education Commission of the States, the Alliance for Excellent Education, and, most recently, the annual litany of “Measuring Up” and the College Board’s “Coming to Our Senses” all lead off their reports and pronouncements on higher education with population ratios (and national rankings) drawn from OECD’s Education at a Glance, and assume these ratios were passed down from Mt. Sinai as the tablets by which we should be judged.
The population ratios, particularly those concerning higher education participation and attainment for the 25-34 age cohort, well serve the preferred tendency of these august bodies and their reports to engage in a national orgy of self-flagellation that purposefully neglects some very basic and obvious facts.
To be sure, U.S. higher education is not doing as well as we could or should in gross participation and attainment matters, but on the tapestry of honest international accounts, we are doing better than the propaganda allows. When you read reports from other countries’ education ministries that worry about their horrendous dropout rates and problems of access, you would think they don’t take population ratios seriously.
Indeed, they don’t, and one doesn’t need more than 4th grade math to see the problems with population ratios, particularly in the matter of the U.S., which is, by far, the most populous country among the 30 OECD member states.
None of our domestic reports using OECD data bothers to recognize the relative size of our country, or the relative diversity of races, ethnicities, nativities, religions, and native languages -- and the cultures that come with these -- that characterize our 310 million residents. Though it takes a lot to move a big ship with a motley crew, all these reports would blithely compare our educational landscape with that of Denmark, for example, a country of 5.4 million, where 91 percent of the inhabitants are of Danish descent and 82 percent belong to the same church.
For an analogous common-sense case, Japan and South Korea don’t worry about students from second-language backgrounds in their educational systems. Yes, France, the UK, and Germany are all much larger and more culturally diverse than Denmark, but none offers anywhere near the concentration of diversities found in the U.S. It’s not that we shouldn’t compare our records to theirs; it’s just that population ratios are not the way to do it.
OECD has used census-based population ratios to bypass a host of inconsistencies in the ways its 30 member countries report education data, but, as it turns out, the 30 member countries also employ different census methodologies, so the components of the denominator from Sweden are not identical with the components of the denominator from Australia. With the cooperation of UNESCO and Eurostat’s European Union Labor Force Survey, and occasionally drawing on microdata from what is known as the Luxembourg Income Study, OECD has made gallant efforts to overcome the inconsistencies, but you can’t catch all of them.
When ordinary folk who have no stake in education propaganda look at those 30 countries and start asking questions about fertility rates, population growth rates, net immigration rates, and growth in foreign-born populations, they cannot help but observe that the U.S. lives on another planet. Only 4 countries out of the 30 show a fertility rate at or greater than replacement (2.0): France, New Zealand, Mexico, and the U.S. -- and of these, Mexico has a notable negative net migration rate. Out of those 30 countries, 7 have negative or zero population growth rates and another 5 show growth rates that might as well be zero. On the other hand, the U.S. population growth rate, at 0.9 percent, is in the top five. In net immigration through 2008, only Australia, Canada, and Ireland were ahead of us (and we count only legal immigrants). Triangulating net immigration, one can examine the percentage growth in foreign-born populations over the past 15 years. In this matter, the Migration Policy Institute shows the U.S. at 45.7 percent -- which is more than double the rate for Australia and Canada (I don’t have the figures for Ireland).
It is no state secret that our immigrant population is (a) young, (b) largely schooled in other countries with lower compulsory schooling ages, and (c) pushing the U.S. population denominator up in the age brackets subject to higher education output analysis. Looking ahead to 2025 (the College Board’s target “accountability” date), Census projections show an increase of 4.3 million in the U.S. 25-34 age bracket. Of that increase, 74 percent will be Latino, and another 12 percent Asian. Can you find another country, OECD or otherwise, where an analogous phenomenon is already in the cards -- or is even somewhere in the deck, waiting to be dealt? As noted: the U.S. lives on a different demographic planet.
We are often compared with Finland in higher education matters -- and to our considerable disadvantage. I will give the Finnish education system a lot of credit, particularly in its pre-collegiate sector, but the comparison is bizarre. Like Denmark, Finland is a racially and linguistically homogeneous (mandatorily bilingual, to be sure, in Finnish and Swedish) country of 5 million, with a population growth rate of 0.1 percent and a net immigration rate of 1 percent (principally from Eastern Europe).
In the 1990s, Finland increased the capacity of its higher education system by one-third, opening 11 new polytechnic institutions known as AMKs (for the U.S. to do something equivalent would require establishing 600 new AASCU-type 4-year colleges). So the numerator of participation in higher education increased considerably, bolstered by fully subsidized tuition (surprise, anyone?), while the denominator remained flat. Last time you looked, what happens to percentages when numerators rise and denominators don’t?
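Here is the answer in a minimal sketch, with invented round numbers (the cohort size and entrant counts below are mine for illustration, not Finland’s actual figures):

    # Invented round numbers, for illustration only -- not actual Finnish data.
    cohort = 65_000           # relevant age cohort, flat across the decade
    entrants_before = 30_000  # higher education entrants before the AMK expansion
    entrants_after = 40_000   # entrants after capacity grew by one-third

    print(f"participation before: {entrants_before / cohort:.0%}")  # 46%
    print(f"participation after:  {entrants_after / cohort:.0%}")   # 62%

Hold the denominator still, raise the numerator by a third, and the “rate” soars. No mystery, just division.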
And there is more to the Finnish comparison: the median age of entrance to higher education in Finland is 23 (compared with 19 in the U.S.), and the median age at which Finnish students earn bachelor’s degrees is 28 (compared with 24-25 in the U.S.). In our Beginning Postsecondary Students longitudinal study of 1995-2001, those entering 4-year colleges in the U.S. at age 23 or older constituted about 5 percent of 4-year college entrants, and finished bachelor’s degrees within 6 years at a 22 percent rate (versus 65 percent for those entering directly from high school). Is comparing Finnish and U.S. higher education dynamics a fair sport? If you left it up to the folks who produced the Spellings Commission report, Measuring Up, and Coming to Our Senses, it would be.
International data comparisons on higher education are very slippery territory, and nobody has really mastered them yet, though Eurostat (the statistical agency for the 27 countries in the European Union) is trying, and we are going to hear more about that at a plenary session panel of our Association for Institutional Research next June. What does one do, for example, with sub-baccalaureate degrees such as our “associate”? Some countries have them -- they are often called “short-cycle” degrees -- and some don’t. In some countries they can be considered terminal degrees (as we regard the associate), in other countries they are not considered higher education at all, and in still others they are regarded as part of the bachelor’s degree.
Instead of or in addition to “short-cycle” degrees, some countries offer intermediate credentials such as the Swedish Diploma, awarded after the equivalent of two-thirds of a baccalaureate curriculum. Are these comparable credentials? What’s counted and what is not counted varies from country to country. I just finished plowing through three German statistical reports on higher education from different respected German sources in which the universe of “beginning students” changed from table to table. A German friend provided a gloss on the differences, but the question of what gets into the official reporting protocol went unanswered. You can be sure that the people who put together the Spellings Commission report, Measuring Up, and Coming to Our Senses never thought about such things.
Why is all this important? First, to repeat the 4th grade math, which Jane Wellman tried to bring to the attention of U.S. higher education with her Apples and Oranges in the Flat World, issued by ACE last year. When denominators are flat or declining and numerators remain stable or rise slightly, percentages rise; vice versa, when denominators rise faster than numerators, percentages fall. So if you use population ratios, and include the U.S., it’s going to look like we’re “declining” -- which is the preferred story of the public crisis reports. Ironically, trying to teach basic math and human geography to the U.S. college-educated adults who wrote these reports is like talking to stones. They don’t want to hear it. Wellman made a valiant effort. So did Kaiser and O’Heron in Europe in 2005 (Myths and Methods on Access and Participation in International Comparison. Twente, NL: Center for Higher Education Policy Studies), but we’re going to have to do it again.
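To see how the denominator alone can manufacture “decline,” consider two hypothetical countries (all numbers invented): Country A has a flat population base; Country B, like the U.S., has a growing one.

    # Invented numbers for two hypothetical countries, for illustration only.
    # Year 1: both countries attain exactly 20 percent.
    a_grads, a_pop = 200_000, 1_000_000    # Country A: flat denominator
    b_grads, b_pop = 400_000, 2_000_000    # Country B: growing denominator

    # Year 10: B adds proportionally MORE graduates than A (+10% vs. +5%),
    # but its population base has also grown by 20 percent.
    a_grads2, a_pop2 = 210_000, 1_000_000
    b_grads2, b_pop2 = 440_000, 2_400_000

    print(f"Country A: {a_grads / a_pop:.1%} -> {a_grads2 / a_pop2:.1%}")  # 20.0% -> 21.0%
    print(f"Country B: {b_grads / b_pop:.1%} -> {b_grads2 / b_pop2:.1%}")  # 20.0% -> 18.3%

Country B produced more new graduates than Country A in both absolute and percentage terms, yet the population ratio says it “fell.” That is the whole trick.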
Second, it’s like the international comparisons invoked by business columnists. The BRIC (Brazil, Russia, India, and China) countries’ GDPs have been growing much faster than ours (though some are now declining faster than ours), but none of those GDPs save that of China matches the GDP of California. It’s that big ship again: the U.S. starts with a much higher base -- of everything: manufacturing, productivity, technological innovation. Both growth and contraction will be slower than in economies that start from a much lower base. Where we have demonstrable faults, the most convincing reference points for improvement, the most enlightening comparisons, are to be found within our systems, not theirs. So it is with higher education, where the U.S. massified long before other countries even thought about it. Now, in a world where knowledge has no borders, if other countries are learning more, we all benefit. The U.S. does not -- and should not -- have a monopoly on learning or knowledge. Does anyone in the house have a problem with this?
Third, OECD itself understands the limitations of population ratios for education a lot better in 2008 than it did a scant five years ago, and is now offering such indicators as cohort survival rates in higher education. I had hoped the authors of Measuring Up 2008 might have used those rates, and read all the footnotes in OECD’s 2008 Education at a Glance, so that one could see what was really comparable with what. Had they done so, they would have seen that our 6-year graduation rate for students who started full-time in a 4-year college and who graduated from any institution (not just the first institution attended) is roughly 64 percent, which, compared with other OECD countries that report the same way (e.g., the Netherlands and Sweden), is pretty good (unfortunately, you have to find this datum in Appendix 3 of Education at a Glance 2008). In Coming to Our Senses, the College Board at least read the basic cohort survival rate indicator, 58 percent, but didn’t catch the critical footnote that took it to 64 percent, or the footnotes on periods of reporting (Sweden, for example, uses a 7-year graduation marker, not 6). Next time, I guess, we’ll have to make sure the U.S. footnotes are more prominent.
Driving this new sensibility concerning cohort survival rates, both in OECD and Eurostat, is the Bologna Process in 46 European countries, under which, depending on country, anywhere from 20 percent to 80 percent of university students are now on a 3-year bachelor’s degree cycle. Guess what happens to the numerator of graduation rates when one moves from the old four- and five-year degrees to the new three-year cycle? Couple this trend with declining population bases (the UK, for example, projects a drop of 13 percent in the 18- to 20-year-old population going forward), and some European countries’ survival rates will climb to stratospheric levels. We’ll be complaining about our continual international slippage well into the 2030s. That will suit the crisis-mongers just fine, except none of it will help us understand our own situation, or where international comparisons truly matter.
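And the three-year effect is easy to simulate. The time-to-degree fractions below are invented; assume the same 70 percent of each cohort eventually finishes under either cycle:

    # Invented time-to-degree distributions, for illustration only.
    # Cumulative fraction of an entering cohort finishing by N years after entry.
    old_cycle = {4: 0.35, 5: 0.55, 6: 0.62, 8: 0.70}  # four- and five-year degrees
    new_cycle = {3: 0.45, 4: 0.62, 5: 0.68, 6: 0.70}  # three-year Bologna cycle

    window = 6  # a typical survival-rate reporting window
    print(f"old cycle, {window}-year window: {old_cycle[window]:.0%}")  # 62%
    print(f"new cycle, {window}-year window: {new_cycle[window]:.0%}")  # 70%

Same students, same eventual completion; shorten the degree and the measured survival rate jumps eight points inside the fixed window.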
And that’s the fourth -- and most important -- point. The numbers don’t help us do what we have to do. They steer us away from the task of making the pieces of paper we award into meaningful documents, representing learning that helps our students compete in a world without borders. Instead of obsessing over ratios, we should look to what other countries are doing to improve the efficiency and effectiveness of their higher education systems in terms of student learning and enabling their graduates to move across that world. In this respect the action lines of the Bologna Process stand out: degree qualification frameworks, a “Tuning” methodology that creates reference points for learning outcomes in the disciplines, the discipline-based benchmarking statements that tell the public precisely what learning our institutions should be accountable for, Diploma Supplements that warrant student attainment, more flexible routes of access, and ways of identifying under-represented populations and targeting them for participation through geocoding.
These features of Bologna are already being imitated (not copied) in Latin America, Australia, and North Africa. Slowly but surely they are shaping a new global paradigm for higher education, and in that respect, other countries are truly doing better. Instead of playing with slippery numbers and glitzy rankings, we should be studying the substance of Bologna -- where it has succeeded, where our European colleagues have learned they still have work to do, where we can do it better within our own contexts -- perhaps experiencing an epiphany or two about how to turn the big ship on which we travel into the currents of global reform.
Now that would be a constructive use of international comparisons.