America's smaller colleges and universities are rarely given much chance of victory in the NCAA basketball tournament. But they call it March Madness for a reason, in large part because of the upsets, when an underdog takes on a heavy favorite and wins. Bucknell has 3,500 students, but last year we enjoyed the thrill of taking on a far larger school and succeeding, upsetting Kansas in the first round.
Since we earned a second straight bid to the NCAA men's basketball tournament last week, the media have praised our players' talent and tenacity. The Los Angeles Times described Bucknell as the "Duke of the Susquehanna." Added the Arkansas Democrat-Gazette: "Think (of a) bigger, stronger, more talented and more athletic Princeton."
As Bucknell's president, I can tell you we mean what we say at Bucknell -- on the court and, more important, in the classroom. Our basketball success has demonstrated that impressive academic and athletic achievements are not mutually exclusive. In fact, athletics directors and academic administrators at colleges and universities around the nation need to understand the new realities of Division I basketball competition.
Earlier this week, the Institute for Diversity and Ethics in Sport at the University of Central Florida released yet another study showing a disturbing disparity between basketball players' and all students' graduation rates. Of the 65 teams competing in the Big Dance, only Bucknell could boast a 100-percent graduation rate (one team, Penn, does not report such data). I want to suggest, however, that rather than remaining an anomaly in Division I athletics, Bucknell's program -- and many in the Patriot League -- be taken as a model for the next big thing in college sports.
That is, our Bison have confirmed that fielding a team of smart players can create a competitive advantage.
There are enough bright students with basketball skills who want to play major college basketball to permit schools like Bucknell, with a driving focus on quality education, to succeed in the NCAA tournament.
Consider Kevin Bettencourt, who scored a game-high 23 points in last week's Patriot League tournament final. He's an American history major who chose Bucknell for its "great academic reputation along with Division I athletics." Chris McNaughton, the 6-11 center whose graceful hook shot sent Kansas home early last March, traveled all the way from Germany to avail himself of Bucknell's nationally celebrated electrical engineering program. Kevin, Chris, and Patriot League Player of the Year Charles Lee all earned GPAs of 3.4 or better this fall.
When we recruit students like these, they choose us because they receive a great education and a great basketball opportunity, yet they always know that academics will come first. On Friday, Arkansas will meet a Bucknell team populated with future scientists, engineers, writers, and businessmen who, happily, also love to play basketball.
In the future, being a "big time" sports school is going to provide less competitive advantage than it used to, in part owing to the NCAA's academic reform plan. Bucknell supports reform efforts because we want all students, not just ours, to graduate with a solid education that prepares them for life.
Also, the schools that traditionally have dominated television coverage now have competition for viewers. The championship games of all the conferences -- not just the ACC, SEC, Big East, or Big Ten -- are being broadcast. And the trend is accelerating with broadband coverage and new networks such as ESPNU and CSTV.
As the lesser-known basketball programs enjoy greater exposure, quality players will increasingly opt to attend institutions like Bucknell, knowing they will have a reasonable shot at two hours (or more) of fame every March. But more important, they will join the ranks of those alumni who are CEOs, COOs, university professors, doctors, and lawyers, ensuring far more than two hours in the spotlight.
Just ask Les Moonves, a 1971 Bucknell graduate. He runs CBS, and CBS runs all the Big Dance games.
Brian C. Mitchell
Brian C. Mitchell is the president of Bucknell University in Lewisburg, Pa.
The latest rhetorical trope in the bad-news presentation of U.S. higher education is to say -- even when home-front improvements are acknowledged -- "Wait a minute! But other countries are doing better!" and rush out a litter of population ratios from the Organization for Economic Co-operation and Development (OECD) showing that the U.S. has "fallen" from 2nd to 9th or 3rd to 15th place in whatever indicator of access, participation, or attainment is at issue.
The trope is not new. It's part and parcel of the enduring propaganda of numbers. Want to wake up a culture that is glued to the newspaper brackets of the Final Four, places bets on Oscar nominees, checks the Nielsen ratings weekly, and still follows the Top 40? Tell them someone big is down. In the metrics of international economic comparisons we treat trade balances, GDP, and currency exchange rates the same way, even though the World Economic Forum continues to rank the U.S. No. 1 in competitiveness, and the recent strength of the dollar should tell anyone with an ounce of common sense that the markets endorse that judgment in the midst of grave economic turmoil.
Except in matters of education, the metrics of the trope are false, and our use of them both misguided and unproductive. The Spellings Commission, ETS, ACT, the Education Commission of the States, the Alliance for Excellent Education, and, most recently, the annual litany of "Measuring Up" and the College Board’s "Coming to Our Senses" all lead off their reports and pronouncements on higher education with population ratios (and national rankings) drawn from OECD’s Education at a Glance, and assume these ratios were passed down from Mt. Sinai as the tablets by which we should be judged.
The population ratios, particularly those concerning higher education participation and attainment for the 25-34 age cohort, well serve the preferred tendency of these august bodies and their reports to engage in a national orgy of self-flagellation that purposefully neglects some very basic and obvious facts.
To be sure, U.S. higher education is not doing as well as we could or should in gross participation and attainment matters, but on the tapestry of honest international accounts, we are doing better than the propaganda allows. When you read reports from other countries’ education ministries that worry about their horrendous dropout rates and problems of access, you would think they don’t take population ratios seriously.
Indeed, they don’t, and one doesn’t need more than 4th grade math to see the problems with population ratios, particularly in the matter of the U.S., which is, by far, the most populous country among the 30 OECD member states.
None of our domestic reports using OECD data bothers to recognize the relative size of our country, or the relative diversity of races, ethnicities, nativities, religions, and native languages -- and the cultures that come with these -- that characterize our 310 million residents. Though it takes a lot to move a big ship with a motley crew, these reports all would blithely compare our educational landscape with that of Denmark, for example, a country of 5.4 million, where 91 percent of the inhabitants are of Danish descent, and 82 percent belong to the same church.
For an analogous common-sense case, Japan and South Korea don't have to worry about students from second-language backgrounds in their educational systems. Yes, France, the UK, and Germany are all much larger and more culturally diverse than Denmark, but they offer nowhere near the concentration of diversities found in the U.S. It's not that we shouldn't compare our records to theirs; it's just that population ratios are not the way to do it.
OECD has used census-based population ratios to bypass a host of inconsistencies in the ways its 30 member countries report education data, but, as it turns out, the 30 member countries also employ different census methodologies, so the components of the denominator from Sweden are not identical with the components of the denominator from Australia. With the cooperation of UNESCO and Eurostat’s European Union Labor Force Survey, and occasionally drawing on microdata from what is known as the Luxembourg Income Study, OECD has made gallant efforts to overcome the inconsistencies, but you can’t catch all of them.
When ordinary folk who have no stake in education propaganda look at those 30 countries and start asking questions about fertility rates, population growth rates, net immigration rates, and growth in foreign-born populations, they cannot help but observe that the U.S. lives on another planet. Only 4 countries out of the 30 show a fertility rate at or greater than replacement (2.0): France, New Zealand, Mexico, and the U.S. -- and of these, Mexico has a notable negative net migration rate. Out of those 30 countries, 7 have negative or zero population growth rates and another 5 show growth rates that might as well be zero. On the other hand, the U.S. population growth rate, at 0.9 percent, is in the top five. In net immigration through 2008, only Australia, Canada, and Ireland were ahead of us (and we count only legal immigrants). Triangulating net immigration, one can examine the percentage growth in foreign-born populations over the past 15 years. In this matter, the Migration Policy Institute shows the U.S. at 45.7 percent -- which is more than double the rate for Australia and Canada (I don't have the figures for Ireland).
It is no state secret that our immigrant population is (a) young, (b) largely schooled in other countries with lower compulsory schooling ages, and (c) pushing the U.S. population denominator up in the age brackets subject to higher education output analysis. Looking ahead to 2025 (the College Board's target "accountability" date), Census projections show an increase of 4.3 million in the U.S. 25-34 age bracket. Of that increase, 74 percent will be Latino, and another 12 percent Asian. Can you find another country, OECD or otherwise, where an analogous phenomenon is already in the cards -- or is even somewhere in the deck, waiting to be dealt? As noted: the U.S. lives on a different demographic planet.
We are often compared with Finland in higher education matters -- and to our considerable disadvantage. I will give the Finnish education system a lot of credit, particularly in its pre-collegiate sector, but the comparison is bizarre. Like Denmark, Finland is a racially and linguistically homogeneous (mandatorily bilingual, to be sure, in Finnish and Swedish) country of 5 million, with a population growth rate of 0.1 percent and a net immigration rate of 1 percent (principally from Eastern Europe).
In the 1990s, Finland increased the capacity of its higher education system by one-third, opening 11 new polytechnic institutions known as AMKs (for the U.S. to do something equivalent would require establishing 600 new AASCU-type 4-year colleges). So the numerator of participation in higher education increased considerably, bolstered by fully-subsidized tuition (surprise, anyone?), while the denominator remained flat. Last time you looked, what happens to percentages when numerators rise and denominators don’t?
And there is more to the Finnish comparison: the median age of entrance to higher education in Finland is 23 (compared with 19 in the U.S.) and the median age at which Finnish students earn bachelor’s degrees is 28 (compared with 24-25 in the U.S.). In our Beginning Postsecondary Students longitudinal study of 1995-2001, those entering 4-year colleges in the U.S. at age 23 or higher constituted about 5 percent of 4-year college entrants, and finished bachelor’s degrees within 6 years at a 22 percent rate (versus 65 percent for those entering directly from high school). Is comparing Finnish and U.S. higher education dynamics a fair sport? If you left it up to the folks who produced the Spellings Commission report, Measuring Up, and Coming to Our Senses, it is.
International data comparisons on higher education are very slippery territory, and nobody has really mastered them yet, though Eurostat (the statistical agency for the 27 countries in the European Union) is trying, and we are going to hear more about that at a plenary session panel of our Association for Institutional Research next June. What does one do, for example, with sub-baccalaureate degrees such as our "associate"? Some countries have them -- they are often called "short-cycle" degrees -- and some don't. In some countries they can be considered terminal degrees (as we regard the associate), in other countries they are not considered higher education at all, and in still others they are regarded as part of the bachelor's degree.
Instead of or in addition to "short-cycle" degrees, some countries offer intermediate credentials such as the Swedish Diploma, awarded after the equivalent of two-thirds of a baccalaureate curriculum. Are these comparable credentials? What's counted and what is not counted varies from country to country. I just finished plowing through three German statistical reports on higher education from different respected German sources in which the universe of "beginning students" changed from table to table. A German friend provided a gloss on the differences, but the question of what gets into the official reporting protocol went unanswered. You can be sure that the people who put together the Spellings Commission report, Measuring Up, and Coming to Our Senses never thought about such things.
Why is all this important? First, to repeat the 4th grade math, which Jane Wellman tried to bring to the attention of U.S. higher education with her Apples and Oranges in the Flat World, issued by ACE last year. When denominators are flat or declining and numerators remain stable or rise slightly, percentages rise; and vice versa when denominators rise faster than numerators. So if you use population ratios, and include the U.S., it's going to look like we're "declining" -- which is the preferred story of the public crisis reports. Ironically, trying to teach basic math and human geography to the U.S. college-educated adults who wrote these reports is like talking to stones. They don't want to hear it. Wellman made a valiant effort. So did Kaiser and O'Heron in Europe in 2005 (Myths and Methods on Access and Participation in International Comparison. Twente, NL: Center for Higher Education Policy Studies), but we're going to have to do it again.
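The 4th grade math is worth making concrete. A toy calculation (all numbers entirely hypothetical, chosen only to illustrate the mechanism) shows how a country can produce more graduates than a peer and still appear to "decline" in a population-ratio league table:

```python
# Toy illustration of the population-ratio problem: the same kind of
# "attainment rate" moves in opposite directions depending on what
# happens to the denominator. All figures are hypothetical.

def attainment_rate(graduates, population):
    """Return graduates as a percentage of the age-cohort population."""
    return 100.0 * graduates / population

# Country A: flat denominator, slightly rising numerator -> rate climbs.
rate_a_before = attainment_rate(300_000, 1_000_000)   # 30.0
rate_a_after  = attainment_rate(320_000, 1_000_000)   # 32.0

# Country B: the numerator grows MORE in absolute terms, but a young,
# immigration-fed population pushes the denominator up faster still.
rate_b_before = attainment_rate(300_000, 1_000_000)   # 30.0
rate_b_after  = attainment_rate(350_000, 1_250_000)   # 28.0

# Country B added 50,000 graduates to Country A's 20,000,
# yet B's "rate" fell while A's rose.
print(rate_a_after, rate_b_after)
```

That is the whole trick: rank countries by the second number and Country B "slips," even though it out-produced Country A in absolute graduates.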
Second, it’s like the international comparisons invoked by business columnists. The BRIC (Brazil, India, China, and Russia) countries’ GDPs have been growing much faster than ours (though some are now declining faster than ours), but none of those GDPs save that of China match the GDP of California. It’s that big ship again: the U.S. starts with a much higher base---of everything: manufacturing, productivity, technological innovation. Both growth and contraction will be slower than in economies that start from a much lower base. Where we have demonstrable faults, the most convincing reference points for improvement, the most enlightening comparisons, are to be found within our systems, not theirs. So it is with higher education, where the U.S. massified long before other countries even thought about it. Now, in a world where knowledge has no borders, if other countries are learning more, we all benefit. The U.S. does not---and should not---have a monopoly on learning or knowledge. Does anyone in the house have a problem with this?
Third, OECD itself understands the limitations of population ratios for education a lot better in 2008 than it did a scant five years ago, and is now offering such indicators as cohort survival rates in higher education. I had hoped the authors of Measuring Up 2008 might have used those rates, and read all the footnotes in OECD's 2008 Education at a Glance, so that one could see what was really comparable with what. Had they done so, they would have seen that our 6-year graduation rate for students who started full-time in a 4-year college and who graduated from any institution (not just the first institution attended) is roughly 64 percent, which, compared with other OECD countries that report the same way (e.g., the Netherlands and Sweden), is pretty good (unfortunately, you have to find this datum in Appendix 3 of Education at a Glance 2008). In Coming to Our Senses, the College Board at least read the basic cohort survival rate indicator, 58 percent, but didn't catch the critical footnote that took it to 64 percent, or the footnotes on periods of reporting (Sweden, for example, uses a 7-year graduation marker, not 6). Next time, I guess, we'll have to make sure the U.S. footnotes are more prominent.
Driving this new sensibility concerning cohort survival rates, both in OECD and Eurostat, is the Bologna Process in 46 European countries, under which, depending on the country, anywhere from 20 percent to 80 percent of university students are now on a 3-year bachelor's degree cycle. Guess what happens to the numerator of graduation rates when one moves from the old four- and five-year degrees to the new three-year cycle? Couple this trend with declining population bases (the UK, for example, projects a drop of 13 percent in the 18- to 20-year-old population going forward), and some European countries' survival rates will climb to stratospheric levels. We'll be complaining about our continual international slippage well into the 2030s. That will suit the crisis-mongers just fine, except none of it will help us understand our own situation, or where international comparisons truly matter.
And that’s the fourth -- and most important -- point. The numbers don’t help us do what we have to do. They steer us away from the task of making the pieces of paper we award into meaningful documents, representing learning that helps our students compete in a world without borders. Instead of obsession with ratios, we should look instead to what other countries are doing to improve the efficiency and effectiveness of their higher education systems in terms of student learning and enabling their graduates to move across that world. In this respect the action lines of the Bologna Process stand out: degree qualification frameworks, a “Tuning” methodology that creates reference points for learning outcomes in the disciplines, the discipline-based benchmarking statements that tell the public precisely what learning our institutions should be accountable for, Diploma Supplements that warrantee student attainment, more flexible routes of access, and ways of identifying under-represented populations and targeting them for participation through geocoding.
These features of Bologna are already being imitated (not copied) in Latin America, Australia, and North Africa. Slowly but surely they are shaping a new global paradigm for higher education, and in that respect, other countries are truly doing better. Instead of playing with slippery numbers and glitzy rankings, we should be studying the substance of Bologna -- where it has succeeded, where our European colleagues have learned they still have work to do, where we can do it better within our own contexts -- perhaps experiencing an epiphany or two about how to turn the big ship on which we travel into the currents of global reform.
Now that would be a constructive use of international comparisons.
Clifford Adelman’s The Bologna Club: What U.S. Higher Education Can Learn from a Decade of European Reconstruction and Learning Accountability from Bologna: a Higher Education Policy-Primer can be found on the Web site of the Institute for Higher Education Policy, where he is a senior associate. The analysis and opinions in this essay are those of the author, and should not be interpreted as reflecting those of the institute.
In February 2009, at a meeting of the American Council on Education, I challenged a group of university presidents and other leaders of higher education to focus on the need for greater innovation in higher education. I encouraged those leaders to heed the lesson offered by George Romney to the auto industry in the 1970s to innovate or lose their advantage: “There is nothing more vulnerable than entrenched success,” he said. I followed up in October 2009 with an article in Newsweek entitled "The Three-Year Solution: How the reinvention of higher education benefits parents, students, and schools."
The response has been pleasantly surprising.
Over the past year and a half, a growing number of institutions of higher education came forward with proposals to offer three-year degrees to their students. Here are a few examples:
Grace College, in Winona Lake, Ind., is offering an accelerated three-year degree in each of its 50-plus major areas of study. Dr. Ronald Manahan, Grace's president, cites the cost of college as a driving force behind the decision. “We have listened to people’s concerns about [the cost of] higher education and we are answering them,” he said.
Chatham University, in Pittsburgh, Pa., is offering a three-year bachelor of interior architecture without summer classes, allowing students to enter the job market a year earlier. School officials reconfigured the four-year degree by cutting studio classes from 14 weeks to just seven; compared with similar programs, these students graduate two years earlier.
Texas Tech University, in Lubbock, Tex., is offering an accelerated three-year medical degree, rather than the usual four. The program is aimed at making it easier and more affordable for students to become family doctors.
As institutions of higher education look into the possibility of offering a three-year degree, some have run into federal policies that seem to interfere with their ability to innovate. For example, this May I received a letter from Jimmy Cheek, chancellor of the University of Tennessee-Knoxville, describing a potential student-loan obstacle to a three-year degree.
Here’s the issue: Under the Higher Education Act, student loan limits are tightly set to prevent over-borrowing by students. Federal annual loan limits and lifetime loan limits establish a maximum amount one can borrow under the federal student loan program. The annual loan limits are designed to pay for two semesters per year (see chart below).
Example: Scheduled Academic Year

Scheduled Academic Year 1: Fall 2010 and Spring 2011
Scheduled Academic Year 2: Fall 2011 and Spring 2012
Scheduled Academic Year 3: Fall 2012 and Spring 2013
Scheduled Academic Year 4: Fall 2013 and Spring 2014
For most institutions of higher education, and most students, this works and makes sense. But three-year degree students often take a third semester's worth of classes over the summer. The federal limits appear to prevent students from obtaining a loan to pay for those summer courses.
Fortunately, there is a solution. Working with the Congressional Research Service, and the staff of the U.S. Department of Education, my office has identified an option that exists under current regulations to give flexibility on these loan limits to institutions of higher education and students. Instead of following a standard “Scheduled Academic Year” as outlined above, an institution of higher education offering a three-year degree could award loans to students through a “Borrower-Based Academic Year," per the chart below:
Example: Borrower-Based Academic Year

Borrower-Based Academic Year 1: Fall 2010 and Spring 2011
Borrower-Based Academic Year 2: Summer 2011 and Fall 2011
Borrower-Based Academic Year 3: Spring 2012 and Summer 2012
Borrower-Based Academic Year 4: Fall 2012 and Spring 2013
This option would use the same annual loan limits and lifetime loan limits, but compress them to match the student’s academic schedule. Compared to the typical “Fall-Spring” academic year over each of the four years, a three-year degree program could use a “Fall-Spring, Summer-Fall, Spring-Summer” structure to allow for a compressed academic schedule.
I have been told that this “Borrower-Based Academic Year” option is currently not well used because it is administratively complicated for institutions to offer both “Scheduled Academic Year” and “Borrower-Based Academic Year” loan structures at the same time for individual students. But for an institution that offers a comprehensive three-year degree program involving a number of students, this seems to make sense as a way of helping students in that program afford the tuition and fees.
I have asked Chancellor Cheek to let me know if this option would work for the University of Tennessee, or if more flexibility needs to be added. When Congress last reauthorized the Higher Education Act in 2008, we made several changes to the Pell Grant program to allow that funding to be used on a year-round basis. There is no reason students should not have that same flexibility with their student loans.
It is my hope that more institutions will explore innovative ways to provide a high-quality postsecondary education. The three-year degree is one idea for some well-prepared students, but it is vital to our competitiveness as a nation that we develop other ideas to improve the efficiency of higher education and expand access to more Americans.
Institutions of higher education are rightly feeling pressure from parents, students, state and local leaders, the business community, Congress, and the Obama administration to do a better job of providing more Americans with a quality college education at an affordable price. That pressure will likely grow more intense every year as more jobs require higher education, advanced certificates, or technological skills from their applicants.
Some have asked whether all colleges and universities should be required to offer a three-year degree. My answer is a resounding no. Just as the hybrid car isn’t for everyone, all students and all institutions won’t want a three-year degree. The last thing we need is more federal mandates on higher education.
The strength of our higher education system is that we have 6,000 independent, autonomous institutions that compete in the marketplace for students. It is that marketplace that needs to develop the new ideas for the future -- and not become a victim of its own “entrenched success" -- so that our students, and our country, can continue to thrive.
Senator Lamar Alexander
Sen. Lamar Alexander (R-Tenn.) is chairman of the Senate Republican Conference and a member of the Senate Committee on Health, Education, Labor and Pensions. He served as U.S. secretary of education under President George H.W. Bush and as president of the University of Tennessee.
When college presidents and other higher education leaders talk about federal policy these days, the most common theme is dismay at proposed new regulations from the Department of Education. But a close second is the inadequacy of data from the Education Department’s Integrated Postsecondary Education Data System (IPEDS) for evaluating anything.
This is a problem that has vexed us for years, and it's time for us to do something about it.
Every sector is affected. Colleges with many students transferring out to other colleges complain that even when those men and women graduate from the second institution, they still count as failures for their first college.
Universities with large numbers of entering transfer students know that even when they graduate they will not count as successes anywhere in IPEDS accounting. Juniors entering with degrees from community colleges will not help the statistics of their new university when they receive a B.A. or B.S.
Colleges with large percentages of part-time and commuter students know that they normally take longer than six years to graduate. Everyone reminds each other that a large percentage of Americans graduating from college now have credits from more than one institution, often more than two institutions. Many people taking courses at community colleges do not intend to complete degree programs.
Yet six-year graduation rates from the first point of entry are the only figures we seem to have for evaluating completion success. IPEDS data are not merely unhelpful for management purposes; they can be outright dangerous for policy making, particularly if they lead to conclusions that whole segments of our country can be written off as not college-worthy. The figures are least reliable for low-income populations, who do have to "stop out" some semesters, who are more likely to attend part time, more likely to need time for pre-college courses because of weak high schools, more likely to transfer, and more at risk.
So, rather than leaving this for the U.S. Department of Education to fix, I am challenging colleagues in higher education to design an alternative system that is more valid, reliable, and useful.
My institution, Heritage University, in the Yakima Valley of Washington, is one of the institutions fully committed to creating opportunities for a region’s underserved, low-income, largely minority and almost entirely first-generation-college population that, by and large, has not been well-prepared by local high schools. Until my arrival last summer, Heritage’s founding and only president, Kathleen Ross, had for 28 years been building an inspirational learning environment with thousands of success stories from that population. Many of those graduates are not only productive citizens but also leaders in the Pacific Northwest, reaching goals no one would have imagined possible for them before they came to Heritage.
Most Heritage students, to be sure, do need pre-college developmental work; almost all have to hold jobs; many have to “stop out” for a semester from time to time. Some 70 percent are women, many of them single parents determined to raise their families up out of poverty. Graduation figures in the IPEDS data for those who entered at the start of the last decade look miserable at first glance, something like 18 percent in six years. A certain portion of that deficiency derives from Heritage having had in its early years an enrollment policy a bit too close to open enrollment for a college with high standards.
The history of Heritage has been, in effect, a search to understand which students can be remediated to do rigorous college work and which, despite a high school diploma and a respectable grade point average, lack the academic skills and work ethic to succeed. As a consequence Heritage, now with much time-tested data at its disposal, is advising a number of applicants in other directions; is developing stronger pre-college modules for those with ability and commitment to succeed; and is investing in robust advising to complement academic rigor.
One might hope that Rich Vedder, who in a recent Forbes blog post suggested Heritage might best be shut down for wasting Pell Grant dollars, would reconsider that conclusion and decide that Heritage is actually a very good Pell investment in America’s future.
For if he and others study the data more closely, they'll also learn that of those students who actually matriculated as full-fledged freshmen between 2003 and 2005 -- that is, students who had completed any necessary remedial work -- the eight-year graduation rate was 41 percent, not including those who transferred to another college. Of those who successfully became sophomores at Heritage, the graduation rate was 81 percent. Of those who became juniors, as well as those who transferred in as juniors from community colleges, the graduation rate was 81 percent. In each of those last three data sets, Heritage University compares quite favorably with other colleges serving comparable populations. Hundreds of other colleges, moreover, have good stories to tell if they can use metrics truer and more relevant to actual performance than the IPEDS data.
So Heritage is now developing a metric to assign to every entering student -- based on credits transferred, remediation needed, and planned full-time or part-time schedule -- a predictive graduation date, a benchmark against which success can be measured, with a factor also to account for those known to have transferred to another college.
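As a rough sketch of the kind of benchmark such a metric implies (the structure, names, and numbers below are hypothetical illustrations, not Heritage's actual formula), the core idea reduces to projecting semesters-to-degree from a student's starting position:

```python
# Hypothetical sketch of a per-student predictive graduation benchmark,
# driven by the three inputs the essay names: credits transferred,
# remediation needed, and planned full-time or part-time load.

from dataclasses import dataclass

CREDITS_FOR_DEGREE = 120  # illustrative bachelor's-degree requirement

@dataclass
class EnteringStudent:
    credits_transferred: int   # credits accepted at entry
    remedial_semesters: int    # pre-college work before degree credit begins
    credits_per_semester: int  # planned load (full-time ~15, part-time ~6-9)

def predicted_semesters(student: EnteringStudent) -> int:
    """Semesters to degree: the benchmark against which to measure success."""
    remaining = CREDITS_FOR_DEGREE - student.credits_transferred
    degree_semesters = -(-remaining // student.credits_per_semester)  # ceiling
    return student.remedial_semesters + degree_semesters

# A junior transferring in 60 community-college credits at full-time load:
print(predicted_semesters(EnteringStudent(60, 0, 15)))  # 4 semesters
# A part-time freshman needing one semester of developmental work:
print(predicted_semesters(EnteringStudent(0, 1, 9)))    # 15 semesters
```

Measuring each graduate against his or her own predicted date, rather than a flat six-year clock, is what lets a part-time single parent and a full-time transfer junior both count as successes when they finish on their respective schedules.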
This is the time, however, to challenge all of us in higher education -- the presidential associations, those who oversee accreditation, and other higher education organizations -- to come together to propose an alternative to IPEDS, or at least a parallel system, that colleges and universities themselves find useful for management and that policy makers can trust.
It must account for transfer patterns, for differential rates of progress among low-income populations, for developmental needs of students, and for the wide array of kinds of institutions in American higher education. It is complex but it is doable. It will give all of us a better system for measuring completion success rates.
John Bassett is president of Heritage University, in Yakima, Wash.
Most of the action in American educational policy happens in the states. Their governments are primarily responsible for elementary and secondary education, and the vast majority of students in the United States attend public institutions that are also funded and governed primarily at the state level. So any efforts to improve the interaction between the public school system and higher education, and to ease students' transition from one to the other to ensure their academic success, will live and die largely at the state level.