Ronald Reagan famously used the phrase “Trust, but verify” to describe his posture toward discussing nuclear disarmament with the Soviet Union.

His use of that phrase was brilliant on a couple of levels. Talking about trusting an adversary was on one level an expression of good faith, but adding verification made it clear that any idealism was also tempered with a dose of realism. The additional genius of the phrase as applied to the Soviets was that it was adapted from a Russian proverb, “Doveryai, no proveryai.”

The Operation Varsity Blues scandal trials currently taking place serve as a reminder of the fine line between trust and verification in college admission. Colleges trust applicants to be honest and truthful in what they list on their applications. While we wouldn’t want that to change, Operation Varsity Blues serves as a cautionary tale. The widespread fraud, including constructing elaborate and false athletic résumés for sports the students involved didn’t even play, was not uncovered by admission offices. Fool me once, shame on you. Fool me twice …

The trial is not the only “ripped from the headlines” story that provides a test case for the interplay of trust and verification. Last week U.S. News & World Report published its annual “America’s Best Colleges” rankings. I have since received numerous emails from colleges trumpeting their ranking, and my local newspaper has published its annual story highlighting small changes in local institutional rankings as if they signified major news.

This year there was considerable speculation about how U.S. News would treat test scores in its rankings recipe, given the rise of test-optional policies during the last admissions cycle. U.S. News resisted calls to remove consideration of test scores from the formula. Colleges receive full credit for test scores if at least 50 percent of entrants reported scores (the threshold had previously been 75 percent). Colleges where fewer than 50 percent of entrants submitted scores received a 15 percent discount in the impact of scores on their rankings. According to U.S. News, that discount affected 4 percent of institutions.
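To make the mechanics concrete, here is a minimal sketch of how such a rule could work, assuming the test-score factor carries some fixed weight in the overall score. The weight and the function itself are my own illustration, not U.S. News’s published formula.

```python
def test_score_weight(base_weight: float, share_submitting: float) -> float:
    """Hypothetical sketch: schools where fewer than 50 percent of
    entrants submitted scores have the test-score factor's impact
    discounted by 15 percent. The base weight is an assumed value,
    not U.S. News's actual methodology."""
    if share_submitting >= 0.50:
        return base_weight  # full credit
    return base_weight * 0.85  # 15 percent discount

# Example: a school where only 40 percent of entrants submitted scores
print(test_score_weight(0.05, 0.40))  # 0.0425 rather than 0.05
```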

The focus on how many places Wossamotta U (alma mater of Bullwinkle J. Moose) may have moved up or down in the rankings, and the attention given to minor changes in the U.S. News methodology, may be obscuring a more important issue.

Over the weekend I was doing research on the relationship between admissions selectivity (rejectivity may be the better term) and prestige, thinking about how the number of applications and the admit rate drive institutional behavior. In the course of that research, I stumbled upon a U.S. News list of the Top 100 colleges with the lowest acceptance rates according to the 2022 rankings.

That list included eight institutions identified as having admit rates below 20 percent that I found surprising. Alice Lloyd College in Pippa Passes, Ky., was listed as having a 7 percent admit rate, making it seemingly as selective as the Massachusetts Institute of Technology and Yale University. The other surprises included the University of Science and Arts of Oklahoma (13 percent); the College of the Ozarks in Missouri, Limestone University in South Carolina and Ottawa University in Kansas, all at 14 percent; Wiley College in Texas and Bacone College in Oklahoma (both 15 percent); and Texas Wesleyan University (19 percent).

As already stated, I was surprised by, and perhaps even suspicious of, those numbers. All are regional institutions that serve a valuable role in the landscape of higher education, but it seems odd that they would be as selective as the national universities and liberal arts colleges that populate the U.S. News list.

Eight or nine years ago, I recall some colleges doing creative accounting to lower their admit rates by counting inquiries as applications. At that time one college corrected the data it had reported on applications received and students admitted, a correction that changed its admit rate from 27.4 percent to 89.1 percent. That institution explained the discrepancy as “counting in a different way.” U.S. News subsequently moved that college into the “unranked” category. For the record, I wish U.S. News would place all colleges and universities in the unranked category.

I was intrigued by the reported low admit rates for those eight schools and decided to follow up by comparing the U.S. News data with the data for each school on the Common Data Set (a collaborative initiative jointly sponsored by the College Board, U.S. News and Peterson’s) and IPEDS (Integrated Postsecondary Education Data System), an arm of the National Center for Education Statistics that is part of the U.S. Department of Education. Any institution receiving federal aid is required to report data in a number of areas, and I assume there are significant consequences for reporting false information.
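For readers who want to follow the same arithmetic, the two quantities at issue throughout this piece are simple ratios. The figures below are invented for illustration, not any college’s actual numbers.

```python
def admit_rate(applicants: int, admitted: int) -> float:
    """Admit (acceptance) rate: the share of applicants offered admission."""
    return admitted / applicants

def yield_rate(admitted: int, enrolled: int) -> float:
    """Yield: the share of admitted students who actually enroll."""
    return enrolled / admitted

# Hypothetical example: 2,000 applicants, 560 admitted, 200 enrolled
print(f"{admit_rate(2000, 560):.0%}")  # 28%
print(f"{yield_rate(560, 200):.0%}")   # 36%
```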

It will probably not surprise readers that I found discrepancies between what U.S. News shows and what was reported to IPEDS; there would be no reason to write about this if all the data squared. With a couple of exceptions, the IPEDS reporting for each college varies significantly from the U.S. News figures.

According to the IPEDS data for 2019-20, Alice Lloyd’s admit rate is 28 percent, not 7 percent. Limestone’s is 51 percent rather than 14 percent, Bacone’s is 72 percent rather than 15 percent, Texas Wesleyan’s is 42 percent, not 19 percent, and the University of Science and Arts in Oklahoma’s is 36 percent rather than 13 percent. Wiley College is listed in IPEDS as being open enrollment. That’s quite an accomplishment -- an open-enrollment institution with a 15 percent admit rate.

There are two outliers among the outliers, both of which share an interesting characteristic. Ottawa University in Kansas actually shows up on the U.S. News Top 100 list twice, once at 14 percent and once at 24 percent. Ottawa has an online component as well as satellite campuses in Overland Park, Kan.; Milwaukee; Phoenix; and Surprise, Ariz. The main campus reports an admit rate of 15 percent but a yield rate of 66 percent.

The College of the Ozarks in Point Lookout, Mo., a conservative Christian institution that brands itself as “Work Hard U,” has a similarly interesting statistical anomaly. Its reported admit rate on IPEDS is 10 percent, actually lower than the rate U.S. News credits it with, but it also reports a yield of 91 percent. I’m by no means a statistical expert, but that extremely low admit rate and extremely high yield rate suggest an admissions process quite different from that at most other institutions.
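To see why that combination stands out, consider a hypothetical cohort built from the reported percentages; the applicant count below is invented purely for illustration.

```python
# Hypothetical cohort matching the reported rates (applicant count invented)
applicants = 1000
admitted = round(applicants * 0.10)  # 10 percent admit rate -> 100 offers
enrolled = round(admitted * 0.91)    # 91 percent yield -> 91 enrollees

print(admitted, enrolled)  # 100 91
# Nearly every admitted student enrolling is rare even among the most
# selective institutions, which points to heavy self-selection or an
# unusual way of counting "applications."
```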

I contacted U.S. News to ask whether there was an explanation for the discrepancies. A spokesperson responded by pointing out that “acceptance rate is not part of the methodology” and added this note about U.S. News’ approach to quality assurance:

“For quality assurance, rankings data that schools reported to U.S. News were algorithmically compared with previous years’ submissions to flag large change statistical outliers. Respondents were required to review, possibly revise and verify any flagged data to submit their surveys. For the third year in a row, they were also instructed to have a top academic official sign off on the accuracy of the data. Schools that declined to do this step could still be ranked but display a footnote on their U.S. News profile on usnews.com. After submitting, U.S. News assessed the veracity of data submitted on a factor-by-factor level and contacted select schools to confirm or revise data. Schools that did not respond or were unable to confirm their data’s accuracy may have had the data in question unpublished and unused in the calculations.”

If I am reading that correctly, U.S. News uses an algorithm that flags large changes in data from year to year, and then has institutions revise data as needed. But what about data that don’t change dramatically? Does U.S. News attempt to verify all the information submitted (which would obviously be a huge job) or does it operate on an honor system, trusting that institutions will answer truthfully?

The larger issue here is not whether acceptance rate is part of the ranking methodology, but why the U.S. News data don’t match the IPEDS data. Are the admit data an anomaly, or are there other questionable data U.S. News uses in its ranking methodology? Where is the line between trust and verification? And should we trust rankings based on data that are self-reported and unverified?

Editor’s Note: Inside Higher Ed reached out to U.S. News for comment about this column, and Robert Morse, who leads the rankings project at the magazine, replied via email, “When schools submit data to U.S. News, they are instructed to have a top academic official sign off on the accuracy of the data.”
