The list of complaints about the statewide standardized exams that most states have adopted as high school accountability measures is long: teachers teach to the test, the standards are pegged to the lowest common denominator, etc. But a new study suggests that another complaint might be added in some states: contributing to grade inflation for college-bound students who do well on the tests.
And that finding, if borne out, could complicate the already significant problems of college admissions officers trying to decide among many seemingly highly qualified candidates.
The working paper, which was written by George Mason University's Patrick D. Marquardt and published on the Social Science Research Network, examines the impact that Virginia's Standards of Learning -- and particularly changes that the state made to encourage high school students to take the tests seriously -- had on the average high school grade point average of students who attended Virginia's public colleges.
Virginia implemented its statewide high school tests in 1998, but after many schools' students fared poorly on the high-stakes exams in their first years, the state, hoping to encourage more students to take them seriously, required all students to pass a certain number of SOL exams to graduate. Marquardt's paper, though, focuses on changes that school districts quietly made to encourage student participation, often involving grade-based incentives. Some districts, Marquardt says, gave students who passed an exam a grade uptick (from B to B+, say), while others -- among the most extreme -- let students use the SOL in a particular subject as their final exam, earning an A if they passed it.
Marquardt's thesis is that those changes drove up the high school GPAs of Virginia's college-going students. To examine it, he compared data on the academic credentials of freshmen at Virginia's public colleges with those from 38 similar public institutions in 28 states that had not adopted "end of course" standardized examinations like Virginia's SOL.
The data show a much sharper rise in Virginia than elsewhere. From 1995 to 2007, the average high school GPA of enrolled freshmen in the commonwealth rose from 3.27 to 3.56, an increase of 8.9 percent, or about 0.71 percent a year. In the national sample, by comparison, the average GPA grew to 3.49 from 3.28, a rise of 6.4 percent, or about 0.5 percent a year.
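(For readers who want to check the arithmetic: taking the endpoint GPAs quoted above and assuming the 12-year 1995-2007 span, a quick sketch of the total and compound annual growth calculations looks like this.)

```python
# Check of the GPA growth figures, using the endpoint averages
# quoted in the article and an assumed 12-year span (1995-2007).

def growth(start, end, years):
    """Return (total percent change, compound annual percent rate)."""
    total = (end - start) / start * 100
    annual = ((end / start) ** (1 / years) - 1) * 100
    return total, annual

va_total, va_annual = growth(3.27, 3.56, 12)   # Virginia freshmen
us_total, us_annual = growth(3.28, 3.49, 12)   # national sample

print(f"Virginia: {va_total:.1f}% total, {va_annual:.2f}% per year")
print(f"National: {us_total:.1f}% total, {us_annual:.2f}% per year")
```

Virginia's endpoints work out to roughly 8.9 percent total growth versus 6.4 percent for the national sample, matching the gap the paper emphasizes.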
Marquardt explores other possible theories for the increased GPAs. He notes that students in the national sample showed significant upticks in their grades, too, perhaps attributable to "increased pressure to attend college from students’ parents, teachers, and peers, the growing wage differential between high school and college graduates, and the growth in the population of college-bound students compared to the limited supply of college seats."
And in Virginia, he notes, more school districts now reward students who take AP and honors classes with half letter grade increases, and some have shifted to a 10-point grading scale (including pluses and minuses) from a 6-point scale. But those changes unfolded at various points throughout the 12-year period, and are unlikely to explain the sharp and "discontinuous" jumps that occurred in 2000 and 2004, the two points at which Virginia ramped up the stakes for schools based on their performance on the SOL.
One other factor could explain why students at Virginia's universities showed a much sharper rise in high school GPA over this period: more of the commonwealth's best students opting to stay close to home for college. If that were the case, though, Marquardt argues, their SAT scores would have risen along with the GPAs, and the data he collected show otherwise. While Virginia's students scored 17 points higher on average than the national students in his sample during the 1995-99 period, their scores averaged 32 points below the national sample's during the 2004-7 period.
"The separation over time between GPAs and SAT scores of Virginia university students as compared to the national trend suggests that in-state enrollment of Virginia’s top high school students has not significantly impacted the average high school GPA statistic of universities in Virginia," Marquardt writes.
(While his study did not look at individual colleges in Virginia, a review of the Common Data Set statistics at James Madison University, a public institution in Virginia, appears to lend some credence to his thesis. The proportion of Madison's full-time freshmen in 2008-9 with a high school GPA of 3.5 or above was 71 percent, compared to 67.8 percent in 2005-6, the earliest year for which data are available in that format on the university's Web site. SAT scores, meanwhile, held constant over that time. And data from 1998 show that 87.2 percent of freshmen had GPAs of 3.0 or better that year, a figure that had risen to 98 percent by 2008. Again, SAT scores held constant.)
What are the implications of Marquardt's findings? "By encouraging all students to take the SOL exams seriously, steps taken by educators to increase grades for students to participate fully in the SOL examination process may have distorted high school GPAs as a signaling device for college admission," he writes.
"In raising stronger students’ course grades for passing exams meant to improve underperforming students, educators may have made the college in-state admission process more competitive at the state, district, and school level. As Virginia students competed for in-state college admission, some students may have been advantaged (disadvantaged) by increasing (not changing) GPAs as a result of school district policy related to the use of SOL results in students’ final course grades."
Although he doesn't note it, Virginia's policy could be seen as having national implications, too, especially for Virginians who applied to colleges elsewhere that emphasize high school grade point average in admissions.