The Evolving SAT Debates
The College Board is changing a longstanding policy and, starting with the high school class of 2010, will allow students who take the SAT multiple times to decide how many and which scores will be visible to colleges. Current policy requires that all scores be reported.
Because wealthier students are most likely to use coaching services and to take the test multiple times, the existing policy has been viewed by many educators as something of a limit on the advantage that the SAT gives to wealthy students, since an admissions officer would at least be aware of the likelihood of coaching.
The change in policy comes at a time of heightened scrutiny for the SAT, which more and more colleges are dropping as an admissions requirement and which has just undergone a significant overhaul, with the addition of a writing test, among other changes. Just last week, the College Board released new "validity" studies that board leaders hailed as a great success but that showed no meaningful improvement in the reliability of the test in predicting the success of students as freshmen.
Three economists at the University of Georgia, in a project not supported by the College Board, were last week putting the finishing touches on a research paper about the new SAT and its predictive value at their institution.
Because of the College Board's announcement, the Georgia scholars decided to make their data available now -- and there are figures to encourage both the College Board and its critics. Like the College Board, the economists found substantial evidence that doing better on the new SAT writing test does correlate with success in the first year of college. But the economists also found that now that the SAT writing section has been added, the verbal portion of the SAT adds no predictive value -- a finding that could resonate with those who believe the SAT has become too lengthy and expensive.
The College Board didn't issue a broad announcement on the change in policy on repeat test takers, but told the Los Angeles Times about the shift.
Laurence Bunin, senior vice president of the SAT, told the newspaper: "Students were telling us [that] the ability to have more control over their scores would make the test experience more comfortable and less stressful.... We can do that without in any way diminishing the value and integrity of the SAT."
About 15 percent of students take the SAT three or more times -- and while the College Board has said serial exam taking isn't effective, many who pay big bucks for tutoring say they see gains. The College Board waives its $45 fee for taking the test for low-income students, but they are only eligible to take the test twice without paying. There is no limit on those who can afford to pay, and the embarrassment factor of having a college see numerous scores will now be wiped out.
One restriction that will remain is that students must submit all their scores from the same test administration, so they can't mix a mathematics score from one date with a verbal score from another.
In changing its policy, the College Board is matching the approach of the ACT, the SAT's top competitor and a test that is seeing increasing use.
Robert Schaeffer, public education director of the National Center for Fair & Open Testing, via e-mail called the shift "yet another marketing move by the increasingly desperate College Board responding to the growing perception among high school students that the ACT is a more consumer-friendly test (e.g. score choice, no guessing penalty, optional essay section, less time, less money, more like the exams they take in high school)."
Study Backs Writing Test, Not Verbal
The University of Georgia study, like the College Board study released last week, was based on the first class of students who enrolled having taken the new writing test and who have now completed a full year of courses. (The College Board has always tied SAT scores to first-year performance, but not to achievement after that.)
While the College Board used a national sample, the Georgia study used a large sample (4,300 students) at one university -- one with competitive admissions and diversity in its student body. Many of the findings are consistent with the claims of the College Board about the predictive value of the new writing test. The Georgia researchers found that each 100 point increase in writing score correlated with:
- A 0.07 point increase in freshman year grade-point average.
- A 0.18 point increase in freshman year English course grades.
- A 0.44 average increase in credit hours enrolled.
- A 0.54 average increase in credits earned.
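The coefficients above can be read as per-100-point effects. A minimal sketch of that arithmetic, assuming the reported figures are linear coefficients (this is an illustration, not the study's actual regression model):

```python
# Per-100-point coefficients reported by the Georgia study for the
# SAT writing section (illustrative arithmetic only).
COEFFS = {
    "gpa": 0.07,              # freshman year grade-point average
    "credits_enrolled": 0.44, # credit hours enrolled
    "credits_earned": 0.54,   # credit hours earned
}

def predicted_difference(writing_gap, outcome):
    """Predicted outcome difference for a given SAT writing score gap."""
    return COEFFS[outcome] * (writing_gap / 100)

# A 200-point gap on the writing section implies a predicted
# freshman GPA difference of 0.07 * 2 = 0.14 points.
print(round(predicted_difference(200, "gpa"), 2))
```

For example, two otherwise-similar students whose writing scores differ by 200 points would be predicted to differ by about 0.14 GPA points in their first year.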
Also like the College Board, the Georgia researchers said they found the new writing test to be more predictive than the math or verbal section. But unlike the College Board, the Georgia researchers went much further: They compared the predictive value of the rest of the SAT now that writing has been added, and they found that the verbal test no longer adds anything. All the predictive value of the verbal test has been subsumed by the writing test.
David B. Mustard, one of the Georgia economists who wrote the paper (with Christopher M. Cornwell and Jessica Van Parys), said that the study there controlled for factors like gender, race, credit hours taken and so forth. "And when you use the writing, there's no added value to the SAT verbal, no marginal benefit or extra predictive validity."
A College Board spokeswoman, after being sent the research, said that because officials there hadn't seen it previously, she could not comment on it.