
What’s a college test score worth? An ACT-vs.-SAT dispute.

May 12, 2016 at 2:31 p.m. EDT

Now they’re pulling out the really sharp pencils.

The ACT and the College Board, overseers of the nation’s two rival college admission tests, are exchanging unusually pointed words this week over an effort to help students interpret scores on the revised SAT.

First, the College Board released a set of charts Monday showing how scores on the new SAT compare with scores on the old SAT and on the ACT. It turns out that new SAT scores run somewhat higher on the 1600-point scale than comparable results from the old version of the test. A 1300 on the new SAT, for instance, corresponds to a 1230 on the math and critical reading sections of the old test.

That meant families familiar with the old SAT scale would have to adjust their reflexive interpretations of what a score is worth. College admissions testing, after all, is an exercise in sorting. It is crucial not to overestimate or underestimate the relative strength of a given score, especially with an entirely new version of a test.


The College Board, which owns the SAT and is based in New York, also asserted that a new SAT 1300 corresponds to 27 out of a maximum 36 on the ACT.

Counselors, students and parents across the country were puzzling through these complexities when the ACT suddenly weighed in Wednesday.

Marten Roorda, chief executive of ACT, which is based in Iowa, challenged the College Board’s analysis. Here’s what he wrote:

May 11, 2016
By Marten Roorda, Chief Executive Officer, ACT
Here’s an SAT word for you: equipercentile.
Even though the College Board promised to get rid of “SAT words” on the latest version of its test, if you want to understand your new SAT scores, you’d better know what “equipercentile” means.
Let’s back up a bit to see why.
The College Board just completed the overhaul of the SAT. The new test has been administered to students on two national test dates, most recently on May 7, 2016.
The trouble for students, schools, and colleges is that it’s difficult to compare scores from the old SAT to the new SAT. If you’re asking different questions using different rules and different scoring scales, how can you compare an old SAT score from last fall with a new SAT score from this spring?
The answer is: You need sophisticated statistics. This is where “equipercentile” comes in. In short, using the College Board’s own explanation, if 75 percent of students achieve a score of X on Test A and 75 percent achieve a score of Y on Test B, then the scores X and Y are considered “concorded.”
In fact, the College Board recently has been promoting its new “SAT Score Converter,” which, it says, allows you to compare scores on the new SAT with the old SAT and with the ACT® test. However, this mathematical makeover comes with several caveats the College Board didn’t tell you about.
For example, after past SAT revisions, such as that from 2006, concordance tables were created after more than a year’s worth of data were in. One reason for this is that students who test in the fall are more likely to be seniors than those who test in the spring. Moreover, students willing to take the first iteration of a test that has undergone a major overhaul are likely quite different from the typical student.
Therefore, to get a full-and-fair sample, it’s important to get at least a full year’s worth of data to compare. With data from only the March SAT available, it’s clear that the current sample stands a significant chance of being different from the whole.
In 2006, the College Board did wait for actual results to come in—results that changed the concordance calculations. Now, not only is the College Board not waiting to make pronouncements about its own tests, it’s asserting the concordance with the ACT — which is why we have skin in the game.
To arrive at the ACT concordance, the College Board appears to have used a technique called “chained concordance,” which makes links between the new SAT and the old SAT, and then from the old SAT to the ACT. It therefore claims to be able to interpret scores from the revamped SAT relative to the tried-and-true ACT.
Speaking for ACT, we’re not having it. And neither should you.
A lot has changed in education since 2006. Linking scores from a single administration of the new SAT to the old SAT, and then to the 2006 ACT, is a bridge too far.
In 2006, the College Board and ACT worked collaboratively under the aegis of the NCAA to produce the official ACT-SAT concordance table. That work represented the gold standard in concordance, and it remains the only concordance ACT recognizes.
Now, without collaborating with ACT, the College Board has taken it upon itself not only to describe what its scores mean, but what ACT’s scores mean. That’s different from 10 years ago, and different from the standard you should expect from a standardized testing agency.
Meaningful concordance is difficult to achieve, particularly when you have tests that are so different — not only the new SAT from the old SAT, but both SATs relative to the ACT, which, for example, continues to have a science test that the SAT lacks.
ACT cannot support or defend the use of any concordance produced by the College Board without our collaboration or the involvement of independent groups, and we strongly recommend against basing significant decisions — in admissions, course placement, accountability, and scholarships — on such an interim table. Those decisions require evidence and precision far beyond what has been offered to date.
ACT remains eager to engage the higher education community in conducting a rigorous concordance between scores on the ACT and the new SAT — when the data are available. That will be in about a year.
Until then, we urge you not to use the SAT Score Converter. And not to listen to messages suggesting the old SAT and the new SAT, or even the ACT, are comparable.
For me that’s unequivocal, to use another SAT word.
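
For readers puzzling over the jargon: equipercentile linking matches scores that sit at the same percentile rank in two score distributions, and a chained concordance strings two such links together, from the new SAT to the old SAT and then from the old SAT to the ACT. The rough Python sketch below uses tiny invented samples, not data from either testing agency, chosen only so the arithmetic lands on the 1300-to-1230-to-27 example cited above; actual concordance studies rely on far larger samples and statistical smoothing.

from bisect import bisect_right

def percentile_rank(sample, score):
    # Fraction of the sorted sample scoring at or below the given score.
    return bisect_right(sample, score) / len(sample)

def equipercentile(score, from_sample, to_sample):
    # Map a score to the value sitting at the same percentile in the other sample.
    p = percentile_rank(from_sample, score)
    index = max(0, round(p * len(to_sample)) - 1)
    return to_sample[index]

# Hypothetical sorted score samples for three tests (invented for illustration).
new_sat = [1050, 1120, 1190, 1230, 1300, 1350, 1420, 1480, 1520, 1560]
old_sat = [980, 1040, 1100, 1160, 1230, 1290, 1350, 1410, 1470, 1540]
act_scores = [19, 21, 22, 24, 27, 28, 29, 31, 33, 35]

old_equivalent = equipercentile(1300, new_sat, old_sat)                # direct link: 1230
act_equivalent = equipercentile(old_equivalent, old_sat, act_scores)   # chained link: 27
print(old_equivalent, act_equivalent)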

There was more than a bit of snark in Roorda’s critique, as he appeared to be mocking the College Board’s effort to simplify vocabulary in the new version of the SAT. But beyond the snark, he was seeking to make a point about the technicalities of testing analysis.

In general, students, parents and schools trust the ACT and the College Board to get the technicalities right. Here, Roorda was suggesting that the College Board had failed to do that.

That critique drew a sharp rebuttal Thursday from the College Board’s senior vice president for research, Jack Buckley, who denied that the College Board had cut any corners. Here is what Buckley said in an email to The Post:

In his letter of 11 May, ACT CEO Marten Roorda calls into question the integrity of the new SAT concordance tables. It’s vital to students and educators that we correct this misinformation.
Mr. Roorda falsely claims that the College Board derived our new concordance tables based on the March 2016 administrations of the new SAT. In fact, we conducted two large-scale national concordance studies in December 2014 and December 2015. This approach exceeds industry standards. We stand behind both the methodology and the results of these studies.
We conducted these concordance studies in advance of the first administrations of the new SAT because we clearly heard from our members at colleges and universities across the country that they would need this information to make admissions decisions this year.
Furthermore, several months ago we reached out to ACT to express our strong interest in conducting a new SAT-ACT concordance study to update the existing decade-old study. We look forward to working with ACT in a renewed spirit of cooperation. We owe it to the students and education professionals that both of our organizations serve.
Jack Buckley

Some context about this exchange: The ACT, launched in 1959, for decades had a smaller market share than the SAT, which debuted in 1926. The ACT has long billed itself as an achievement test with four sections: English, reading, math and science, plus an optional essay.

In 2012, the ACT overtook the SAT to become the nation’s most widely used college admission test. The SAT long ago stopped describing itself as an aptitude test; in its latest version, it is presented squarely as an achievement test in math and “evidence-based reading and writing,” also with an optional essay. The College Board also dropped some other features of the SAT from years past, including the notorious guessing penalty: scores are now based only on a student’s correct answers, with no deduction for wrong ones, a straightforward approach the ACT has long used.
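
To see what the end of the guessing penalty means in practice, consider a small, purely illustrative calculation in Python. On the old SAT, a wrong answer to a five-choice question cost a quarter of a point; on the new SAT and the ACT, only correct answers count. The question counts below are invented, and the conversion from raw to scaled scores is not shown.

# Purely illustrative raw-score arithmetic; the numbers are invented.
correct, wrong, blank = 40, 10, 2

rights_only = correct                    # new SAT and ACT: count correct answers only -> 40
with_penalty = correct - 0.25 * wrong    # old SAT: quarter-point deduction per wrong answer -> 37.5

print(rights_only, with_penalty)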

Both the College Board and the ACT are competing intensely for state contracts to deliver testing services in public schools. Their rivalry no doubt explains part of this week’s discord over concordance. For students who aren’t tracking this inside game, the bottom-line question remains the same: What is my test score worth?
