In an early glimpse of how much tougher state tests could be in the Common Core era, a new federal report released in July shows that early adopters of the controversial standards are assessing their students using far higher bars of difficulty.
While this new report is unlikely to settle the battle between Common Core advocates and foes, it does indicate that one of the original purposes of the standards—challenging students in math and reading more so they’ll be better prepared for the rigors of college and their careers—seems to be proving fruitful. But tougher tests aren’t contingent on adopting the Common Core: Texas, one of the few states that has eschewed the standards, is also among the few states using tests that are much more challenging.
The Common Core adopters Kentucky, New York, and North Carolina joined Texas in offering tough math and English tests to their fourth- and eighth-graders.
Though the report, released by the National Center for Education Statistics, contains many moving parts, its basic premise is that because states set their own rules for testing difficulty, those rules—or “cut scores”—should be compared against a common yardstick. The U.S. has such a benchmark: the National Assessment of Educational Progress, which is considered the gold standard in measuring how much students know and is viewed by many, if not most, education experts as a more exhaustive assessment than anything the states offer.
Similar to state assessments, NAEP assigns various levels of achievement, including below basic, basic, proficient, and advanced. The federal report released this month aligns state benchmarks for proficiency with a common national scale using those NAEP scores.
While states have made some progress over time in raising the rigor of their tests, they have a long way to go. The report shows that what’s considered proficient in 25 states translates to “below basic” on the NAEP for fourth-grade reading. Only two states’ “proficient” levels—New York’s and Wisconsin’s—matched NAEP’s definition of “proficient” in fourth-grade reading. In eighth-grade reading, only New York could say the same.
During a press call with reporters, the acting commissioner of the education-statistics center, Peggy Carr, explained that the national board that sets the “proficient” level on the NAEP considers that score an indication that students are college-ready. By that measure, only a handful of states call on their students to reach levels of academic prowess that line up with NAEP’s definition for “college readiness.”
Five states had fourth-grade math proficient levels that matched NAEP’s, while four states’ proficient levels fell below NAEP’s basic cut score. The charts below summarize the rest of the findings.
States’ Fourth-Grade Reading “Proficiency” Levels
States’ Eighth-Grade Reading “Proficiency” Levels
States’ Fourth-Grade Math “Proficiency” Levels
States’ Eighth-Grade Math “Proficiency” Levels
U.S. Secretary of Education Arne Duncan said in a press release that “coupled with the fact that more than 40 states are moving forward with new, higher academic standards that the states themselves developed, this is encouraging news for parents and students.”
To Gary Phillips, a vice president at the American Institutes for Research, the varying definitions of “proficient” across states signal that many are “living in a Lake Wobegon fantasy where they say the students are above average when they’re not.”
Phillips says that the range in proficiency levels is the equivalent of “about three to four grade levels in student performance. The rigor of the grade-four standards in the highest achieving states may be comparable to the rigor of the eighth-grade standards in the lowest achieving states.”
Because states largely set their own definitions for proficiency, two states with the same percentage of students showing strong results can mask sizable differences in achievement. While 71 percent of students in both Arizona and Kentucky earned state assessment scores that fell within their states’ definition of proficient, Kentucky students in that category would have an average NAEP score of 252, while in Arizona those students would have a comparable score of 243—a statistically significant difference.
Since 2013, roughly half of all states have agreed to use Common Core-aligned assessments created by two consortia: Smarter Balanced and the Partnership for Assessment of Readiness for College and Careers. For those states and others that align their tests with the Common Core, tougher cut scores are likely on the way.
“I will predict that the consortia states will find that their standards are more rigorous than what they may have had previous to the consortia assessments,” said Louis Fabrizio, who heads data, research, and federal policy at the North Carolina Department of Public Instruction.
He notes that though North Carolina didn’t roll out tests designed by the consortia, the state for the first time administered assessments based on the Common Core state standards in the 2012-13 school year—just in time to be noticed by this month’s federal report. As one of the few states at or near the NAEP proficiency standard, “the results speak for themselves,” Fabrizio said. The last time NCES issued such a report, evaluating 2011 state tests, it found that North Carolina had some of the weakest assessment standards in the country.
While Kentucky and New York also adopted Common Core-aligned assessments between 2011 and 2013, paving the way for their high marks in the new report, Texas had no part in the Common Core. Still, the state transitioned to a new set of tests, called STAAR, touted as much more difficult than its predecessor and aligned with the state’s own independent academic standards. A 2014 Dallas Morning News article noted that, on average, students hadn’t improved on the assessment since its debut in 2012. The NCES report helps explain why: Between 2011 and 2013, Texas’s proficient benchmark soared from near the bottom to among the top few in the country.
“We would encourage states to adopt Common Core,” said Scott Norton, who heads assessments and accountability for the Council of Chief State School Officers. “But if they don’t want to do that and adopt some other set of college and career ready standards, that’s good, too. Texas is a great example of that. This report bears that out.”
Wisconsin, another state that made major strides in the rigor of its exams between 2011 and 2013, toughened its cut scores in time for the 2012-13 state tests. John Johnson, the head of communications at the Wisconsin Department of Public Instruction, said that state education leaders sent letters to parents explaining the changes and warning them that student scores might drop because of the increased difficulty.
Still, no matter how much more work is poured into strengthening the tests, Norton cautions that tougher assessment benchmarks won’t on their own lift student scores. “The first part is adopting more challenging career-ready standards,” he said. “Then it’s important to put in a test that’s aligned to those standards with rigorous achievement [levels] … when that happens, students can begin to perform better, and that’s probably what we’ll see over time.”
This post appears courtesy of the Education Writers Association.