American eighth-graders continue to demonstrate lackluster knowledge and skills when asked basic questions about U.S. history, geography, and civics. New data shows that only between 18 and 27 percent of students scored at or above "proficient."
The results from the National Assessment of Educational Progress (NAEP) are based on a representative sample of more than 29,000 U.S. eighth-graders tested last year across the three subjects. (In history, for instance, it tested more than 11,200 students.)
Since 1998, scores have been inching upward in several topic areas, particularly for minority and low-income students. But overall there have been no meaningful gains in student performance since the combined history, geography, and civics test was last given in 2010.
While scores overall have stagnated, student groups with historically low performance continue to inch up, and at a faster rate than their white peers, with particularly rapid improvement among Hispanic students.
The NAEP sample tests give a good idea of what students are being asked to do—it’s a lot more than just multiple-choice questions. They must also interpret graphs and data, and demonstrate the kind of critical-thinking skills that the Common Core State Standards are intended to promote.
The NAEP is just one indicator of student knowledge and skills, and it’s not designed to evaluate the merits of a particular educational program or intervention. Stephen Sawchuk of Education Week uses the term “MisNAEPery” to refer to the many examples of education advocates and policymakers using data from the assessments to grind their own particular axes.
(Jessica Brown’s thoughtful summary of the full report’s findings, and this fact sheet, give a good overview of what NAEP does—and doesn’t—measure.)
That said, some experts note that one powerful use of NAEP is to offer an independent barometer to compare with results on state tests, especially when it comes to reading and mathematics. In other words, if state test results show 90 percent of students are proficient in a given subject but NAEP says it’s 25 percent, that may be a red flag that the state’s bar for “proficient” is not set very high. (The NAEP exam for history, civics, and geography does not report out state-by-state results, however.)
In any case, there are some interesting long-term trends across multiple NAEP subject areas. Peggy Carr, the acting commissioner of the National Center for Education Statistics (which oversees NAEP) noted during Tuesday’s press call that U.S. students have continually made gains on the reading assessment, particularly among those in the lowest-performing quartile. She theorized that those gains in literacy might have helped them when it came time to take the latest history and civics test.
If students are more confident readers, the history and civics questions might be less daunting “and they might be able to access these materials better than they were in past years,” Carr said. (The education historian Diane Ravitch made a similar argument when the high-school NAEP history results—also lackluster—were released in 2011, saying that gains by lower achievers potentially reflected improved literacy rather than a deeper grasp of the content.)
And simply adding more class hours would not solve the problem. As Carr told me: “There’s no association between how much time is spent on a subject in a given year and how well students perform.”
Given that NAEP scores on history, civics, and geography have been sluggish since 1998, today’s report is no surprise. A 2011 story by NPR contended that American students have always been weak in the subject, providing a laundry list of banner headlines lamenting poor history scores all the way back to 1955. And despite that, the United States continues to lead the world in key areas, Ravitch said in the NPR interview.
“We have to temper our alarm,” Ravitch told All Things Considered host Laura Sullivan in the 2011 interview, “and realize we’re not a very historically minded country.”
To be sure, ensuring an engaged and informed citizenry was one of the original arguments made by the Founding Fathers who advocated for free public schools. There’s been a groundswell in recent years to make civics education a national priority, with several states increasing the curriculum requirements for K-12 students (The Wall Street Journal put together a handy chart). Robert Pondiscio, now with the conservative-leaning Thomas B. Fordham Institute, argued in 2013 that passing the U.S. citizenship test should be a requirement for high-school graduation. In January, Arizona became the first state to take that step.
It’s worth remembering that the NAEP is what’s known as a low-stakes assessment—no teachers’ or principals’ jobs are on the line if kids don’t do well, and the students themselves often realize the outcomes have no bearing on their own academic trajectories.
Officials at the National Assessment Governing Board, which oversees NAEP, have said they’re taking a look at the impact of student motivation—or lack thereof—on student test scores. One study, by a Boston College researcher, found that offering high-school students relatively low-level incentives to “try” harder on a reading test similar to NAEP produced meaningful gains. But raising the stakes for NAEP could hurt its usefulness as a long-term barometer of student knowledge.
As Jack Buckley, the former NCES commissioner (and now with the College Board) told me in 2014: “How much pressure is going to be put on the data, and for what purpose? You need just enough pressure not to distort the outcome—that’s the ideal measure.”
This post appears courtesy of The Educated Reporter.