Students were asked to respond to prompts that required one of three modes of writing:
- Persuade: Write a letter to your principal, giving reasons and examples why a particular school mascot should be chosen.
- Explain: Write in a way that will help the reader understand what lunchtime is like during the school day.
- Convey: While you were asleep, you were somehow transported to a sidewalk underneath the Eiffel Tower. Write what happens when you wake up there.
Because the pilot did not use a representative sample, NCES has cautioned against drawing conclusions about the online assessment or the students’ performance.
With that caveat, here are a few of the findings from the pilot study:
- On a scale of 1 to 6, just over 60 percent of students scored a 3 or higher on the assessment. NAEP officials say that means “the majority of students wrote enough to be assessed, included ideas that were mostly on topic and used simple organizational strategies in most of their writing.”
- Students scored higher when they had more time—30 minutes vs. 20 minutes—to construct their responses.
- Nearly all students—92 percent—said they had previously taken some form of an online assessment.
While this may only have been a pilot study, it’s clear that many of the students who took part struggled to respond to these prompts. When students were given 30 minutes to work on the assignment, 26 percent were rated a 2 (marginal) while 14 percent were at 1 (little or no skill). At the other end of the spectrum, just 10 percent of students were rated a 5 for “competent,” while 4 percent earned the highest possible score of 6 for “effective.”
The pilot assessment was scored by human beings, not computers, said Cornelia Orr, executive director of the National Assessment Governing Board, which oversees NAEP. The switch to an online assessment gives NAGB the opportunity to develop a more engaging testing platform that holds students’ attention, Orr said during a webinar to discuss the findings.
“When students are interested in what they’re writing about, they’re better able to sustain their level of effort, and they perform better,” Orr said.
One downside to the NCES pilot study: It doesn’t compare student answers with similar questions answered in a traditional written exam setting. From Education Week’s overview:
Scott Marion, the associate director of the Dover, N.H.-based National Center for the Improvement of Educational Assessment, said he’d like to see a study in which students are each given two writing prompts, responding to one on paper and one on computer. The one written on paper would then be scribed onto the computer for true comparability (since, as he also noted, typed papers in general tend to receive higher scores than handwritten ones).
There is another concern: how well classroom instruction will line up with the new expectations of NAEP’s online writing assessment. Certainly a central goal of the Common Core is deeper knowledge, where students are able to draw conclusions and craft analysis rather than simply memorize rote facts.