The End of Paper-and-Pencil Exams?

In a recent pilot program, kids as young as nine were asked to respond to online prompts and type out essays on a computer. 

Are fourth graders computer-savvy enough to have their writing skills measured in an online assessment? A new federal study suggests that they are, although it’s not clear whether old-fashioned paper-and-pencil exams might still yield useful results.

The National Assessment of Educational Progress, sometimes referred to as “The Nation’s Report Card,” is administered every two years to a representative sampling of students in grades 4, 8, and 12. Because each state uses its own mix of assessments, NAEP (along with the SAT and ACT college entrance exams) is one of the few ways of making national comparisons of student performance. NAEP is expected to become a fully online assessment by 2017.

In preparation for that shift, the National Center for Education Statistics tested an early version of an online writing assessment on 60 fourth graders to gauge their comfort level with the platform and design. After identifying common issues that students were struggling with, NCES redesigned the pilot online writing assessment. Among the changes: swapping out a drop-down menu for icons, and giving students more frequent—and shorter—prompts. That assessment was then administered to 13,000 fourth graders.

Students were asked to respond to prompts that required one of three modes of writing:

  • Persuade: Write a letter to your principal, giving reasons and examples why a particular school mascot should be chosen.
  • Explain: Write in a way that will help the reader understand what lunchtime is like during the school day.
  • Convey: While you were asleep, you were somehow transported to a sidewalk underneath the Eiffel Tower. Write what happens when you wake up there. 

Because the sample was not representative, NCES has cautioned against drawing conclusions about the online assessment or the students’ performance.

With that caveat, here are a few of the findings from the pilot study:

  • On a scale of 1 to 6, just over 60 percent of students scored a 3 or higher on the assessment. NAEP officials say that means “the majority of students wrote enough to be assessed, included ideas that were mostly on topic and used simple organizational strategies in most of their writing.”
  • Students scored higher when they had more time (30 minutes vs. 20 minutes) to construct their responses.
  • Nearly all students (92 percent) said they had previously taken some form of an online assessment.

While this may only have been a pilot study, it’s clear that many of the students who took part struggled to respond to these prompts. When students were given 30 minutes to work on the assignment, 26 percent were rated a 2 (marginal) while 14 percent were at 1 (little or no skill). At the other end of the spectrum, just 10 percent of students were rated a 5 for “competent,” while 4 percent earned the highest possible score of 6 for “effective.” 

The pilot assessment was scored by human beings, not computers, said Cornelia Orr, executive director of the National Assessment Governing Board, which oversees NAEP. The switch to an online assessment gives NAGB the opportunity to develop a more engaging testing platform that holds students’ attention, Orr said during a webinar to discuss the findings.

“When students are interested in what they’re writing about, they’re better able to sustain their level of effort, and they perform better,” Orr said.

One downside to the NCES pilot study: It doesn’t compare student answers with similar questions answered in a traditional written exam setting. From Education Week’s overview:

Scott Marion, the associate director of the Dover, N.H.-based National Center for the Improvement of Educational Assessment, said he’d like to see a study in which students are each given two writing prompts, responding to one on paper and one on computer. The one written on paper would then be scribed onto the computer for true comparability, since, as he also noted, typed papers in general tend to receive higher scores than handwritten ones.

There is another concern: how well classroom instruction will line up with the new expectations of NAEP’s online writing assessment. Certainly a central goal of the Common Core is deeper knowledge, where students are able to draw conclusions and craft analysis rather than simply memorize rote facts.

But teaching students to write is a complex animal all its own, said Peg Tyre, author of “The Writing Revolution,” a 2012 story for The Atlantic that detailed the debate in education circles over methodology and expectations. The foundation of strong writing is knowing how to “work with language and build strong sentences into paragraphs,” said Tyre, who is the director of strategy at the Edwin Gould Foundation. For those skills to become rote, writing has to be a regular part of the student’s experience. Tyre said she’s not seeing that happen in most of the public school classrooms she visits.

“Teachers aren’t taught how to teach writing,” Tyre told EWA. “While it’s not rocket science, it does require a dedicated approach and a dedicated amount of time.”

To be sure, NAEP isn’t alone in abandoning paper exams. Two testing consortia that have designed assessments aligned to the new Common Core State Standards – Smarter Balanced and the Partnership for Assessment of Readiness for College and Careers – use online assessments, which are being rolled out through 2015. (In some places, it’s been a bumpy start.) The GED high school equivalency exam has also gone digital.

Jacqueline King, director of higher education policy for Smarter Balanced, said the fourth-grade assessment includes components to measure students’ writing abilities. It will be left up to individual states to decide how to score the exams, within some parameters established by the consortium to ensure “accuracy and comparability of student scores,” King said.

The field test of the assessment was not machine-scored, King told EWA. However, the consortium is investigating whether some elements of the student responses, such as spelling and syntax, could be graded using a software program. The longer answers – where students are asked to write informative, narrative, or opinion pieces – take 90 minutes to complete, which includes time for students to read materials and craft their responses, King said. (That’s three times as long as students were given for the NAEP pilot study assessment.)

While there are digital programs designed to measure student writing, “We have determined that those programs are not yet able to adequately score all the elements of student writing that we are asking our human scorers to evaluate,” King said. “We will continue to watch that technology as it evolves.”


This post also appears at The Educated Reporter, an Atlantic partner site.

Emily Richmond is the public editor for the National Education Writers Association. She was previously the education reporter for the Las Vegas Sun.
