Technology | September 2012

Burning Question

Why are wildfires defying long-standing computer models?

Clint Dawson’s bloodshot eyes evince his 14th straight day at the High Park fire’s Incident Command Post, in Fort Collins, Colorado. It is late June, and the fire has already charred 70,000 acres. Dawson’s job is to guess what it will do next. As a fire-behavior analyst, or FBAN, he runs modeling software that predicts where a fire might be headed. When fires behave themselves, such models work well. But wildfires are getting bigger: their average size has tripled since the 1980s. And bigger fires are more complex than smaller ones, presenting more challenges for forecasting software. “We are definitely tweaking our models more on this fire than usual,” Dawson tells me.

Since the 1970s, modeling programs such as Farsite, FlamMap, and FSPro have become an essential part of fighting wildfires. The models, which are calibrated against how past fires have typically progressed, consider vegetation type; topography (flames prefer to travel uphill); a fire’s perimeter; and air temperature, wind, and humidity. They then predict where a fire will go, and when.
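The interplay of those inputs can be sketched with a toy grid model. To be clear, this is purely illustrative: the fuel factors, direction bonuses, and ignition threshold below are invented for readability, and bear no relation to the far more sophisticated physics that Farsite and FlamMap actually implement. It simply shows how vegetation, wind, and slope might combine to decide which cells a fire reaches next.

```python
# Invented illustrative values -- not real fire-science parameters.
FUEL_FACTOR = {"grass": 0.9, "timber": 0.4, "bare": 0.0}
IGNITE_THRESHOLD = 0.5

def spread_step(burning, fuels, wind=(0, 1), uphill=(-1, 0)):
    """Advance the fire one time step and return a new burning grid.

    burning[r][c] -- True if the cell is on fire
    fuels[r][c]   -- vegetation type, looked up in FUEL_FACTOR
    wind, uphill  -- (dr, dc) directions that get a spread bonus
    """
    rows, cols = len(burning), len(burning[0])
    new = [row[:] for row in burning]
    for r in range(rows):
        for c in range(cols):
            if not burning[r][c]:
                continue
            # Check the four neighbors of each burning cell.
            for d in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                nr, nc = r + d[0], c + d[1]
                if not (0 <= nr < rows and 0 <= nc < cols):
                    continue
                p = FUEL_FACTOR[fuels[nr][nc]]
                if d == wind:
                    p *= 1.5   # flames race downwind
                if d == uphill:
                    p *= 1.3   # flames prefer to travel uphill
                if p >= IGNITE_THRESHOLD:
                    new[nr][nc] = True
    return new

# Demo: a single burning cell in uniform timber.
fuels = [["timber"] * 3 for _ in range(3)]
burning = [[False] * 3 for _ in range(3)]
burning[1][1] = True
nxt = spread_step(burning, fuels)
```

In the demo, timber alone (0.4) stays below the threshold, but the downwind cell (0.4 × 1.5 = 0.6) and the uphill cell (0.4 × 1.3 = 0.52) both ignite, while the flanking cells do not; bare ground never burns. Real models replace this crude threshold with calibrated spread-rate equations, which is exactly why stale calibration data becomes a problem.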

The problem is that nowadays, wildfires are increasingly atypical. For one thing, wildlands aren’t what they once were, thanks in part to climate change and encroaching development. For example, Dawson’s model doesn’t factor in mountain pine beetles. Milder temperatures have led to a beetle onslaught throughout the West, leaving trees desiccated and highly flammable. More generally, extreme weather—for example, droughts that leave forests dry as tinder—means more-extreme fires. What’s more, the historical data informing the models are often many years old. The data omit recent landscape changes that radically alter a fire’s dynamics. Nor are the models sophisticated enough to factor in the presence of today’s slow-burning multi-thousand-square-foot exurban homes, which can smolder like charcoal briquettes and ignite neighboring structures.

Suppression of smaller wildfires over the past century has changed forests in fundamental ways. “We’ve been so successful at excluding fire that forests are nearly continuous,” says Mark Finney, a researcher at the Missoula Fire Sciences Laboratory, who develops modeling software. Previously, when most fires were allowed to burn, wildlands were a patchwork of burned and non-burned areas. Without these natural breaks, fires can now grow much larger than they used to.

All of these factors combine to spawn what FBANs call “extreme fire behavior.” Behind Dawson, I see orange smoke mushrooming from a ridgeline high into the sky. Dawson tells me that this indicates a “megafire”—a capricious, untamable beast that frustrates FBANs. (Colorado saw two megafires in the first half of 2012 alone.) A megafire can create severe weather of its own, befuddling models. Gusty outbursts blow counter to prevailing winds, goading flames downhill when the models predict an upslope burn. Blistering heat flash-dries foliage. “We’ve got mixed conifer up there,” Dawson tells me, “but in places it’s burning fast, like chaparral.” Timber stands that models say will burn slowly erupt as if doused with kerosene. Rob Seli, also based at the Missoula lab, explains that many megafires are plume-dominated. And a plume-dominated fire, he says, is “like an atom bomb going off. It can expand rapidly, in any direction. It’s the same thing that happens in a thunderstorm. And models can’t account for this behavior.”

Some years from now, improved computing power will surely catch up to today’s fires, yielding models that crunch more variables with more elaborate characteristics—the flammability of different building materials, say, or the complex atmospheric physics involved in plume-dominated fires. (Even the fastest supercomputers we have now would take days to do this.)

In the meantime, some experts worry that younger fire analysts lean too heavily on their data-crunching skills and have too little field experience. Dawson is thankful to have spent his early career fighting fires with an ax and a shovel. While working the High Park fire, he trekked into the field every morning to supplement his digital prognosis with some analog intuition. Tim Sexton, a strategic planner from the National Interagency Fire Center who worked alongside Dawson, also made a point of visiting the blaze. “The model gives you a place to start. But then go out and look at the fire,” Sexton says. “Because in nature, nothing is ever exact.”

Michael Behar is a writer in Boulder, Colorado.