The West will have many more summers like this past one. Extreme heat waves, wind events, and droughts will make severe, destructive fires an inevitability. The air will be choked with smoke from July to October, and tens of thousands of people will likely be displaced by wildfires in the next decade. For all of humanity’s attempts at setting boundaries between our spaces and wild ones, every summer proves that the two are irreversibly interwoven.
The calls for better management of the lands where fire and human settlement meet are intensifying. Some western landscapes haven’t seen fire at regular, natural intervals since the 19th century, leaving them congested with overgrown brush and bug-ridden trees. Politicians are always happy to talk about “better forest management”—most recently, Mike Pence briefly mentioned it during the vice-presidential debate, as a solution to severe wildfires. Meanwhile, fire managers and ecologists continue to beg for resources to complete more forest-management projects. And politicians, national media, and fire managers alike are repeating one specific (and familiar) sentiment: The West Coast needs more prescribed burning, and it needs it right now.
Prescribed burns are low-intensity, controlled fires typically set in the fall and spring, on the cusp of the rainy season, when temperatures are low and humidity is high. Under these conditions, fire can be easily contained by roads or by fire lines built before the burning begins. This type of fire creeps along the ground, burning brush, grasses, leaves, and needles; it rarely reaches the treetops and, done right, is profoundly beneficial and seldom dangerous.
But even if land agencies and communities started widespread prescribed burning this winter, western landscapes would still need decades to recover from a century of full-suppression firefighting. No one has a good estimate of exactly how much land would need to burn to reduce the impact of modern wildfires, but the limits of modern prescribed burn practices mean that the West might need to seek another solution.
Most years, prescribed burns can touch roughly 3 million acres of land in the West, and in areas with robust prescribed-fire programs, they can create crucial buffers to protect communities. These burns happen, for the most part, on lands managed by the U.S. Forest Service and other federal agencies. However, practitioners on state, municipal, and other nonfederal lands have increased their prescribed-fire use by 13.4 percent since 1998—while USFS and Bureau of Land Management prescribed-fire initiatives have largely stagnated in the same time frame. Put simply: Federal agencies, which own nearly 50 percent of land in the West, haven’t done the work to keep that land fire-resilient.
Wildfires—whether started by humans or lightning—burn almost 7 million acres on average every year, and closer to 10 million during particularly bad fire seasons. To keep fires from growing at the extreme speeds we’ve seen this fire season, it’s not unreasonable to estimate that land managers at the federal, state, and local levels would need to burn dramatically more acreage every year, particularly in strategic areas that would help protect some of the most vulnerable, fire-prone communities. It’s hard to say exactly how many acres will make a difference, but the general consensus is simple: many more than are currently being burned in the West.
Two of the basic problems with the current regime of prescribed fire are resources and staffing. Federal lands often require qualified federal firefighters to perform a majority of the burning. These firefighters must understand the intricacies of prescribed burns—they must know the right weather conditions for a successful burn, understand the terrain, and anticipate how fire might interact with that landscape.
However, the greater part of federal funding and resources is allocated to suppression in the late summer and early fall, when fires are burning under more extreme conditions and communities are most at risk. Many federal firefighters are seasonal employees, hired primarily to battle fires through the heat of the summer. They work from April to October or November and are then laid off until the following spring, missing almost the entire window when the weather is mild and prescribed burning is safest and most effective. Permanent federal employees are capable of this work, but they are few in number and often busy with administrative tasks.
The discrepancy between federal and nonfederal prescribed-fire-use trends is a direct result of a severe lack of funding—fire-prevention budgets (which include prescribed fire, thinning, and other mitigation measures) in the USFS and BLM made up no more than 25 percent of the funds provided by the federal government for suppression efforts from 2014 to 2019. Federal land agencies are growing more and more suppression-focused, while prescribed-fire initiatives continue to take the back burner on federal lands.
The Bureau of Indian Affairs is the only federal agency that has markedly increased its prescribed-fire acreage, up by 3.7 percent since 1998. Tribes have been taking greater authority over their land management and reintegrating the cultural burning practices that they historically performed. Combined with natural fires, these cultural burns did the same work as modern prescribed burns, but were snuffed out by European settlers. By the early 1900s, federal land managers had instituted full-suppression firefighting, in which any wildfire was suppressed as quickly as possible; this practice prevented fires from clearing underbrush and doing the crucial ecological work they’ve always done in western forests, resulting in landscapes that have become more overgrown, unhealthy, and flammable by the year.
Fire managers might have been able to reverse this trend back in the 1980s and ’90s—if they’d been given appropriate funding, policy changes, and resources for aggressive prescribed-burning programs. But an extra three decades of drought, insect-killed trees, and vegetation growth have produced much larger and more unruly wildfires. Humans may no longer have control over how fire interacts with the American landscape. As the fire historian Stephen Pyne told me in an interview in July: “Fire by prescription assumes that you can identify the place, you can arrange it, you can manage it, and do it in advance under your terms. I don’t think we’re setting the terms anymore.”
Some fire managers have taken a more radical approach to forest management: using wildfires to help meet fire-management objectives. This strategy allows wildfires to do the large-scale forest-management work that they are wont to do, and that they’ve done for millennia.
According to data from the USFS, just 2 percent of wildfires that the service tries to suppress in their early stages become the huge conflagrations that burn uncontrollably, destroy communities and infrastructure, and end up on the news. The rest are suppressed in the so-called initial-attack stage or eventually burn themselves out and are, by and large, uneventful. They burn in non-extreme weather conditions and usually remain at the low to moderate severity that is the historical norm for many landscapes that rely on fire. Instead of spending months preparing a prescribed burn that may treat a couple thousand acres, the fire community is now accepting that wildfires themselves can be used to accomplish much more of the forest management needed to prevent large, destructive fires in the heat of fire season. The idea is simple: If the location is right, if extreme weather is not predicted, and if the appropriate resources are on hand to help should anything go awry, allowing wildfires to burn naturally—with viable fire lines established ahead of time—could be the best way to accomplish widespread forest management.
This practice is already happening in many places under the name “wildfire management,” as opposed to wildfire suppression. These fires are allowed to burn under non-extreme conditions and in places far enough from communities and infrastructure that they pose very little risk, with firefighters on standby should they need to be managed. But this management structure is not feasible in more populated areas, because many western communities are not yet prepared to keep these types of wildfires from threatening their neighborhoods or homes.
The next step, then, is to build more fire-resilient communities through education, better protection of homes and properties against wildfires, and greater investment in fire-recovery measures. The solution to our wildfire problem will require a coordinated blend of prescribed-fire buffers in strategic areas, greater community outreach, and more low-intensity wildfires doing the work they’ve always done.
Changing the prescribed-fire paradigm will be crucial to meeting these objectives. Western states and communities will need to take a more localized approach to land management, which will mean decentralizing prescribed-fire and forest-management authority and returning some of it to local fire practitioners, tribal members, and community members.
This work is already being done through collaborative organizations such as the Fire Adapted Communities Learning Network, programs such as TREX (Prescribed Fire Training Exchanges), and prescribed-burn associations, or PBAs, which are popping up in communities across the West. PBAs allow interested community members to learn about, prepare for, and carry out prescribed burns and other mitigation measures on their own land—much as Native Americans were doing for thousands of years before European colonization, and as they’ve begun to do again.
The problem of modern wildfires has no single cause and no single solution, but praying for rain in August isn’t enough. Playing offense every fire season isn’t enough. The landscapes that westerners love and live in evolved alongside fire, and to thrive in them, we must evolve with it too. Living in fire-prone landscapes requires accountability to that land—and if we can’t fulfill those responsibilities, then we must reconsider whether we belong in those places.