As Arizona fire rages, scientists warn of more unpredictable blazes

Dean Smith watches as the Yarnell Hill Fire encroaches on his home in Glen Ilah on 30 June 2013 near Yarnell, Arizona. Photo: David Kadlubowski / AP

By Julie Cart
3 July 2013

BOISE, Idaho (Los Angeles Times) – Early morning is a frenetic time at a wildfire command post. Biologists, meteorologists, foresters and firefighters hustle into tents and grab laptops to review overnight reports, prepping for the day’s assault.

Fire behavior analysts run computer models that spit out information crucial to putting out the blaze: how many acres a fire will probably burn, in which direction and with what intensity.

In recent years, the models have been rendered practically obsolete, unable to project how erratic Western fires have become, making tactical decisions more difficult for fire bosses and fire lines less safe for crews in the field.

The analytical work performed by fire scientists here at the National Interagency Fire Center also confirms what seems anecdotally evident: Wildfires are getting bigger — the average fire is now five times as large as it was in the 1980s — and these enormous conflagrations have a breathtaking facility to dance and grow.

Unforeseen winds are swerving and turning on fire crews, and it’s no longer unusual for fires to double in size in a day. The unpredictable has become the expected.

Last month at Rocky Mountain National Park, the Big Meadows fire amazed veteran firefighters by burning across snowfields. The fire wasn’t carried by embers, but marched inexplicably over deep snow.

“No one had seen that before,” said Dick Bahr, the National Park Service’s fire science lead, still a bit in awe of the power of such a fire. “It’s a bit crazy. The models are only as good as the data behind them, and the data is changing faster than we can update it.”

Unexpected fire behavior — perhaps a sudden wind shift — may have been at work in the Yarnell Hill fire, the out-of-control blaze that killed 19 elite wildland firefighters in Arizona on Sunday. Exactly what caused the men to become trapped and overrun by flames is unknown, but the case suggests that fire managers are struggling to find secure positions on fire lines.

U.S. Forest Service Chief Tom Tidwell laid it out in stark terms last month while testifying before Congress: “The last two decades have seen fires that are extraordinary in their size, intensity, and impacts.”

Consider the example of three New Mexico fires: In 2000, the Cerro Grande fire near Los Alamos became the most destructive fire in state history. It burned 40,000 acres in a week. In 2011, the Las Conchas fire burned through 61,000 acres of forest in one day and eventually charred 150,000 acres. It was the state’s most destructive fire until last year, when the Whitewater-Baldy Complex fire burned nearly 300,000 acres.

“These huge fires are the new normal,” said John Glenn, chief of fire operations for the federal Bureau of Land Management. “Look at any touchstone — global warming, fuels, invasive species, forest and rangeland health issues — and then you throw in the urban interface. It’s almost like this perfect mix. What used to be the anomaly is almost like the normal now.”

A multitude of factors contribute to making wildland fires more complicated to fight and more difficult to understand: drought, insect-ravaged trees, fewer resources set aside to thin overgrown forests. But the physics of fire remain immutable: a voracious force constantly seeking fuel to consume. [more]
