Flames burn near power lines in Sycamore Canyon in Montecito in 2017. Utilities are turning to computer modeling to predict where their equipment poses the greatest fire threat. (Mike Eliason / Associated Press)

When freak lightning storms passed over Northern California’s wine country last month and sparked hundreds of wildfires, a newly established network of remote weather stations, orbiting satellites and supercomputers spun into action and attempted to predict the spread of what is now known as the LNU Lightning Complex fire.

Firefighters and technologists have long dreamed of a formula or device that would accurately predict the spread of fire, much the way meteorologists predict the possible impact of extreme weather, but it’s only recently that big data and supercomputers have begun to show promise as a means of fire forecasting.

“I think a firefighter starting out today in his or her career, they’re going to see something to the point where they leave the [station] on the fire, they’ll have a simulation on their screen of where the fire is going to go, where they need to do evacuations,” said Tim Chavez, a fire behavior analyst with Cal Fire since 2000.

Past forecasts relied on broad assumptions about the landscape and coming weather, but today’s forecasts draw on a web of remote weather stations, cameras and satellites merged with ground-level detail on vegetation and moisture. Now California firefighters and the state’s largest power utilities are hoping these networks will help them better plan evacuations and more precisely target power shutoffs in emergencies.

The technology Cal Fire uses, created by La Jolla-based Technosylva, was adopted in July under a three-year, $8.8-million contract and has yet to be fully rolled out across the agency, Cal Fire spokeswoman Christine McMorrow said. But the program has already been used by a handful of Cal Fire analysts who ran simulations of where the flames were expected to be eight hours out.

“We did one for the LNU Complex and it did show a rapid rate of spread,” McMorrow said, referring to what is now, at well over 360,000 acres burned, the fourth-largest fire in state record books. “They are pleased with what they’re getting from it.”

The state’s big three electric utilities are also using the technology.

In August, Southern California Edison said it ran simulations of potential fires before shutting off power to circuits in Los Angeles and Kern counties. A few weeks later, Pacific Gas & Electric ran simulations of where the LNU Complex fire was headed before it decided to spray some 7,000 power poles with fire retardant.

Edison, PG&E and San Diego Gas & Electric said that when wind events are in the forecast, their preemptive power shutoffs should affect about 30% fewer people than last year’s did, in part because of a better grasp of where the fire threats are greatest.

Facing serious liability under California’s inverse-condemnation laws, utilities shut off sections of their grid on hot, windy days, when their equipment is most at risk of sparking a wildfire. Last year, such shutoffs left millions of Californians without electricity for days.

“If the fuels data is good, if the weather data is good and the location is correct, our models provide a good ballpark,” said Technosylva President Joaquin Ramirez. “It’s a young science, but we’re on the right track.”

Difficulties remain in accurately predicting extreme fire behavior, however.

When the federally managed North Complex fire jumped a river and sped into Berry Creek on Sept. 8, killing more than a dozen people, “the spot fire moved 20 miles beyond all models identified,” the fire’s incident commander, Jay Kurth, wrote in a public letter.

Similarly, when SDG&E tried to re-create large fires it experienced in 2003 and 2007, Technosylva’s simulations came out less extreme than what actually happened. And while the Technosylva software uses more refined data than its competitors’, experts say the fundamental science behind predicting what a fire will do has changed little in half a century.

“There’s really only one model that’s used for fire spread models — it’s the Rothermel model,” said Chris Lautenberger, co-founder of fire spread modeling company Reax Engineering, which also holds a contract with PG&E. “Technosylva uses that, our model uses that. So what differs from model to model is more the assumptions and approximations that are made.”

The Rothermel model is a mathematical equation published in 1972 by a former General Electric engineer to predict the rate of a fire’s spread. It models surface fires in light brush and grass, and it has become the foundation on which most predictive fire models, from crown fire to fire spotting, were built.
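At its core, the equation balances a heat source against a heat sink: spread speeds up with the fire’s reaction intensity and with wind and slope, and slows with the energy needed to bring the fuel ahead of the flames to ignition. The sketch below illustrates that basic structure only; the variable names and sample values are hypothetical and are not drawn from Technosylva’s software or any agency’s fuel models.

```python
# Simplified sketch of the 1972 Rothermel surface fire spread equation:
#   R = I_R * xi * (1 + phi_w + phi_s) / (rho_b * epsilon * Q_ig)
# All names and sample values here are illustrative, not from any real fuel model.

def spread_rate(reaction_intensity,      # I_R: heat released by the flaming front
                propagating_flux_ratio,  # xi: fraction of that heat reaching unburned fuel
                wind_factor,             # phi_w: dimensionless wind multiplier
                slope_factor,            # phi_s: dimensionless slope multiplier
                bulk_density,            # rho_b: fuel mass per unit volume
                effective_heating_no,    # epsilon: fraction of fuel heated to ignition
                heat_of_preignition):    # Q_ig: heat needed to ignite a unit of fuel
    """Rate of spread: heat driving the fire forward divided by heat
    required to ignite the fuel in its path."""
    heat_source = reaction_intensity * propagating_flux_ratio * (1 + wind_factor + slope_factor)
    heat_sink = bulk_density * effective_heating_no * heat_of_preignition
    return heat_source / heat_sink

# Illustrative numbers for a grass-like fuel on a breezy slope:
print(spread_rate(1000, 0.05, 5.0, 0.5, 1.0, 0.4, 250))  # ~3.25 units of distance per minute
```

Notably, long-range spotting, in which embers ignite new fires far ahead of the main front, lies outside this core equation, one reason extreme runs like the one into Berry Creek can outpace the models built on it.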

“My model has lasted through 50 years because it could do the work,” Richard Rothermel, 90, told The Times in a recent interview from his Montana home. “Now, the problem is people expected it to do far more than it was designed to do.”

With that in mind, officials with all three utilities said that while they’re using fire spread modeling to inform their power shutoffs, it’s not the deciding factor.

“If you’re looking for a dead-on representation of the footprint of that fire, it’s going to be off,” said Edison’s fire scientist, Tom Rolinski. “It’s a model, and all models are wrong. We just don’t know where they’re wrong.”