This series summarizes key studies that decision-makers should consider when designing disaster and climate resiliency policies.

Reference: Zamanialaei, Maryam, Daniel San Martin, Maria Theodori, Dwi Marhaendro Jati Purnomo, Ali Tohidi, Chris Lautenberger, Yiren Qin, Arnaud Trouvé, and Michael Gollner. “Fire Risk to Structures in California’s Wildland-Urban Interface.” Nature Communications 16, no. 1 (2025): 8041.

In California and many western states, the dramatic expansion of the wildland-urban interface – the zone where human development and wildland vegetation meet and create elevated risk of wildfire – is a core resiliency challenge. In Colorado, for instance, nearly half of the state’s population (2.5 million people) now lives in the wildland-urban interface, and a million of them live in areas with moderate-to-severe wildfire risk.

While avoiding development altogether in the WUI is the first-best option for reducing wildfire losses, the reality is that tens of millions of structures already exist there…and the numbers keep growing. The next best options are building and landscaping practices like using fire-resistant building materials and creating defensible space around buildings. These practices are widely recommended by wildfire experts and increasingly codified in local and state wildfire resiliency codes, and there is ample laboratory data showing their effectiveness. There are also numerous single-fire case studies highlighting the relative importance of home hardening (the choice of materials and building techniques), defensible space (clearing of vegetation near buildings), and other factors like the spatial arrangement of structures and parcels. But how effective are these factors across larger numbers of fires and, crucially, in combination with one another and across WUI regions?

This 2025 study from Maryam Zamanialaei and her colleagues seeks to answer these questions, drawing on data collected during and after five large WUI fires in recent California history (the Tubbs, Thomas, Camp, Kincade, and Glass fires). The researchers combine multiple sources of data to conduct their study:

The California Damage Inspection (DINS) Dataset, an existing structure loss database created and maintained by CAL FIRE.

LIDAR and visual imagery analysis to assess levels of vegetation / defensible space around individual structures.

Post-fire reconstruction modeling to estimate the local exposure to both flames and embers, relative to each structure in the DINS dataset for the fires.

There are many more details in the manuscript, but the bottom line is that the authors built a dataset of ~47,000 structures sampled from across the five fires, including damaged, destroyed, and undamaged buildings.

What factors were most important in predicting whether a structure was destroyed?

The most important factor was structure separation distance, the physical space separating structures.

The second most important factor was the exterior siding of the building, a reliable proxy for the materials used in its construction.

The third was ‘year built,’ which the authors note is a confounding variable that combines considerations of the home’s construction materials, design (for example, whether it has eaves or vents that allow for ember infiltration), and defensible space (older homes tend to have vegetation built up closer to the structure).

After that are two variables related to fire exposure – flame length and ember deposition.

The paper also analyzes the relative importance of these variables by individual fire, finding important differences but noting that structure separation distance and fire exposure were consistently at the top of the list of most meaningful variables for explaining structure survivability.

In the final portion of the analysis the authors compare a range of different algorithmic approaches to predicting structure loss in the dataset, which is an important step for moving from post-event study to pre-event scenario construction and analysis. Put more simply, what approach is most helpful for analyzing an existing WUI area and estimating the relative value (in numbers of structures saved) of different interventions in the built environment? They ultimately select a machine-learning algorithm (an XGBoost classifier) which accurately predicts 82% of cases (destroyed or survived). They then experiment with a number of scenarios where they change the assumptions about each structure in the five wildfires. In scenario 1 they assume that every home is hardened (i.e., built with wildfire-resistant materials and techniques). In scenario 2 they combine hardening with defensible space practices. In scenario 3 they combine hardening with more extreme defensible space practices.
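To make the classify-then-counterfactual workflow concrete, here is a minimal sketch of the general approach: train a gradient-boosted classifier on structure-level features, then flip an input (every home hardened) and re-predict. Everything here is an assumption for illustration – the data are synthetic, the feature names are invented, and scikit-learn's GradientBoostingClassifier stands in for the paper's XGBoost model.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000

# Synthetic features loosely named after the paper's top predictors
# (structure separation distance, siding/hardening, flame exposure).
separation_m = rng.uniform(0, 40, n)   # distance to nearest structure, meters
hardened = rng.integers(0, 2, n)       # 1 = fire-resistant exterior (hypothetical flag)
flame_len = rng.uniform(0, 10, n)      # modeled flame-length exposure

# Toy generative rule: close neighbors + high exposure + no hardening
# raise the probability of destruction. Purely illustrative.
logit = 1.5 - 0.08 * separation_m + 0.4 * flame_len - 1.2 * hardened
destroyed = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([separation_m, hardened, flame_len])
X_tr, X_te, y_tr, y_te = train_test_split(X, destroyed, random_state=0)

model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
accuracy = model.score(X_te, y_te)  # share of test cases predicted correctly

# Scenario-1 analogue: assume every structure is hardened and re-predict.
X_hardened = X_te.copy()
X_hardened[:, 1] = 1
baseline_losses = model.predict(X_te).sum()
scenario_losses = model.predict(X_hardened).sum()
```

Comparing `baseline_losses` to `scenario_losses` gives the scenario's estimated structures saved, which is the same logic the authors apply (with far richer features) to the real five-fire dataset.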

The results are fairly dramatic. Just combining building hardening with fairly approachable defensible space practices (Scenario 2) doubles the survivability of structures. I would be fascinated to see what effect selectively buying out structures and properties (to alter the SSD) would additionally have.

This is a terrific piece of research, largely confirming what we think we know about managing WUIs…that community-scale interventions (rather than atomized, property-level mitigation) are essential for effective risk reduction. It also highlights the value of large datasets and geospatial analysis for understanding the impacts of climate hazards in diverse built environment contexts. Lastly, it drives home the relative value of policy interventions with widely varying costs.

Hat tip to Greg Pierce and the Linkedin algorithm for recommending this one to me.

Enjoyed this post? Please consider ‘liking’ it by clicking the heart icon below. It brings new readers to Place + Resilience. And be sure to subscribe!