The Forest Service is overstating its wildfire prevention progress to Congress despite decades of warnings not to.
I worked on this year-long, cross-desk collaboration with investigative reporter Adiel Kaplan. Our article has been cited by Congress twice, and a bill currently under consideration in the U.S. House of Representatives directly references our reporting and seeks to fix the issues we raised.
FAQs
What was this story about?
Over two decades, leading federal oversight agencies have repeatedly criticized how the Forest Service calculates its progress in eliminating the trees and brush that fuel dangerous fires — one of the key strategies for combating the wildfire crisis — calling its annual reporting of acres treated to reduce risk “misleading” and “inaccurate,” and recommending changes.
A nearly year-long investigation, a cross-desk effort that required review of over 2,000 pages of dense technical reports and a complex analysis of Forest Service data, revealed that the Forest Service has not only failed to comply, but that the problem is even larger than oversight agencies thought.
The investigation revealed, for the first time, that the measure oversight agencies repeatedly warned the Forest Service to move away from remains its main metric, and that it contributes to a system that experts say has long incentivized not the most effective and important risk-reduction work, but the cheapest.
The impact our first-of-its-kind analysis found is enormous. Throughout the country, the Forest Service has counted many of the same pieces of land toward its risk-reduction goals two to six times, and, in a few cases, dozens of times. The agency has reported that it reduced “hazardous fuel” on roughly 40 million acres of land in the past 15 years, but that figure may be overstated by an estimated 21% nationally, according to our analysis of public Forest Service records. In California, it is overstated by approximately 30%.
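The core arithmetic behind an overstatement figure like this is simple: when the same parcel is counted toward the goal multiple times, reported acres exceed the unique footprint actually treated. A minimal sketch, using entirely hypothetical parcel names and acre counts (not the actual Forest Service data or methodology):

```python
# Illustrative only: hypothetical parcels and acreages, not real USFS records.
# Each entry is (parcel_id, acres) for one reported treatment.
reported_treatments = [
    ("parcel_a", 100), ("parcel_a", 100), ("parcel_a", 100),  # counted 3 times
    ("parcel_b", 50),  ("parcel_b", 50),                      # counted 2 times
    ("parcel_c", 200),                                        # counted once
]

# Total acres as reported (double counting included).
reported_acres = sum(acres for _, acres in reported_treatments)

# Unique footprint: each parcel's acreage counted only once.
unique_acres = sum({pid: acres for pid, acres in reported_treatments}.values())

overstatement = (reported_acres - unique_acres) / reported_acres
print(f"Reported: {reported_acres} acres, unique footprint: {unique_acres} acres, "
      f"overstated by {overstatement:.0%}")
```

In this toy example, 600 reported acres shrink to a 350-acre unique footprint, an overstatement of about 42%; the investigation's real analysis arrived at its 21% national estimate from the agency's actual records, not from per-parcel counts this clean.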
The inflated figures provided to Congress obscure the true scope of the challenge from those making funding decisions about how to combat the growing wildfire crisis, experts say, and stall meaningful progress on wildfire protection for at-risk communities.
Did the investigation effect change?
Yes. The story has been referenced by Congress twice. In March 2023, Representative Tom Tiffany introduced H.R. 1567, the Accurately Counting Risk Elimination Solutions (ACRES) Act, which directly links to our reporting and seeks to fix the issues we raised.
In addition, shortly after the story ran, the chair and ranking member of the Senate Energy and Natural Resources Committee – Senators Joe Manchin (D) and John Barrasso (R) – introduced S. 4909, the Promoting Effective Forest Management Act of 2022, which included a “Transparency in Fire Mitigation Reporting” section, which began with:
“IN GENERAL.—The Secretary concerned shall not include in any appropriations request submitted to the President for purposes of preparing the budget of the United States Government under section 1105 of title 31, United States Code, or any annual performance report submitted to Congress any output measures for acres of land on which hazardous fuels treatments were conducted if the land needs to be treated more than once—”
In a committee hearing the next month, Barrasso read an excerpt of the NBC News article and stated, “I think anybody that has read this is very troubled by this report,” before asking Forest Service Associate Deputy Chief John Crockett questions based on the findings of the article. Barrasso then entered the story into the Congressional Record.
What was my role on this project?
While my co-reporter on the investigative team, Adiel Kaplan, was sleuthing through thousands of pages of dense government documents, I was writing thousands of lines of Python code to analyze a behemoth database used by the U.S. Forest Service to track work completed. By leveraging our strengths, Kaplan and I were able to weave together three main thrusts of reporting: tracing a decades-long history of policy and implementation to answer what the Forest Service knew about the problems with its data and when; complex data analysis to answer whether and how much the agency had continued to overstate in recent years; and a combination of both to illustrate how this impacts everyday people.
Because of extensive human error in the values entered into the USFS database, every step of my analysis required creative problem solving. Among other things, I coded a bespoke Python/QGIS pipeline that sorted through thousands of GIS shapes to populate a database of topographic matches and developed code to simulate missing Congressional data.
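The heart of such a pipeline is pairwise overlap detection: flagging every pair of treatment footprints whose geometries intersect. A minimal sketch of that idea, using axis-aligned rectangles as stand-ins for real GIS polygons (the actual pipeline used QGIS geometries, and the shape names here are hypothetical):

```python
# Illustrative only: rectangles stand in for real treatment polygons.
from itertools import combinations

# Hypothetical treatment footprints: (id, (xmin, ymin, xmax, ymax))
shapes = [
    ("t1", (0, 0, 10, 10)),
    ("t2", (5, 5, 15, 15)),    # overlaps t1
    ("t3", (20, 20, 25, 25)),  # overlaps nothing
]

def overlap_area(a, b):
    """Area of intersection of two rectangles; 0 if they are disjoint."""
    width = min(a[2], b[2]) - max(a[0], b[0])
    height = min(a[3], b[3]) - max(a[1], b[1])
    return width * height if width > 0 and height > 0 else 0

# Record every pair of footprints that overlap, with the shared area.
matches = [
    (id_a, id_b, overlap_area(rect_a, rect_b))
    for (id_a, rect_a), (id_b, rect_b) in combinations(shapes, 2)
    if overlap_area(rect_a, rect_b) > 0
]
print(matches)  # [('t1', 't2', 25)]
```

With real polygons, a library such as Shapely would replace `overlap_area` with proper geometric intersection, and a spatial index would avoid comparing every pair, but the matching logic is the same.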
Kaplan and I collaborated to craft, write and edit the final piece. My writing focused mainly on the data sections.