Besides potentially changing a plant community’s composition, stand density, and other ecological characteristics, suppressing lightning-ignited fires in wilderness may be seen as degrading its “untrammeled” or “self-willed” quality, in part by disrupting the self-regulating nature of fire regimes: studies from multiple large wildernesses—for example, the Selway-Bitterroot, the Frank Church-River of No Return, the Gila-Aldo Leopold Wilderness Complex, and wilderness acreage of Yosemite and Kings Canyon-Sequoia national parks—suggest that previous burns may reduce the severity of subsequent wildfires in the same area, geographically constrain them by providing natural fuel breaks, and in some cases even discourage ignition for lack of available fuel. “Considering that a fire will inevitably burn most forested areas at some point in the future,” the authors of one of these analyses wrote, “land managers need to weigh the short-term ‘costs’ associated with letting a fire burn with the long-term consequences of suppressing a fire”.
Much research remains to be done to clarify the influence of anthropogenic fire suppression on the fire regime of particular ecosystems, and it’s impossible to neatly separate the effects of fire suppression from the effects of climate change. Studies suggest a more nuanced picture than the general narrative of increased high-severity fires in forests overgrown because of the cessation of frequent, lower-severity burns.
For example, an assessment of area burned by high-severity fires in the Selway-Bitterroot Wilderness in the Idaho-Montana Rockies found no overall increase between 1880 and 2012. The extent of high-severity burns was much greater in the early (1880-1934) and late (1975-2012) periods than in the middle (1935-1974), when intense fire-suppression efforts coincided with cooler, moister conditions that aided suppression. The authors of that study suggested that the legacy of that fire suppression, combined with a warmer climate, may explain the increased area affected by high-severity fires since the mid-20th century, but also that the “footprint” of the significant high-severity burns of the early period—comparable in extent to those of the late period—may have limited later fires, as comparatively little high-severity acreage reburned.
A 2016 paper used fire exclusion in ponderosa-pine and mixed-conifer forests in the American West as a case study of the assumptions made when evaluating whether to intervene ecologically in wilderness areas. The authors found many examples of studies failing to support prevailing assumptions about the effects of fire exclusion in such ecosystems, such as pronounced increases in stand density, physiological stress, and high-severity fires. The analysis doesn’t suggest such effects haven’t occurred or aren’t occurring in some regions; rather, it argues that a wide range of variability and likely many interacting phenomena are at play in these ecological dynamics, and that careful, site-specific testing is necessary before drawing conclusions.
Despite running counter to scientific research and to management goals of preserving both the natural and untrammeled qualities of wilderness—and despite highly regulated “let it burn” policies of one kind or another instituted by the National Park Service and Forest Service on numerous western public lands beginning in the late 1960s and early 1970s—fire suppression has remained the dominant fire management strategy in wilderness. Parsons reported that as of 1998, only 15% of wilderness areas—excluding Alaska, where fire use is more widespread—had approved fire plans allowing some natural ignitions to burn, and that even these areas continue to suppress many natural ignitions. He also reported that between 1988 and 1998, even in the Bob Marshall Wilderness—the fifth-largest wilderness in the lower 48 states, with a progressive natural fire program—the average number of natural ignitions permitted to burn dropped by more than half, the average size of natural fires decreased by 75%, and only 19% of all eligible lightning fire starts were permitted to burn.
Although federal fire policy and some wilderness fire plans support allowing lightning-caused fires to burn, the challenges limiting widespread restoration and maintenance of natural fire regimes in wilderness are complex. In 1988, extensive fires, both lightning- and human-ignited, burned throughout the western United States, including 36% of Yellowstone National Park, at a cost of $120 million—the most expensive fire season up to that time. "Yellowstone is a beloved icon," Scott McMillion, a reporter for Montana's Bozeman Daily Chronicle at the time of the fires, recounted 20 years later. "It was a celebrity fire in a celebrity place. Everyone knows about Yellowstone and there was an impression that it was being allowed to burn to the ground".
As a result of the 1988 fires, all wilderness fire programs were suspended pending a national review of fire policy. Two years later, a public opinion survey found that 55% of Montana and Wyoming respondents, and 48% of respondents living elsewhere, supported a controlled-burn policy, yet many respondents were poorly informed about wildfire and its effects. Decades later, research suggests that stigmas around fire use persist. Although a 2004 study in Montana's Bob Marshall, Great Bear, and Scapegoat wildernesses indicated that two-thirds of visitors felt natural ignitions in wilderness were desirable, a comparable assessment in the same area a year earlier, during a high fire season, found that only 49% of visitors felt that way.
Other factors besides public sentiment may also perpetuate fire suppression. Because many wilderness areas are small, for example, natural ignitions outside wilderness are often suppressed before fires can burn into it. And with increasing settlement within the wildland-urban interface, the risks of fire escaping onto adjacent lands, of unnaturally intense fires fueled by unnatural fuel loads, and of unacceptable smoke impacts on surrounding areas are also very real.
In some cases, management-ignited fire may be an alternative to natural fire. However, while management-ignited fire may increase naturalness, it does so at the expense of wildness because it involves human manipulation of the wilderness landscape. As such, management-ignited fire is not always desirable. The same 2004 study of Montana wilderness visitors' opinions about natural fire found, for example, that respondents were split over the acceptability of management-ignited fire, with slightly over half indicating they would accept management-ignited prescribed fires in wilderness. Management-ignited fire is also not always feasible. Many of the same problems facing the reintroduction of natural fire also limit management-ignited fire use, including the small size of many wildernesses, their proximity to urban areas, air quality concerns, and excess fuel buildup from past suppression efforts.
Fire science continues to develop, as richer historical and real-time data power increasingly fine-tuned models, and wilderness will only grow more valuable as a benchmark of fire ecology. “The best way to learn about fire,” Carol Miller of the Aldo Leopold Wilderness Research Institute wrote in a 2014 article in The International Journal of Wilderness, “is to observe fires burning in the natural environment under a diversity of conditions and then to observe and evaluate their effects over time.” She concluded:
The past 50 years have shown that the decision to allow a fire to burn has always been a difficult one to make. As environmental and social trends complicate the context for wilderness fire management over the next 50 years, this decision will only get more difficult. The future of wilderness fire management programs may now depend on adding to the existing knowledge with research, as well as an unwavering commitment by individuals to managing this keystone natural process.