- Photo: Guhonda, an adult male mountain gorilla, in Volcanoes National Park in Rwanda, in this photo taken Thursday, June 23, 2005. Gorillas in central Africa are endangered by illegal logging, mining, and hunters who kill great apes for meat, according to a joint report from the U.N. and Interpol released Wednesday. A previous report, in 2002, estimated that only 10 percent of gorillas would remain by 2030; the author of both reports said that estimate now appears too optimistic. “We fear now that the gorillas may become extinct from most parts of their range in perhaps 15 years,” said the U.N. Environment Programme’s Christian Nellemann, editor-in-chief of the newly released report, “The Last Stand of the Gorilla.” (AP Photo/Riccardo Gangale)
The pace at which plants and animals are vanishing from the planet as their habitats shrink may be overstated by as much as 160 percent, according to new research.
An approach widely used to estimate extinctions from habitat loss is conceptually flawed, says a study, set for publication in the May 19 issue of the journal Nature. The researchers involved say that their new method more accurately reflects the interplay of shrinking habitats and the populations that rely on them.
The research is one of at least two new studies highlighting scientists’ efforts to sharpen the tools needed to track the scope of the species-extinction problem – and to design approaches for dealing with it.
Another challenges the theory that there is a ‘magic number’ of organisms, below which a species may not be able to recover.
In both cases, the issues are more than academic, specialists say. Projections of the impact of shrinking habitat on extinctions can feed into local wildlife management decisions on how much land to conserve. They also affect long-term projections of the extinctions traceable to global warming and its effects on habitats worldwide.
Additionally, the use of a single benchmark for a population’s viability may prevent conservation managers from nurturing species whose numbers fall below that threshold but still may be recoverable, explains Steven Beissinger, an ecologist at the University of California at Berkeley who took part in the study.
The development of a new tool for estimating extinctions “is welcome news, in the sense that we have bought a little time for saving species,” says Stephen Hubbell, an ecologist at the University of California at Los Angeles and one of two scientists who performed the analysis.
“But it’s unwelcome news,” he adds, “because we have to redo a whole bunch of research” performed with the previous method.
New model a better match to real ecosystems?
Many ecologists have realized for years that the technique they relied on for estimating extinctions could overestimate the losses. When scientists visited shrinking habitats, they found that the endangered plants and animals were not going extinct as fast as the estimates suggested they should be.
“Nobody could figure out why,” says Fangliang He, the Nature paper’s lead author and a researcher who specializes in modeling biodiversity and landscape processes at the University of Alberta in Edmonton.
Some suggested that the gap may be due to what became known as “extinction debt,” in which endangered plants or animals linger after losing a significant portion of their habitat even though their eventual disappearance has become inevitable.
Now, Drs. He and Hubbell say they have found the reason for the gap and offer up a method that more closely matches extinction rates seen in the field. While the new approach doesn’t rule out the possible existence of an extinction debt, the duo says their method doesn’t need it to explain the gap.
Mass extinctions then and now
The world still faces a significant extinction problem, the two emphasize. They cite a study published in March, for instance, that compares extinction rates written in the fossil record with those recorded by the International Union for Conservation of Nature, the world’s scorekeeper on threatened, endangered, and extinct species today.
The analysis, by a team led by Anthony Barnosky of the University of California at Berkeley, concluded that species extinctions over the past few thousand years are higher than the typical rate seen in the fossil record.
If currently threatened species become extinct within the next 100 years and the pace continues unchecked, within 240 to 540 years extinctions will rise to a level not seen since the last five major mass extinctions in Earth’s history, Barnosky found.
The major drivers in the five mass extinctions were prolonged volcanic eruptions, changing climate, or even comet or asteroid collisions with Earth. The main driver today is widely seen as the impact of human activities on climate, landscape, and oceans.
The previous approach to modeling extinction
He and Hubbell’s work tries to get at an important aspect influencing the pace of extinction.
At the heart of the discussion is the “species-area relationship,” a long-established relation between the amount of area scientists look at and the number of new species they discover as they expand the patch of land or ocean they explore.
As the search area expands, Hubbell explains, scientists add a new species when they spot even one specimen of a previously unknown, undescribed, or reclassified organism.
This has led to the development of a mathematical relationship between the amount of new territory explored and the number of new species one could expect to find.
But nothing similar existed for extinctions. So researchers figured that one approach might be, in essence, to throw this “discovery” calculation into reverse to see how many species one would lose as habitat shrank.
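That reversed calculation can be sketched in a few lines. This is an illustrative reconstruction, not the paper’s code; the constant c and the exponent z below are typical textbook values, not figures from the study:

```python
# Species-area relationship (SAR), commonly fitted as a power law:
# the number of species S found in an area A grows as S = c * A**z.
def species_area(area, c=20.0, z=0.25):
    return c * area ** z

# "Reversing" the SAR to predict extinctions: if habitat shrinks from
# a_old to a_new, the predicted fraction of species lost is
# 1 - (a_new / a_old) ** z.
def reversed_sar_losses(n_species, a_old, a_new, z=0.25):
    return n_species * (1.0 - (a_new / a_old) ** z)

# With 1,000 species and half the habitat destroyed, the reversed SAR
# predicts roughly 16 percent of species lost.
print(round(reversed_sar_losses(1000, a_old=100.0, a_new=50.0)))  # 159
```

It is this reversal, treating the discovery curve as if it ran symmetrically backwards, that He and Hubbell identify as the flaw.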
That’s where the conceptual problem arises, He and Hubbell say. Extinction is a different beast from discovery.
Instead of relying on the presence or absence of one specimen, “the extinction problem requires that all of the individuals be lost and be included in the area of habitat loss,” Hubbell explains.
Figuring out this distinction was relatively easy, the pair acknowledges. But it took eight years for the two scientists to work out the math behind the approach and use it to demonstrate why and how the previous approach to estimating extinctions so often failed to match field observations.
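The distinction can be illustrated with a toy simulation. Under the simplifying assumption that individuals are scattered at random across a one-dimensional habitat (real populations are clumped, and the paper’s mathematics is far more general), destroying half the habitat rarely removes every individual of a species, so the direct count of extinctions falls far below a reversed species-area prediction using a typical exponent:

```python
import random

random.seed(42)

N_SPECIES = 1000
INDIVIDUALS = 10   # individuals per species (illustrative)
LOST = 0.5         # fraction of habitat destroyed: the region [0, 0.5)

# Scatter each species' individuals uniformly over the habitat [0, 1).
species = [[random.random() for _ in range(INDIVIDUALS)]
           for _ in range(N_SPECIES)]

# A species is driven extinct only if *every* individual falls inside
# the destroyed region -- the condition Hubbell describes.
extinct = sum(all(x < LOST for x in pop) for pop in species)

# Reversed species-area prediction with an illustrative exponent z:
z = 0.25
predicted = N_SPECIES * (1 - (1 - LOST) ** z)

print(f"species actually driven extinct: {extinct}")       # on the order of 1
print(f"reversed-SAR prediction:         {predicted:.0f}")  # about 159
```

The gap between the two numbers is the kind of overestimate the new method is designed to eliminate.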
Testing the new model
Once they had a new mathematical approach in hand, they used old and new methods to calculate extinction rates for eight well-studied small forest plots in China, Panama, Ecuador, Malaysia, and Cameroon, as well as two far broader regions in the US that are home to some 279 species of perching birds. The new, more-nuanced approach gave significantly lower extinction rates from lost habitat than the old.
“This is a really cool analysis,” says Peter Kareiva, chief scientist for the Nature Conservancy in Arlington, Va. For an ecologist, the distinction He and Hubbell draw between the old and proposed new approaches “is kind of obvious. I wonder how this has been overlooked” for so long, he says.
Others are a touch more cautious.
Hubbell has long been known in the field for “developing an intellectual framework that allows people to test ecological and conservation ideas,” says Richard Primack, a conservation biologist at Boston University.
This new approach to estimating extinction rates “is taking ideas that were generally out there and providing a rigorous model which hundreds of people will start testing over the next few years,” he says.
No more ‘magic number’?
Another team has looked at the extinction question from a different perspective. Their research, accepted for publication in the journal Trends in Ecology and Evolution, suggests that using a single, universal “minimum viable” population number – to gauge which endangered plants or animals to try to save – is seriously flawed.
It fails to account for important factors such as climate change, loss of habitat, or encroachment by invasive species, or whether the population is growing or declining – all of which can play key roles in a species’ survival.
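The point can be made concrete with a toy population-viability sketch. The model below is a deliberately simple stochastic growth model invented for illustration, not the analysis in the Trends in Ecology and Evolution paper: two populations that start at the same size, and so look identical to a single headcount threshold, face very different extinction risks once trend and environmental variability are factored in.

```python
import random

random.seed(1)

def extinction_risk(n0, growth_rate, noise_sd,
                    years=100, trials=2000, threshold=2):
    """Fraction of simulated populations that fall below a
    quasi-extinction threshold within the time horizon."""
    failures = 0
    for _ in range(trials):
        n = float(n0)
        for _ in range(years):
            # multiplicative growth with environmental noise
            n *= growth_rate * (1.0 + random.gauss(0.0, noise_sd))
            if n < threshold:
                failures += 1
                break
    return failures / trials

# Same starting size (500 individuals), very different fates:
stable = extinction_risk(500, growth_rate=1.02, noise_sd=0.10)
declining = extinction_risk(500, growth_rate=0.97, noise_sd=0.20)
print(f"growing, low-variance population:    {stable:.2f}")
print(f"declining, high-variance population: {declining:.2f}")
```

A single “magic number” cutoff would judge both populations equally safe.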
Dr. Beissinger and colleagues felt the use of a “magic number” for identifying a viable minimum population of organisms “was a bit of an oversimplification,” he says.
Among other problems, the single benchmark can lead policymakers to declare a species saved when it may be anything but, he adds.
Some conservation specialists have used the approach as a tool for triage. “Should we continue to try to save the California condor, when there are 20 rare Hawaiian plants we could probably save with the same budget?” Beissinger asks rhetorically.