
How Algorithms Could Save the Planet

February 1, 2019
by Glen Martin
Art by Leah Worthington

As the natural world unravels, conservationists are looking for new solutions to save what’s left.

Big conservation initiatives take big bucks, but there’s only so much money to go around. So, how do we allocate? And once priorities are determined, how do we identify the most effective approaches?

One possibility: Big Data. It’s now poised to do for conservation what it has done for self-driving cars and online retail, says Carl Boettiger, an assistant professor in the Department of Environmental Science, Policy, and Management at UC Berkeley.


“We have all these rich sources for tremendous quantities of data now,” says Boettiger. “Drones, satellites, microsensors. So, wouldn’t it be great to leverage all that—not just the data, but very clever algorithms—to make decisions about conservation? After all, it’s been playing out in other fields to immense advantage.”

Researchers are already thinking about applying some of the algorithms used by autonomous vehicles to set fishing quotas, says Boettiger. Such quotas are based on estimated fish populations, but if the figures are off, or if factors like marine temperatures and prey availability aren’t accounted for, too many fish may be caught, threatening future populations.

In theory, current fishing quotas can include such uncertainty in their calculations, says Boettiger.

“But in practice it’s not possible to do so, at least not in anything more than a hand-waving manner, such as ‘conditions look worse this year so maybe we should lower the quota by 5 percent.’ New algorithms, however, make it computationally possible to take many more factors into account.”

“Basically, we’re assuming we know more than we do,” says Boettiger. “We’re assuming perfect measurements, perfect methods. But the new algorithms don’t do that. Autonomous cars assume imperfect measurements, and as a result, they can correct themselves automatically. They accommodate and adjust to error. And we can benefit from that same approach when considering fishing quotas.”
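
To make the analogy concrete, here is a minimal sketch of that idea: treat the fish stock the way a self-driving car treats its own position, as an uncertain estimate corrected each season as noisy survey data arrives. Every number, threshold, and function name below is invented for illustration, not drawn from any real fishery model.

```python
# Illustrative only: a Kalman-style update blends the model's prediction
# with a noisy survey, weighting each by how uncertain it is.

def update_estimate(prior_mean, prior_var, survey, survey_var):
    """Correct the stock estimate with one noisy survey observation."""
    gain = prior_var / (prior_var + survey_var)
    mean = prior_mean + gain * (survey - prior_mean)
    var = (1 - gain) * prior_var
    return mean, var

def set_quota(mean, var, harvest_rate=0.1, caution=1.64):
    """Harvest a fraction of a pessimistic stock estimate: the noisier
    the data, the lower the quota -- uncertainty is priced in rather
    than hand-waved."""
    pessimistic_stock = max(mean - caution * var ** 0.5, 0.0)
    return harvest_rate * pessimistic_stock

# Example season: the model predicts 1,000 tonnes of stock, while a
# trawl survey suggests 800 tonnes but is itself noisy.
mean, var = update_estimate(prior_mean=1000.0, prior_var=200.0**2,
                            survey=800.0, survey_var=150.0**2)
print(f"stock estimate: {mean:.0f} t, quota: {set_quota(mean, var):.0f} t")
```

The point of the sketch is the second function: as survey noise grows, the quota automatically shrinks, which is exactly the self-correcting behavior Boettiger describes.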

New algorithms can also help achieve complex conservation goals that involve multiple stakeholders, says Boettiger, citing marine protected areas (MPAs) as an example. MPAs are coastal zones that are off-limits to fishing. The sanctuaries are designed to allow fish to reproduce undisturbed in the hope they will ultimately repopulate adjacent areas. Legislation authorizing MPAs was passed by the California Legislature in 1999, and numerous areas along the state’s coast, including the Channel Islands, have since been closed to fishing—but not without controversy. Determining MPA boundaries and the species targeted for protection are hotly disputed issues. Algorithms can help avoid some of the squabbling, says Boettiger.
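
Reserve design is in fact a classic optimization problem. The toy sketch below shows the flavor of it: each candidate coastal zone protects some species and imposes some cost on fishing, and a greedy pass picks the cheapest zones that cover every target species. All zones, species, and costs are invented; real planning tools are far more sophisticated.

```python
# Toy reserve-selection sketch: greedily pick zones with the best
# cost per newly protected species until all targets are covered.
zones = {
    "A": {"species": {"rockfish", "abalone"}, "cost": 3.0},
    "B": {"species": {"rockfish", "kelp bass"}, "cost": 2.0},
    "C": {"species": {"abalone"}, "cost": 1.0},
    "D": {"species": {"kelp bass", "garibaldi"}, "cost": 2.5},
}
targets = {"rockfish", "abalone", "kelp bass", "garibaldi"}

protected, chosen = set(), []
while protected < targets:
    best = min(
        (z for z in zones
         if z not in chosen and zones[z]["species"] - protected),
        key=lambda z: zones[z]["cost"] / len(zones[z]["species"] - protected),
    )
    chosen.append(best)
    protected |= zones[best]["species"]

print("selected MPAs:", chosen)  # -> ['B', 'C', 'D']
```

An algorithm like this can at least make the trade-offs explicit: everyone can see why a zone was chosen, which is part of what defuses the squabbling.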



“They allow you to utilize all available data and target broad objectives, balancing everything from economic values to ecological resilience, to different stakeholder needs,” says Boettiger, who is only half-joking when he suggests the Endangered Species Act might ultimately be replaced with an Endangered Species Algorithm Act.

The promise of such Spock-like objectivity is extremely appealing to conservation scientists, but Boettiger warns that algorithms are only as “objective” as the people who design them. As Amazon CEO Jeff Bezos has warned, the internet is a “confirmation bias machine” that can cause at least as many problems as it solves. And Bezos should know: Amazon has been pilloried for algorithms that reflect biases against women and people of color. Care must be taken to ensure conservation-oriented algorithms aren’t similarly tainted, and that means employing a process that is foundational to good science: peer review.

“If you put a black box algorithm out there, one that no one other than the designer has picked apart, it’s going to be suspect, people will wonder if it’s been cooked,” says Boettiger. “That’s why transparency is critical. Everyone has to be able to see and understand the process that went into the algorithm, and researchers need to be able to chip away at it and refine it.”

Historically, researchers have come up with conservation plans in what Boettiger calls a “piecemeal fashion,” using one algorithm to factor in science objectives, another for economic objectives, and so on. But new algorithms may be powerful enough to integrate everything at once.

“We may now be on the cusp of being able to use algorithms to navigate multiple conservation objectives and constraints simultaneously,” says Boettiger.
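
One way to picture the shift from piecemeal to integrated planning: instead of running separate analyses for ecology, economics, and stakeholders, score every candidate plan against all objectives at once. The sketch below uses a simple weighted sum; the plans, scores, and weights are placeholders, and in practice the weights themselves would be negotiated rather than hard-coded.

```python
# Illustrative multi-objective scoring: fold several conservation
# objectives into one comparable number per candidate plan.
plans = {
    "large reserve":  {"ecology": 0.9, "economy": 0.3, "stakeholders": 0.4},
    "small reserve":  {"ecology": 0.5, "economy": 0.8, "stakeholders": 0.7},
    "seasonal close": {"ecology": 0.6, "economy": 0.7, "stakeholders": 0.8},
}
weights = {"ecology": 0.5, "economy": 0.3, "stakeholders": 0.2}

def score(plan):
    # A weighted sum is the simplest way to integrate objectives;
    # richer methods trace out the whole trade-off frontier instead.
    return sum(weights[k] * v for k, v in plan.items())

for name, plan in plans.items():
    print(f"{name}: {score(plan):.2f}")
print("best plan under these weights:",
      max(plans, key=lambda name: score(plans[name])))
```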

Still, it’s highly unlikely the new algorithms will wholly exclude humans from conservation efforts—if for no other reason than policy-making is a political and social process. Regardless of sophistication, algorithms can’t negotiate, jawbone, cajole or compromise. Only people can do that.

“Algorithms may be able to tell you the best places to protect in order to preserve such-and-such a species, but they won’t be able to tell you what to do if those places are also rich in uranium and the owners want to mine them,” says Tom Scott, a Cal cooperative extension specialist concerned with wildlife conservation. “Algorithms may be superficially objective, but in [some] places development pressures are extreme, and you often have to deal directly with the guys who own land that you want. And algorithms can’t help you with that.”
